Technology | September 2022
The Privacy Implications of Intel’s Classroom AI
A look at interpreting biometric data
Written by Peggy Keene
Intel has developed software that it claims can detect the emotional states of children by interpreting their body language via web cameras. The company created the new software in response to the shift to remote learning brought about by the coronavirus pandemic. From kindergarten to graduate school, Zoom and other online platforms have become the new normal for classrooms. Intel’s new software, however, is already prompting questions and concerns.
Intel’s AI for use on students raises questions on biometric information privacy
Intel’s technology raises new privacy concerns because it is unclear what biometric information Intel will store and analyze. Biometric information is already governed differently from state to state, and Intel’s software enters uncharted territory. According to Intel, the software will use artificial intelligence to analyze the body language and faces of students online. Based on this analysis, the artificial intelligence then infers the “emotional state” of the student and passes that conclusion on to the teacher in real time. Intel claims this will allow teachers to recognize when students are confused, bored, or in need of attention. Accordingly, Intel notes that the technology’s main objective is to improve one-on-one interactions between student and teacher rather than to monitor entire classrooms. The technology is expected to be integrated into Zoom, and if successful, Intel intends to carry it into other videoconferencing platforms.
Translating biometric information
Because Intel’s software purports to interpret biometric data, it is unclear what standards Intel would use to draw conclusions about students’ mental states. Studies have demonstrated that human expressions vary from person to person and that students’ cultural backgrounds heavily influence their general demeanor in the classroom. Intel’s accuracy when it comes to interpreting facial expressions is unknown, and it is unclear what the technology actually takes into account. In response to these criticisms, Intel has stated that its software was created in conjunction with a team of psychologists and that an emotion had to be validated by two out of three psychologists before making the cut. Intel has admitted, however, that the final assessment of the software was based on its utility for teachers rather than its accuracy in judging students’ emotions.1
Privacy advocates question Intel’s collection of student biometric information
It is likewise unclear how Intel will handle the collection and storage of such biometric information. If Intel concludes that a student is “distracted” or “bored,” will that conclusion be stored on a server somewhere? Will it be appealable, or will it be forever recorded in a student’s permanent record? Will the information be categorized as personal information and, as such, be protected under applicable privacy laws? As Intel’s software raises new questions about students’ privacy and sensitive information, privacy attorneys and advocates would do well to follow such technology as it develops and is integrated into classrooms.
Key takeaways on Intel’s AI to interpret student emotional states
Intel is working on artificial intelligence intended to analyze the
emotional state of students through the collection of biometric data.
This technology raises new issues in law and biometric privacy such
as:
- To what standards is Intel comparing students’ expressions?
- How will Intel store such biometric information? and
- Will the conclusions about a student’s emotional state be considered sensitive information? TBJ
This article, which was originally published on the Klemchuk Ideate Blog, has been edited and reprinted with permission.
Peggy Keene is of counsel to Klemchuk in Plano. Her practice focuses on intellectual property and internet law, e-commerce, and data privacy. Keene has also served as in-house counsel in the telecommunications industry.