Affective computing draws Intel’s attention, prompts debate
Intel and edtech startup Classroom Technologies have developed a tool that integrates with Zoom and uses artificial intelligence to analyze students’ facial expressions, with the aim of letting teachers know whether their students are learning well.
The idea is to improve student engagement, which has been reduced by virtual classrooms during the pandemic and is hard for teachers to gauge even in person, Protocol reports.
Critics counter, however, that facial expressions and similar cues cannot accurately reveal how bored or confused a person is. Furthermore, a student’s reaction, particularly in a home environment, may be caused by factors other than the educational material.
In comments to Protocol, Classroom Technologies Co-founder and CEO Michael Chasen acknowledges the need to be sensitive to concerns about how intrusive the technology can be, and admits it is not yet “fully” mature.
Even requiring students to turn on their webcams in class is controversial, as the application is relatively resource-intensive and can reveal otherwise private details of people’s homes.
Some teachers participating in Intel’s testing gave positive assessments, but the technology is not yet in production.
Intel trained its algorithm on videos of students labelled by experts it hired as reviewers, keeping only the labels that at least two of the three experts agreed on.
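Intel has not published its labelling pipeline, but the two-of-three agreement rule described above amounts to simple majority voting over reviewer annotations. A minimal sketch of that idea, with hypothetical labels and function names, might look like this:

```python
from collections import Counter

def aggregate_labels(expert_labels, min_agreement=2):
    """Keep a label only when at least `min_agreement` reviewers
    assigned it; otherwise discard the clip as inconclusive."""
    counts = Counter(expert_labels)
    label, votes = counts.most_common(1)[0]
    return label if votes >= min_agreement else None

# Hypothetical example: three reviewers label the same video clip.
print(aggregate_labels(["bored", "bored", "confused"]))   # -> "bored" (2 of 3 agree)
print(aggregate_labels(["bored", "engaged", "confused"])) # -> None (no majority; clip dropped)
```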
Sentiment analysis and emotion recognition
Concerns about emotion recognition, or ‘emotion AI,’ are leading to confusion about sentiment analysis, according to experts in the field interviewed for a separate article by Protocol.
The terms are often used interchangeably, as in a Fight for the Future campaign cited by Protocol and subsequently updated.
They are different, however, in that sentiment analysis is text-based while emotion recognition is based on facial analysis, according to Affectiva Co-founder and CEO Rana el Kaliouby. Emotion recognition can also draw on other biometrics, such as gait.
Nazanin Andalibi, an assistant professor at the University of Michigan School of Information, argues that sentiment analysis is still looking for “affective phenomena,” or physical manifestations of interior states. This interpretation would make sentiment analysis a cousin of emotion AI if not a subset of it.
The article goes on to explore the implications of this kind of characterization for regulation of facial recognition and biometric data use.
With more major tech players, like Zoom, investing in emotion recognition, the issue appears to be heating up rapidly.