Artist demonstrates emotion-detecting videos may help people see their own biases
At the Cooper Hewitt, Smithsonian Design Museum in New York, the interactive installation Perception IO intends to force users to confront their racial bias, says its creator, artist and international technology speaker Karen Palmer, who describes herself as the “Storyteller from the Future.”
Cooper Hewitt describes the exhibit as an interactive installation whose medium is “video monitors, computer, eye-tracking hardware and software, [and] additional software,” exploring “the intersection of AI, emotion detection, eye tracking, and bias.” The museum adds that Palmer’s immersive storytelling experience “reveals how your gaze and emotions influence your perception of reality. Perception IO (input-output) is a prototype and ongoing work-in-progress. This reality simulator invites you to evaluate your perceptions, become aware of your subconscious behavior, and reprogram it.”
Perception IO, Palmer told WIRED, is a testing ground for a film in which, she said, “I’m going to put you in an environment where people are going to be demonstrating against artificial intelligence and saying that it is now a human rights issue.”
It all appears, ultimately, to be part of Palmer’s Democratise AI project, Consensus Gentium, meaning “if everybody believes it, it must be true.” The film intends to pit streets full of protesters against riot police and, presumably, judging from how she has explained it, the gamut of surveillance and detection technologies with which she and her cohorts are experimenting.
“Where your eyes go is going to reveal to you what, we feel, is your perception of the situation,” she told WIRED.
Pretty heady stuff for sure. But as Andrew McWilliams, a director at ThoughtWorks Arts, where Palmer was a resident artist in 2017, told WIRED, “(t)his was an opportunity for us to try and expose [facial detection technology].”
As the Cooper Hewitt gallery says of Palmer’s exhibit, “(y)ou will take the position of a police officer watching a training video of a volatile situation. How you respond will have consequences for the characters. The system will track your eye movements and facial expressions. Analysis of your gaze and your expressions will be revealed, and you will be able to examine your own implicit biases. How comfortable are you with the idea that your perceptions of reality have real-life consequences? Would you bet your life on it?”
Again, pretty heady stuff.
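Mechanically, the kind of gaze analysis the gallery describes can be approximated with dwell-time accounting: tally how long the viewer’s eyes rest on each on-screen character. The sketch below is only an illustration under assumed conditions; the eye tracker’s (x, y) sample stream, the character bounding boxes, and the region names are all hypothetical, not details of Palmer’s system.

```python
# Hypothetical sketch of gaze dwell-time analysis.
# Assumes the eye tracker emits (x, y) screen coordinates at a fixed
# rate; the character bounding boxes below are invented for illustration.

# Screen regions (left, top, right, bottom) for two on-screen characters.
REGIONS = {
    "character_a": (100, 200, 400, 700),
    "character_b": (880, 200, 1180, 700),
}

def dwell_fractions(gaze_samples):
    """Fraction of gaze samples falling inside each character's region."""
    counts = {name: 0 for name in REGIONS}
    for x, y in gaze_samples:
        for name, (left, top, right, bottom) in REGIONS.items():
            if left <= x <= right and top <= y <= bottom:
                counts[name] += 1
    total = max(len(gaze_samples), 1)
    return {name: c / total for name, c in counts.items()}

# Example: a viewer who looks mostly at character_a, with a few
# off-screen glances that land in neither region.
samples = [(150, 300)] * 70 + [(900, 300)] * 20 + [(10, 10)] * 10
print(dwell_fractions(samples))  # -> {'character_a': 0.7, 'character_b': 0.2}
```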
A “research-based artist,” Palmer rightly points out that we all live in the post-industrial Information Age, “which,” Cooper Hewitt says, “has divided society and will soon move into the Age of Perception, a period of greater understanding. Perception IO will enable you to see how human bias creates biased networks and to understand the need for regulating the use of artificial intelligence.”
The exhibition opened on September 20 and will be on display through May 25. It comprises 19 objects, but currently only six are available for viewing and interaction. The gallery explained that some “may not be viewable because they were on loan; this might be due to issues involving image rights or simply because there is no digitized image for the objects.”
The Cooper Hewitt gallery says that, “(g)overnments and corporations collect images from social media and public places, using this data to monitor identity and classify emotions. Personal photos and videos stockpiled without permission. Flawed algorithms amplify bias. Faulty law enforcement tools can trigger harassment and false arrests. Measured and monetized, our faces have become valuable data, sold and circulated without public oversight.”
The museum says the installation design “features a canopy of abstract, synthetic reeds, suggesting an uneasy marriage of nature and technology.”
Perception IO, which WIRED dubbed “the future of AI and unconscious bias,” can “not only distinguish facial expressions but also single out four emotions – anger, fear, surprise, and calmness.”
Palmer explained on her website, “(t)he participant takes the POV of the body cam of a police officer. You will view a hostile situation, coming into contact with both a black protagonist and a white protagonist. How you respond to the scene will have consequences for the characters (in the form of a branching narrative) influenced by your emotional response. Will your implicit bias betray you?”
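Palmer has not published the branching logic, but the idea of an emotion-driven branching narrative is easy to sketch. In the minimal Python illustration below, only the four emotion labels come from coverage of Perception IO; the scene names and branching rules are invented for illustration.

```python
# Hypothetical sketch of an emotion-driven branching narrative.
# Scene names and branching rules are invented; only the four emotion
# labels come from coverage of Perception IO.

# Each scene maps a detected emotion to the next scene to play.
STORY = {
    "confrontation": {
        "anger":    "escalation",
        "fear":     "backup_called",
        "surprise": "hesitation",
        "calm":     "de_escalation",
    },
    # Terminal scenes have no outgoing branches.
    "escalation": {}, "backup_called": {}, "hesitation": {}, "de_escalation": {},
}

def next_scene(current: str, detected_emotion: str) -> str:
    """Return the next scene, staying put if the scene is terminal."""
    return STORY[current].get(detected_emotion, current)

# Example: a viewer whose face reads as fear during the confrontation
# is routed to the 'backup_called' branch of the film.
print(next_scene("confrontation", "fear"))  # -> backup_called
```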
As part of her effort to bring awareness to the privacy, civil, and human rights issues of AI, her team developed EmoPy, an open-source “deep neural net toolkit for emotion analysis via Facial Expression Recognition (FER),” which is freely available.
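As a rough illustration of how the toolkit is used, the sketch below follows the pattern in EmoPy’s public README: pick a supported emotion subset, build a FERModel, and run a prediction on a face image. The file path is a placeholder, and the four-emotion subset matching the article’s list is assumed to be one of the toolkit’s trained combinations; check the EmoPy repository for the exact supported subsets.

```python
# Minimal sketch of classifying a face image with EmoPy's FERModel.
# Assumes EmoPy is installed (pip install EmoPy).
from EmoPy.src.fermodel import FERModel

# The four emotions the article says Perception IO singles out;
# EmoPy labels calmness as 'calm'. This subset is assumed to be one
# of the toolkit's trained combinations (see the EmoPy README).
target_emotions = ['anger', 'fear', 'surprise', 'calm']

model = FERModel(target_emotions, verbose=True)

# 'face_frame.png' is a placeholder; point this at any face image on disk.
model.predict('face_frame.png')  # prints the predicted emotion
```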
“Palmer’s mission doesn’t stop here,” WIRED reported. “Voice detection could recognize emotions based on the audio frequency and pitch of a participant’s voice,” and “Palmer is … taking inspiration from VR multiplayer gaming and is hoping to make the experience even more immersive.”
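WIRED does not detail how such voice analysis would work, but pitch-based features of the kind it mentions (audio frequency and pitch) are commonly extracted along the following lines. This sketch uses librosa’s pyin pitch tracker; the file name and the threshold mapping pitch variability to an “arousal” guess are invented for illustration, not part of Palmer’s plans.

```python
# Hypothetical sketch of pitch-based voice features, the kind of signal
# the WIRED report suggests could feed emotion recognition.
# Requires: pip install librosa numpy
import librosa
import numpy as np

def pitch_stats(path):
    """Mean and variability of fundamental frequency (f0) in a recording."""
    y, sr = librosa.load(path, sr=None)
    f0, voiced, _ = librosa.pyin(
        y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C7"), sr=sr
    )
    f0 = f0[voiced]  # keep only voiced frames
    return float(np.nanmean(f0)), float(np.nanstd(f0))

# The threshold below is invented; real systems learn such mappings
# from labeled speech rather than hand-coding them.
mean_f0, std_f0 = pitch_stats("participant_voice.wav")
arousal = "high" if std_f0 > 40.0 else "low"
print(f"mean pitch {mean_f0:.1f} Hz, variability {std_f0:.1f} Hz -> {arousal} arousal")
```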
More than just a storyteller from the future, Palmer is also an engaging international speaker on new technology and her vision for the future of media, which she sees as gaming, film, consciousness, and technology. She spoke at the Google Cultural Institute in Paris about her work and has been a TEDx Australia speaker, presenting at the Sydney Opera House on the future of gaming, film, and technology in a talk titled “Your Brain is a Remote Control.”
Article Topics
AI | artificial intelligence | biometric-bias | biometrics | emotion recognition | eye tracking