Researcher continues to expose the public danger of intimate biometric profiling
Troubling public research into how accurately facial and voice biometric systems can predict people’s intimate traits is not going away. In fact, it is slowly advancing.
And a private, even secretive, interest in this branch of AI goes back at least 13 years.
Michal Kosiński, an associate professor of organizational behavior at Stanford University, published a study in January 2021 showing that an algorithm can sort naturalistic facial photographs into the political categories of ‘liberal’ and ‘conservative’ 72 percent of the time.
Kosiński made headlines (often silly and/or sardonic) exactly three years prior as part of a team that used a popular, off-the-shelf algorithm to correctly guess the sexual orientation of people in facial photographs between 71 and 81 percent of the time.
He was interviewed this week for a marketing piece posted by the Stanford Graduate School of Business. In it, he defends his work and says that, in a sense, it is not yet finished.
Kosiński maintains that his research is a warning about the potential misuse of these technologies, not an invitation for businesses, law enforcement, divisive politicians and defense officials to invest. Indeed, he says, he has found related patent applications filed by startups as far back as 2008.
He is not alone, not that opposition has forced any change or even legislative debate.
It is likely that most people do not want strangers scraping images of them off the internet to predict how likely it is that they sleep with a same-sex partner or have a particular politician’s sticker on their bumper.
And yet, Kosiński says, there remains more controversy about research showing how easy it is to accomplish these clearly privacy-invading goals than about who might be profiting from that work.
He uses the business school post to reiterate his view that the U.S. Congress urgently needs to look deeply into this topic and begin to regulate it.
It is hard to imagine any sector of the public, at least, that would want to be subjected to this kind of remote, invisible and personal scrutiny, which could so easily be used to push forward someone else’s agenda.