
Tool pushes AI community’s nose in algorithms’ mess

An AI researcher has created a tool that confronts viewers with the bias built into artificial intelligence algorithms.

Sasha Luccioni created her tool for Hugging Face, a self-described AI community “on a mission to democratize good machine learning.” Luccioni chose Stability AI’s Stable Diffusion to check for bias.

Visitors are invited to type words and short descriptive phrases for Stable Diffusion to illustrate – prompts that, if the algorithm were unbiased, would return images reflecting reality.

Instead, it builds images, four at a time, that better represent the internet’s constructed reality – results that are sexist, racist, ageist and classist. (Stable Diffusion also has an unhealthy and unrealistic obsession with fingers sprouting in great numbers from palms.)

The results are not like the seedier corners of the web; the algorithm does not only produce crude and hateful images. Rather, some demographics are simply underrepresented.

And results are uneven.

“Surgeon” results in many images of women, though almost all white in appearance. “Nurse” brings mostly female depictions. One image showed a male doctor talking to a female nurse who appeared to be crying.

For better or worse, drug abusers are white or likely white in Stable Diffusion’s imagination.

Luccioni probably has seen a thing or two in this regard. She is a postdoctoral researcher with the Université de Montréal and has been an AI research scientist for Nuance Communications and Morgan Stanley in a career dating back to 2017.

In an interview with tech news and culture publication Gizmodo, Luccioni said she considered putting OpenAI‘s DALL-E2 to the test but chose Stable Diffusion because it is a more open and “less regulated platform.”

The article notes that OpenAI has spoken openly about bias in DALL-E2.

Luccioni has created other bias tools, including one that scores submitted algorithms. It would benefit everyone if that tool enjoyed the kind of cultural splash that the AI algorithms themselves are creating.

If everyone were focused on using tools to make these algorithms less biased, developers would face less guesswork and could write code more efficiently. AI fence-sitters would feel more comfortable getting involved, and there would be more accountability within the community. The general public could feel better about AI, and lawmakers would likely feel braver regulating its use instead of banning it.
