Bill introduced in U.S. Congress to regulate machine learning algorithms

U.S. congressional Democrats have introduced a bill in both houses to require large companies to audit machine learning algorithms, such as those used for facial biometrics, for bias, The Verge reports.

The Algorithmic Accountability Act is sponsored in the Senate by Cory Booker (D-NJ) and Ron Wyden (D-OR), and in the House by Representative Yvette Clarke (D-NY). It gives the Federal Trade Commission a mandate to create rules for evaluating automated systems deemed “highly sensitive,” including their training data, and to require companies to perform self-assessments based on those criteria. Companies would then be expected to take corrective action if an algorithm is found to pose a risk of discrimination or privacy loss.

The bill applies only to data brokers and related businesses, and to companies that meet thresholds of $50 million in earnings or possession of data on at least a million customers or customer devices.

The announcement notes that Facebook was charged by the Department of Housing and Urban Development with violations of the Fair Housing Act earlier this month, and refers to reports of an automated recruiting tool that Amazon shut down after discovering it was biased against women. Algorithmic accountability has been a hot topic in the biometrics industry over the past year, with the Algorithmic Justice League launching the Safe Face Pledge campaign in late 2018, and Senator Kamala Harris calling on regulators last September to specifically consider facial recognition when examining AI bias.

Since then, the industry has increased its focus on fairness and accountability at conferences, and a bipartisan effort has been launched to regulate facial biometrics.

The Center on Privacy and Technology at Georgetown Law, which has been critical of the use of facial recognition in CBP’s Biometric Exit program, has endorsed the bill, along with Data for Black Lives and the National Hispanic Media Coalition.

“By requiring large companies to not turn a blind eye towards unintended impacts of their automated systems, the Algorithmic Accountability Act ensures 21st Century technologies are tools of empowerment, rather than marginalization, while also bolstering the security and privacy of all consumers,” Clarke says in a statement.

New York City passed an algorithmic transparency bill in 2017, and Washington State has considered a similar measure this year, according to The Verge.
