GAO warns of privacy risks in using facial recognition in rental housing

As landlords and housing authorities adopt facial recognition systems marketed as cutting-edge tools to enhance safety and convenience, a new audit by the Government Accountability Office (GAO) finds the technology racing ahead of federal oversight, raising pressing concerns about privacy, accuracy, and tenant rights.

GAO’s audit report marks one of the most comprehensive federal examinations yet of property technology, or “Proptech,” in the rental housing market. While it examined advertising platforms, tenant screening software, and rent-setting algorithms, GAO’s most striking findings center on facial recognition.

GAO auditors concluded that while the systems may reduce trespassing and crime, they also pose risks of discriminatory errors and unchecked surveillance. They faulted the Department of Housing and Urban Development (HUD) for offering only vague guidance to the nation’s more than 3,300 public housing agencies, leaving officials uncertain about what rules apply.

GAO found that property owners and housing agencies see real security benefits in facial recognition. All ten public housing agencies (PHAs) interviewed by GAO said the systems had improved safety by restricting entry to residents and authorized guests. By replacing key fobs or access codes with facial scans, the technology made it harder for outsiders to slip inside subsidized housing complexes.

Industry associations echoed that view, noting the systems could reduce burglaries, drug activity, or unauthorized subletting. The promise of seamless access has made the systems appealing to managers looking to modernize operations and cut costs.

However, GAO emphasized that the very same systems present troubling risks. Civil liberties groups warned that biometric data collection in housing environments carries an inherent danger, saying that once stored, facial images can be misused, shared with law enforcement without consent, or retained long after a tenant has moved out.

Prior GAO audits have documented that commercial facial recognition systems misidentify women and people of color at higher rates than white men. In rental housing, such errors could translate into locked doors, repeated denials of entry, or wrongful suspicion of tenants and guests.

The audit found that public housing agencies are struggling with fundamental questions. Six of the ten agencies GAO interviewed said they did not know how to properly obtain renter consent. Five asked for clarity on how long to store images after a tenant leaves, and one sought guidance on how to address accuracy issues.

In September 2023, HUD issued a brief letter advising PHAs to “balance security with privacy” when deploying surveillance technology. But GAO criticized that communication as insufficient. It provided no concrete direction on consent, retention, accuracy, or data sharing. Nor did it establish uniform standards to prevent potential abuses.

HUD officials told GAO they had no plans to update the letter, citing limited resources and a desire to preserve local autonomy. Developing comprehensive guidance would require surveying thousands of agencies, they argued.

GAO countered that the risks are well documented and could be addressed without such an effort. Federal internal control standards, GAO noted, require agencies to communicate “necessary quality information” to achieve program objectives. HUD, GAO concluded, had fallen short.

The concerns extend beyond technical glitches. Advocacy groups told GAO they fear facial recognition could become another vector for housing discrimination. A system that disproportionately misidentifies Black women, for instance, could subject them to repeated humiliations and reduced access to their own homes. Such disparities echo broader worries about algorithmic bias in tenant screening and rent-setting software.

Civil rights lawyers also caution that surveillance data could be weaponized against vulnerable tenants. If shared with police, immigration authorities, or private investigators, facial recognition records could expose residents to risks far beyond their housing complex. Without clear rules on consent and limits on secondary uses, renters may have little recourse.

These issues intersect with federal fair housing obligations. HUD enforces the Fair Housing Act, which prohibits discrimination in rental housing on the basis of race, color, sex, national origin, religion, disability, and familial status. GAO suggested that without stronger oversight, PHAs risk violating those protections if facial recognition systems exacerbate disparate impacts or enable discriminatory practices.

GAO’s audit highlighted that other federal agencies have taken meaningful steps to regulate Proptech. The Federal Trade Commission and Department of Justice pursued enforcement against deceptive advertising platforms. The Consumer Financial Protection Bureau has acted against inaccurate tenant screening reports. HUD itself has issued guidance on discriminatory online housing ads and, until recently, on tenant screening tools.

Yet, on facial recognition in housing, the federal government has largely stayed silent. No federal agency has issued comprehensive rules governing how landlords or PHAs can deploy the technology. HUD’s single letter remains the only directive, and its lack of specificity leaves PHAs guessing.

This regulatory gap contrasts with rising adoption. Facial recognition is rapidly entering the rental housing market. Cameras at entrances, lobbies, and hallways now feed into AI systems that match faces against stored databases. As vendors court both luxury apartment operators and public housing agencies, the pressure grows for policymakers to step in.

GAO’s central recommendation was straightforward: HUD should provide more specific written directions to public housing agencies on the use of facial recognition. That guidance, auditors said, should spell out permissible uses, define what counts as valid consent, and address data management and accuracy concerns.

GAO argued that by clarifying the rules, HUD could help PHAs avoid violating tenant privacy and civil rights. Without such direction, the risk grows that agencies will implement systems inconsistently, leaving tenants exposed to errors, intrusive surveillance, and potential discrimination.

This vacuum leaves Congress and HUD at a crossroads. Should facial recognition in public housing be embraced as a tool for safety, restricted due to privacy risks, or regulated with strict safeguards? GAO’s message is that inaction is not an option.

GAO warned that the risks of facial recognition are too significant to leave to ad hoc decisions by local agencies. “By providing additional direction on use of facial recognition technology,” GAO stated, “HUD could help PHAs it oversees mitigate privacy and accuracy concerns and offer clarity on key issues such as purpose, consent, and data management.”

As Proptech reshapes the rental market, GAO’s audit underscores that housing is not just another domain for innovation. It is where millions live their daily lives. And in that context, the balance between safety and civil liberties must be carefully drawn.
