Company’s ‘enemies list’ enforced with facial recognition prompts legal challenge, debate
Madison Square Garden’s use of facial recognition to enforce an ‘enemies list’ has sparked a legal battle and growing scrutiny of both the deployment and the technology itself.
Attorneys barred from MSG and other venues owned by the same parent company argue that a New York law prohibiting theatres from refusing entry to critics also precludes the ban against them, as explained in a New York Times podcast.
An early court victory for the attorneys prompted an immediate appeal, but their suit is not the only avenue through which the practice could be challenged in court.
New York State Representative Tony Simone says he was “disinvited” from an event by MSG as well, after criticizing the hospitality group’s use of face biometrics. Now a group of state and city lawmakers has called for MSG’s liquor license to be revoked, Hell’s Kitchen-based outlet W42ST.nyc reports, though Simone told a morning show on local radio station WOR that the company’s owner should simply behave like an adult.
The lawmakers also noted in a letter to Madison Square Garden Entertainment Corp. CEO and Executive Chairman James Dolan that he could be putting a $43 million tax abatement at risk, and that the deployment could harm free speech and non-discrimination rights.
In comments shortly after the blacklisting of the lawyers became public, International Biometrics + Identity Association (IBIA) Managing Director Robert Tappan noted that the deployment departs from the technology’s intended use and flies in the face of industry efforts to encourage its responsible use.
Oberly weighs in
Biometric Update reached out to Squire Patton Boggs Senior Associate Attorney David J. Oberly, who specializes in biometric data privacy issues, to get some insight into the legal landscape, and how it may change in the wake of the controversy generated by MSG.
His response is reprinted here in full:
“The use of facial recognition by MSG Entertainment for surveillance purposes clearly illustrates the divide between, on one end of the spectrum, uses of this technology for surveillance or identification purposes and, on the other end of the spectrum, uses relating to the authentication or verification of individuals’ identities,” Oberly writes.
“Consumers have recently become more comfortable with, and accepting of, the use of facial recognition for authentication or verification purposes. In particular, the use of this technology as a method of access control has grown significantly in popularity, which can be attributed to the range of benefits that biometrics have to offer as compared to more traditional access control methods, such as passwords or badge swipe cards. Conversely, consumers continue to possess significant concerns regarding the privacy implications of facial recognition as it relates to identifying individuals without consent, especially as it relates to real-time public surveillance.
“At the same time that facial recognition has become a more ubiquitous part of daily life, lawmakers have also sharpened their focus on enacting regulation to govern its use and, more specifically, to thwart potential improper uses of the technology.
“In late 2020, Portland, Oregon became the first U.S. jurisdiction to enact a blanket ban over the use of facial recognition by the private sector. Of note, the Portland ordinance’s prefatory materials make clear that in enacting the ordinance, lawmakers were primarily concerned with the use of facial recognition for surveillance purposes within physical spaces and its corresponding potential risks for misidentification and misuse.
“In addition, New York City has also enacted a municipal-level ordinance regulating the use of biometrics-powered technologies by ‘commercial establishments,’ including retail stores, places of entertainment, and food and drink venues. While the New York City ordinance governs the use of all types of biometric technologies, it is evident that legislators in the Big Apple also had facial biometrics in mind as the primary focus of this regulation and, more specifically, concerns regarding technological limitations and the improper identification of individuals when used for surveillance purposes, as well as privacy-related issues, such as the fact that individuals cannot reasonably prevent themselves from surveillance and being identified by cameras that could be placed anywhere.
“At the federal level, the Federal Trade Commission (FTC) has also made policing facial recognition a priority focus for the nation’s de facto privacy regulator. In early 2021, the FTC settled its first enforcement action specifically targeting improper facial recognition practices with now-defunct photo developer Everalbum, Inc. More recently, in August 2022 the FTC reemphasized the priority focus it has placed on policing facial recognition with the issuance of its Advanced Notice of Proposed Rulemaking (ANPR) on ‘commercial surveillance’ practices, a large portion of which focuses on issues relating to facial recognition and whether the FTC should promulgate new rules to regulate or otherwise limit the use of this advanced technology. In particular, the FTC has indicated that it is considering whether to limit commercial surveillance practices that use or facilitate the use of facial recognition and, if so, how regulation of this nature should be fashioned and implemented.
“With that said — despite the sharp uptick in regulation geared towards facial biometrics — only Portland’s private sector facial recognition ban would have applied to bar MSG Entertainment’s facial biometrics surveillance practices. Moreover, it appears that MSG Entertainment was compliant with New York City’s commercial establishments ordinance when it used its facial recognition system to target and remove certain individuals on its ‘exclusion list’ from Madison Square Garden and Radio City Music Hall, as the organization maintained a public notice — required by the ordinance — informing individuals that: ‘To ensure the safety of everyone in our venue, Radio City Music Hall employs a variety of security measures; including Facial Recognition which uses Biometric Identifier Information.’
“Taken together, the recent developments that have come to light regarding MSG Entertainment and its use of facial recognition surveillance tactics — which demonstrate how the current patchwork of biometric privacy statutes and ordinances fail to curb these kinds of tactics — may very well lead to an increase in regulation over the use of this technology in 2023. This is especially so given the substantial amount of negative publicity that MSG’s controversial use of facial biometrics has garnered, as recent sustained negative news coverage detailing the company’s surveillance practices will put significant pressure on lawmakers to make greater facial recognition regulation a reality sooner than later.
“In particular, the events involving MSG Entertainment illustrate the shortcomings and lack of teeth underlying more general biometric privacy regulation, especially as it relates to curtailing improper facial recognition surveillance practices. This may encourage lawmakers to shift their focus from attempting to enact BIPA-like laws to pursuing outright bans on certain uses of facial biometrics, similar to Portland’s blanket ban on facial recognition. Already in 2023, New York has introduced legislation (AB 322) that seeks to impose a blanket ban on the use of facial biometrics by landlords that operate in the state. It is reasonable to posit that legislatures in other parts of the country may introduce their own legislation geared toward outright bans on surveillance tactics during the 2023 legislative cycle.
“At the same time, Maryland (HB 33), Mississippi (HB 467), and New York (AB 1362) have all introduced broad biometric privacy bills, similar to BIPA, CUBI, and Washington’s HB 1493 biometric privacy statute, that regulate the collection and use of all forms of biometric data. As a result of the MSG surveillance incidents that have recently come to light, it is now more likely that these states may find success in pushing their BIPA-like bills past the finish line and into law. At the same time, these developments may also provide strong encouragement to lawmakers contemplating the prospect of enacting robust regulation over the use of this technology in other parts of the country — but who have not yet introduced legislation and who lack an appetite for passing an outright ban — to push forward with strict regulation paralleling that of Illinois’s biometric privacy statute.
“Taken together, with the increased likelihood that other jurisdictions will enact targeted facial recognition laws of their own in 2023, it is imperative that companies utilizing this technology devote the necessary time, effort, and resources not only to comply with the laws that are currently on the books, but also with the additional laws that are anticipated to be enacted in the near future.”
best practices | biometrics | data privacy | facial recognition | New York City | video surveillance