January 31, 2017 -
A New York federal judge rejected a proposed class action lawsuit against Take-Two Interactive Software that asserted that the video game developer was collecting facial biometric data from users, according to a report by The Hollywood Reporter.
The lawsuit involved the company’s “MyPlayer” avatar feature in its NBA 2K15 and NBA 2K16 games, which allows users to create personalized digital avatars through face-scanning technology.
The feature uses cameras connected to PS4 and Xbox game systems to scan the user’s face and head.
To complete this process, users are required to agree to terms and conditions informing them that the face scan will be visible to others. The company allegedly stores the biometric data on its servers for an indefinite period of time.
The lawsuit stated that the plaintiffs suffered injury relating to privacy concerns and to engaging in future biometric-facilitated transactions, and claimed that Take-Two violated the Illinois Biometric Information Privacy Act (BIPA).
In addition, the suit alleged that Take-Two failed to obtain the plaintiffs’ informed consent, and that the plaintiffs did not fully understand Take-Two’s practices relating to biometric data, such as how the facial scans would be retained and disseminated.
U.S. District Judge John Koeltl dismissed the plaintiffs’ complaint with prejudice, stating that they failed to establish an imminent risk of harm from the game developer’s storage and circulation of the biometric data.
As part of his ruling, Judge Koeltl said that while biometric identifiers like face scans pose a potential risk because they cannot be changed, the plaintiffs’ concerns are “highly speculative and abstract.”
“At best, more extensive notice and consent could have dissuaded the plaintiffs from using the MyPlayer feature, meaning that Take-Two would have never collected the plaintiffs’ biometrics,” Judge Koeltl wrote. “But the plaintiffs have failed to establish that their use of the MyPlayer feature resulted in any imminent risk that the data protection goal of the BIPA would be frustrated. Consequently, more extensive notice and consent could not have altered the standing equation because there has been no material risk of harm to a concrete BIPA interest that more extensive notice and consent would have avoided.”
As previously reported, Facebook has argued that the Illinois biometrics law, which restricts the interstate sharing of facial recognition data, violates the U.S. Constitution.