CDT tears down straw man to oppose age verification
A coalition of lobby groups and academics has offered up a straw man argument against age verification to oppose the online pornography restrictions in Texas’ HB 1181. A companion bill containing social media restrictions has already been defeated in court.
Age assurance systems, from submission of scanned ID documents to biometric age estimation, are often inaccurate, easy to circumvent and inaccessible to certain groups, and they introduce privacy and security risks, according to the Center for Democracy & Technology, along with the Open Technology Institute at think tank New America, The Internet Society and three university professors.
The amicus brief in Free Speech Coalition v. Paxton cites an ITIF analysis in suggesting that “there are many workarounds” to age verification based on ID documents. The brief argues that this method, along with temporary payment card charges, analysis of data from third-party databases, biometric scanning and first-party signals analysis, is ineffective and detrimental to security and privacy.
The analysis refers to several ways children might attempt to circumvent the block, each of which can be mitigated or prevented with other technologies, such as “holding up a photograph or image of another person’s face to the smartphone camera or webcam.” The technologies for preventing these workarounds, like presentation attack detection in the selfie biometrics example, are standard issue for age verification solutions, but are mostly ignored in the lobbyists’ letter. The one mention of PAD in the brief refers to a study published in 2014 to argue that the technology does not work on children.
Given that most of the publicly available testing of age verification, liveness detection and biometric accuracy was completed much more recently, the position appears disingenuous. Indeed, the brief cites NIST research showing that biometric age estimation is potentially as effective as checking the date of birth field on an ID document, which is how pornography access restrictions are currently carried out across the U.S., but it studiously avoids noting that conclusion.
The passage also refers to an anecdote from a paper that depicts an adult being estimated as an adult while holding a dog in front of his face. Statistics from the 2018 Gender Shades study are cited via a 2024 research paper, in another familiar obfuscation technique.
Comments from ConnectID Managing Director Andrew Black on the discussion of social media restrictions in Australia and around the world are instructive here.
“It’s surprising to see that many are primarily concerned with the technical feasibility, imagining it to be too complex,” Black writes.
“In fact, the technology to enable age restrictions already exists and is relatively simple. The bigger question lies in how we apply it while safeguarding privacy and offering real choice.”
That question, however, has an answer: “We already have tools in place to solve this while protecting privacy,” according to Black.
Algorithm opt-out approved in California
California Governor Gavin Newsom has approved legislation to give parents the right to block social media platforms from presenting content to their children based on an algorithm, but vetoed a bill to force browsers to build in opt-out mechanisms for data sharing.
The Protecting Our Kids from Social Media Addiction Act, S.B. 976, allows parents to tell social media platforms to serve content to their children in chronological order.
A.B. 3048 would have required mobile operating system and web browser developers to build in a setting with which users could inform businesses that they are opting out of allowing their personal data to be sold and shared. The inclusion of mobile operating systems is the sticking point for Newsom, who notes that most browsers already include the feature, while operating systems do not. This raises a design question Newsom says is better left to developers than regulators.
Meanwhile at the federal level, the Kids Online Safety Act (KOSA) and the Children and Teens’ Online Privacy Protection Act (COPPA 2.0) have each been approved by the House Committee on Energy and Commerce.
Each was modified, which could erode support as they head to a House vote, The Verge reports.