Kids Code bills prompt epic showdown between regulators, activists and big tech firms
The latest craze sweeping the United States – legislation to protect kids’ data and overall online safety – has its own snappy epithet. The Guardian reports on the so-called “Kids Code” bills popping up in multiple state legislatures, the latest of which recently passed in Maryland by unanimous vote. The full list of states reads like a fellowship of age-appropriate design: Maryland, plus Vermont, Minnesota, Hawaii, Illinois, New Mexico, South Carolina and Nevada.
But every fellowship has its Nazgûl, and in this case the two sides warring for moral control of the internet involve some atypical partnerships. Social media companies are pushing back against the legal wave alongside porn distributors and civil rights advocates, who say age verification rules risk violating the constitutional rights of law-abiding adults. For the social media firms, however, it may be less a matter of ethics and more about not wanting to enforce age policies that would limit their massive user bases – all of which have been established under relatively lax verification standards.
Making matters even more complicated is the assertion by critics that social media platforms should not have to verify users’ age with ID or biometric verification because they already know a user’s age, as proven by targeted advertising. The argument is summed up tidily by a representative from the Tech Oversight Project, quoted in the Guardian: “Social media companies’ business models are based on knowing who their users are.”
Age verification laws repeat past mistakes, says ACLU
In a legal petition to the Supreme Court concerning what it alleges is an “unconstitutional age-verification provision in Texas’s HB 1181,” Vera Eidelman, staff attorney with the ACLU Speech, Privacy and Technology Project, argues that the legislative panic over kids accessing adult content is an overreaction with historical precedent.
“This isn’t the first time that concerns about minors’ access have led legislators to pass unconstitutional laws,” reads the statement from Eidelman. “We’ve gone through this time and again, with everything from drive-in movies to video games to websites, and courts have repeatedly struck down laws imposing requirements that burden adults’ access to non-obscene sexual content in the name of protecting children.”
Regulators play David against big tech goliath NetChoice
Tactics employed by social media firms have not done much to dial down the tone. Lobbyists have posed as concerned parents in court without disclosing their affiliations. State disclosure forms reveal that big tech companies spent more than $243,000 in lobbying fees in Maryland in 2023, with Google spending $93,076, Amazon $88,886 and Apple $133,449. NetChoice, the industry lobby group representing the firms, has its own set of proposed solutions that would eliminate the need for identity verification, most of which shift the onus of protecting kids away from the platforms themselves. The Tech Oversight Project has observed “a clear and accelerating pattern of deception in anti-Kids Code lobbying.”
It makes for what John Carr, Secretary of the UK’s Children’s Charities’ Coalition on Internet Safety (CHIS) and a noted authority on young people’s use of the internet, calls “an exceptionally uneven playing field.”
Speaking on the fifth and final day of the 2024 Global Age Assurance Standards Summit, Carr says NetChoice frames its mission as “to defend free enterprise and free speech on the internet.” Carr disagrees. “The only thing that NetChoice actually does is take to court every single federal, state or city piece of legislation that tries to introduce any kind of regulation on anything connected with the Internet. And the reason for that is very straightforward – they don’t mind if they lose. But if they delay the process by five years, four years, six years, and the status quo is maintained, that’s money in the bank.”
CNIL offers an alternative to third-party solutions that come with data risk
While third-party vendors are the simplest and most accessible solution for age verification, they come with privacy risks – a problem that France’s National Commission on Informatics and Liberty (CNIL) aims to solve with a more privacy-preserving design.
In an interview with Scientific American, Olivier Blazy, a computer scientist and professor at the École Polytechnique in France who worked on CNIL’s age verification scheme, says the regulator’s system creates something like a firewall between the content provider and the verification service. “The only information the content provider gets is a yes or no about whether a user is aged 18 or older,” Blazy says. “The only information the age verifier gets is that someone has sent an age-verification request.”
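For readers curious what such a “double-blind” exchange might look like in practice, the sketch below models the flow in Python. The class names, token format and shared-secret signing are illustrative assumptions chosen for brevity, not CNIL’s actual protocol; a production scheme would rely on asymmetric signatures or zero-knowledge proofs so the verifier and the content provider never need to share keys.

```python
"""
Minimal sketch of a double-blind age check in the spirit of the CNIL design
described above. All names and the token format are illustrative assumptions,
not CNIL's actual protocol. A real deployment would use asymmetric signatures
or zero-knowledge proofs rather than the shared-secret HMAC used here.
"""
import hashlib
import hmac
import json
import secrets
import time


class AgeVerifier:
    """Checks the user's age (e.g. against an ID document) and issues a
    signed token saying only 'over 18: yes/no'. It never learns which
    site the token will be shown to."""

    def __init__(self, signing_key: bytes):
        self._key = signing_key

    def issue_token(self, user_is_over_18: bool) -> str:
        claim = {
            "over_18": user_is_over_18,
            "nonce": secrets.token_hex(8),   # makes each token single-use and unlinkable
            "issued_at": int(time.time()),
        }
        payload = json.dumps(claim, sort_keys=True)
        tag = hmac.new(self._key, payload.encode(), hashlib.sha256).hexdigest()
        return f"{payload}|{tag}"


class ContentProvider:
    """Accepts the token and learns only a yes/no answer -- no identity,
    no ID document, no biometric data."""

    def __init__(self, verification_key: bytes):
        self._key = verification_key

    def admit(self, token: str) -> bool:
        payload, tag = token.rsplit("|", 1)
        expected = hmac.new(self._key, payload.encode(), hashlib.sha256).hexdigest()
        if not hmac.compare_digest(tag, expected):
            return False                      # forged or tampered token
        return json.loads(payload)["over_18"] is True


# The user acts as the go-between: the verifier never sees the site,
# and the site never sees the user's identity.
key = secrets.token_bytes(32)
token = AgeVerifier(key).issue_token(user_is_over_18=True)
print(ContentProvider(key).admit(token))      # True
```

The key design point, as Blazy describes it, is that each party holds only one half of the picture: the verifier sees an age-check request with no destination attached, and the content provider sees a yes-or-no attestation with no identity attached.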
Article Topics
ACLU | age verification | biometrics | children | CNIL | legislation | regulation | social media | United States