US age assurance debate shifts in wake of ruling that social media causes harm

The push to put age-based restrictions on social media has played out in the regulatory arena, but it is being driven by a groundswell of concern from parents and a more complete understanding of how social platforms shape our behavior and where they belong in the cultural tapestry. That social media is harmful for kids is no longer a question of belief; in late March, a California jury made it official, finding Meta and Google liable for damage caused by their platforms, which are designed to be addictive.
In this light, it may become more difficult for Silicon Valley’s giants and their industry lobby, NetChoice, to continue stamping out age assurance laws like bushfires wherever they spring up across the U.S. Even established victories look tenuous, as the turning regulatory and cultural tides reorient the debate away from free speech and the privacy risks of biometrics toward the dangers represented by the platforms themselves.
In February, NetChoice celebrated its legal victory in Louisiana, where a judge granted its motion to strike down the state's Act 456, which imposes age restrictions on social media platforms. The organization called it a “decisive ruling” in favor of the First Amendment, and vowed to continue its crusade against online safety legislation.
Alas, nothing gold can stay. A coalition of 29 states plus the District of Columbia is backing Louisiana's effort to overturn the ruling, filing an amicus brief that asserts “the district court erred in permanently enjoining Louisiana from enforcing the Secure Online Child Interaction and Age Limitation Act.”
Florida Attorney General James Uthmeier, who is leading the coalition, says age restrictions are “a constitutionally valid response to the harm that platforms are causing children.”
Social companies engage in ‘predatory business practices’
U.S. District Court Judge John deGravelles' decision in the Louisiana case hinged on the finding that Louisiana officials “failed to establish that social media causes health harms to minors.” DeGravelles also held that, regardless of harms, the law violates the First Amendment, because “the state seeks to regulate minors’ access to speech on social media platforms.”
Harms will be easier to prove in the wake of the California case, leaving the First Amendment to bear the full weight of NetChoice’s case. The coalition argues that “even if the Act triggers First Amendment scrutiny, NetChoice is still wrong that it is entitled to a permanent injunction because the Act regulates conduct, not speech, and at most imposes only incidental burdens on expression.”
On a larger scale, however, changing views of what social media fundamentally is may change how the First Amendment applies to it. The Florida-led coalition says age verification laws are a reasonable response to social platforms’ sinister intentions, calling Louisiana’s law “a quintessential consumer-protection law: it protects Louisiana’s most vulnerable citizens from predatory business practices.”
Per the brief, “rather than scale back their predatory business practices – which range from confusing account-holder contracts to design features that manipulate kids into compulsively using social media – tech companies have doubled down on their nothing-to-see-here approach. They refuse access to their data, bury evidence of the harm they are inflicting on children, and battle regulation at every turn.”
NetChoice has argued that putting age checks on social media sites is comparable to checking someone’s age before they can use the library. But what if social media becomes, in the public mind, less like a library and more like a crack den?
NetChoice is expected to file a response with the 5th Circuit by May 26.
Discord in Florida over age assurance law HB 3
Florida’s Attorney General and Big Social are also at odds on home turf. Florida’s law prohibiting kids under 15 from creating social media accounts went into effect last month, and Meta is set to begin removing minors’ accounts in May.
In recent media appearances, Uthmeier – a former chief of staff to Florida Governor Ron DeSantis – has warned that companies will be fined $50,000 per violation, and threatened to pursue “heavy damages” that could swell to billions if platforms do not come into compliance. But he hopes it doesn’t come to that.
Social media companies, he says, “know that kids are suffering on these applications. They know the predators are getting to kids. So, we’re encouraging companies, ‘Come in, sit down. Let’s work together. Let’s protect our kids at all costs.’”
Uthmeier has directed extra ire at Discord. A report from My Florida Legal says the AG has opened a civil investigation and subpoenaed the company, demanding documents and information related to its marketing to children, enforcement of age verification requirements, content moderation, parental control features and reporting.
“Many of our criminal investigations into internet child predators lead to one place: Discord,” Uthmeier says. “Groomers and predators seem to believe that they can get away with targeting children on Discord – and we are going to find out why. Discord owes us an explanation on the overwhelming use of its platform among predators, and what they are doing to protect children.”
Florida man on mission to follow UK, Australia age laws
Uthmeier’s deep dive into the harms of social media also has indirect political significance. The Republican-appointed attorney general appears to be working from the same playbook as UK regulator Ofcom and Australia’s eSafety Commissioner – both reviled by certain U.S. leaders for purportedly censoring U.S. companies. Florida’s crackdown suggests that, in the end, it may not matter what Big Tech thinks about foreign regulations on their platforms, because there will be homegrown regulations that do the same thing.