US State AGs target social media, porn platforms over kids’ online safety compliance

It’s the week before Christmas, and all through the States, the lawsuits are flying like tossed dinner plates. Indiana Attorney General Todd Rokita is suing dozens of porn websites for alleged noncompliance with the state’s age verification law. Tennessee Attorney General Jonathan Skrmetti is suing Roblox for promising safety while luring kids into a harmful environment. Iowa Attorney General Brenna Bird is suing Roblox, too, accusing the platform of enabling child sexual exploitation. A court in Massachusetts is hearing arguments alleging that Meta designed features on Facebook and Instagram to addict young users. As many as 29 states are ramping up pressure to fold the various lawsuits filed on similar grounds into a single, unified action.
Regulatory and legal tides turned against social platforms in 2025, which may end up being flagged as the official Year Facebook Died. With a decade and a half of evidence behind them, researchers have clearly documented how social media has caused harm to young users. The damage comes in a variety of flavors: algorithmically curated feeds, influencer psychology, rampant misogyny, political disinformation, mechanics modeled on slot machines. The problem is not going away, and governments have realized that some version of the same regulatory measures placed on so-called adult content – pornographic websites – is needed for social media.
The massive companies that control social media will not go gently into any kind of limitation. In the U.S., the legal lobby group NetChoice has brought Meta’s fight to various pieces of age assurance legislation, brandishing the First Amendment like a cutlass, and showing no sign of slowing down.
Indiana devotes significant gov’t resources to accessing porn
Meanwhile, Aylo, the operator of some of the largest porn sites, has made good on compliance promises despite its losses in court, blocking access in states that enact age verification laws. Now, however, Indiana, which passed its law in 2024, says disappearing from the state isn’t good enough, and is suing Aylo on the grounds that it made “false and misleading statements regarding the accessibility of the pornographic websites by Indiana residents” – because those sites can still be accessed through the use of a Virtual Private Network (VPN).
According to a report from Reason Magazine, the suit from Attorney General Rokita seeks “injunctive relief, civil penalties, and recovery of costs incurred to investigate and maintain the action.” And it wants Aylo to block VPN use.
The author of the article, Elizabeth Nolan Brown, reads the idea as it should be read. “This is an insane – and frighteningly dystopian – interpretation of the law.” As with many dystopian scenarios, it also comes with a dose of comedy. Evidence for the suit shows Indiana state employees taking to the VPN cubicle with a box of tissues to test the limits of Aylo’s block, alongside other porn sites. The investigators made the rounds, accessing content on Brazzers, Redtube (both Aylo products), FakeTaxi, Spicevids, Letsdoeit and other sites, clearly aiming to be as thorough as possible.
Moreover, says the complaint, “although Defendants have supposedly been restricting access in Indiana by blocking Indiana IP addresses since around June 27, 2024, Defendants know that its adult oriented websites continue to be accessible by consumers located in Indiana, and Defendants continue to track those Indiana users. On February 14, 2025, Defendants posted a Valentine’s Day 2025 graphic showing Pornhub’s top relative searches in certain states, which included Indiana.” (For the record, the embedded graphic lists Indiana’s favorite search term as “queef.”)
Indiana’s interpretation of its law will surely meet Constitutional challenges from both porn operators and digital rights groups. Reason quotes David Greene of the Electronic Frontier Foundation (EFF). “What the state’s lawsuit seems to be doing is saying that Aylo deceived Indiana consumers when it said it was geoblocking Indiana users from its sites when it knew that VPN users might be able to evade that geoblock,” Greene says. “It essentially bases liability on the failure to accomplish impossibilities.”
Roblox continues to try to wave away hellscape allegations
Roblox has become a favored target for litigation, fueled by allegations that the social gaming platform has turned a blind eye to grooming and child sexual exploitation on its site. In a report from WSMV, Tennessee’s AG Skrmetti says the company violates the Tennessee Consumer Protection Act (TCPA) and calls it “the digital equivalent of a creepy cargo van lingering at the edge of a playground.”
“Roblox invites children into a fantastic online world with the promise of creativity and play, but that wonderland is a trap that lets the company sell sophisticated predators access to those vulnerable kids,” says the AG. “Roblox worked to reduce oversight and child safety resources despite repeated warnings, because less overhead meant more profit. And the whole time, the company lied and said safety was its top priority.”
A response from Roblox’s Chief Safety Officer Matt Kaufman says Tennessee’s lawsuit “fundamentally misrepresents Roblox and how it works.”
“Roblox is built with safety at its core, and we continue to evolve and strengthen our protections every day. We have advanced safeguards that monitor our platform for harmful content and communications. Users cannot send or receive images via chat, eliminating one of the most prevalent opportunities for misuse seen elsewhere online. Safety is a constant and consistent focus of our work, and we are currently rolling out additional measures to further limit who users can chat with.”
The company has tried hard to frame itself as a leader on child safety, implementing biometric facial age estimation from Persona to safeguard its chat feature. But it has a long way to go to convince lawmakers. Iowa’s lawsuit against the platform has AG Bird calling Roblox “a breeding ground for sexual predators” and saying “Iowa’s children are paying the price,” according to the Des Moines Register.
Iowa wants a judge to enforce stronger safety protections or potentially ban Roblox from the state. The attorney general is also seeking restitution and civil penalties of up to $40,000 per violation.
Even if the company were to assuage Iowa’s attorney general, it faces a long grind through the machinery of the U.S. court system, with at least five states pursuing action – and more to come, as South Carolina looks to join the list. According to Reuters, last week a federal judicial panel consolidated nearly 80 lawsuits accusing Roblox of knowingly facilitating child sexual exploitation, to be heard together before a federal court in San Francisco.
Meanwhile, it continues to plead its case, pointing to Persona’s age estimation as an example of how it continues to improve. “Just like in the physical world, we believe conversations that happen between older teens may not be appropriate for adults and vice versa,” says a company blog. “So, with the launch of age checks, we are also limiting communication between adults and minors. We’ve partnered with child development experts to help us define common-sense age groups that can chat with each other on Roblox.”
Besides which, it says, age checks aren’t everything: parental controls, its Trusted Connections feature (which also uses Persona’s FAE), and its Community Standards all contribute to the overall online safety picture.
“Age checks help us provide all of our users with age-appropriate experiences and communication, but we know there’s always more to be done. We’re continually optimizing and adding layers to our larger system as new technology becomes available. We’ve released over 145 safety enhancements this year alone, and we have more to come.”
2026 marks five years since first report on Instagram harms
If Silicon Valley has a figurehead, it is surely Mark Zuckerberg, the CEO of Meta. (At least since Steve Jobs died.) Meta has weathered the litigation storm better than most, mainly thanks to its legal engine, NetChoice, the group that also represents X, Snap, Reddit, YouTube and other behemoths.
Nonetheless, its defenses can only withstand so much time in court, and the voices mounting against it are growing louder. PBS reports that, earlier in December, Massachusetts’ highest court heard oral arguments in the state’s lawsuit blaming Meta for purposefully making its products addictive to kids.
“We are making claims based only on the tools that Meta has developed because its own research shows they encourage addiction to the platform in a variety of ways,” says State Solicitor David Kravitz.
Meta, of course, disagrees, asserting that it is protected by the First Amendment. However, as the specific features that cause (and were designed to cause) harm come into clearer focus, free speech could become a less reliable crutch. The issue for Massachusetts isn’t what one is allowed to say, but how Meta engineers its platforms to bring people to that content, and keep them there.
“It’s not how to publish but how to attract you to the information,” says Justice Scott Kafker. “It’s about how to attract the eyeballs. It’s indifferent to the content, right? It doesn’t care if it’s Thomas Paine’s ‘Common Sense’ or nonsense. It’s totally focused on getting you to look at it.”
The first credible reports that Meta knew its products were bad for kids surfaced in 2021, in the Wall Street Journal. That investigation focused on harms to teenage girls, in the context of eating disorders, social pressure and suicide. Since then, the world has seen plenty of what social media addictions can do to boys, too. And the evidence keeps coming.