Age assurance laws for social media heading to US Supreme Court

U.S. lawmakers are increasingly targeting social media. The recent Supreme Court opinion in Free Speech Coalition v. Paxton validated age checks for adult content at the highest judicial level, and social platforms appear to be next in line. To date, more than a dozen states have passed legislation regulating the use of social media by youth, including Arkansas, California, Connecticut, Florida, Georgia, Louisiana, Mississippi, Nebraska, New York, Ohio, Tennessee, Texas and Utah. More are following, including Virginia.
Virginia law includes time limitation safeguard for underage users
Law360 has a detailed analysis of Virginia bill S.B. 854, which modifies the state’s Consumer Data Protection Act to impose “stringent limitations on minors’ use of social media” starting on January 1, 2026. It is similar to Australia’s so-called social media ban for under-16s, requiring social media platform operators to determine whether users are minors under the age of 16 by using commercially reasonable methods.
If a user is determined to be under 16, they are subject to a social media time restriction of no more than one hour per day, unless a parent consents to an increase – a novel feature in Virginia’s bill.
The bill also prohibits social media platforms from using any information they collect in the age assurance process for any other purpose.
Law360 predicts that “the proliferation of state laws requiring age verification likely will lead to an industry-standard method that is widely used,” but notes that “such a universal method does not yet exist.”
NetChoice brings social media age assurance to SCOTUS
Various laws come with various restrictions and specifics. Some put age verification at the site level, some at the app store level. Some target addictive design; some focus on targeted ads.
Most are fair game for Silicon Valley industry lobby NetChoice, which has deployed its litigation department to challenge age assurance laws for social media as they arise: “to date, social media laws concerning minors in a number of states, including Arkansas, California, Florida, Georgia, Louisiana, Mississippi, Ohio, Tennessee, Texas and Utah have been enjoined following legal challenges.”
Now, once again flexing its litigation muscle, the group has filed an emergency application to the U.S. Supreme Court regarding Mississippi’s I.D.-for-Speech law, HB 1126, urging the justices to reinstate a preliminary injunction it had previously won, but saw overturned last week.
Here, then, is social media’s own Free Speech Coalition v. Paxton: per a release from NetChoice, “the first case to reach the High Court on social media age verification.”
“Free speech is under attack, and NetChoice is fighting back,” says Paul Taske, co-director of the NetChoice Litigation Center. “Social media is the modern printing press – it allows all Americans to share their thoughts and perspectives.”
“Mississippi’s censorship regime would upend the status quo by forcing people to provide their sensitive, personal information just to access fully protected speech online. That is a massive First Amendment violation.”
Culturally, the U.S. is deeply committed to a constitutional ideal of free speech that likely favors Silicon Valley. Social media does not come with the same perceived seediness that colors the porn debate; even if it’s easy to find graphic penetrative intercourse on X, it’s also where many news organizations and public figures continue to communicate. Conversely, no one visits XNXX to find opinions on politics or the NBA draft. Taske’s comparison with the printing press is overblown, but it is true that social media has been integrated into public life in a way that would likely be impossible for porn.
“Courts across the country agree with us,” says the litigator. “NetChoice has successfully blocked similar, unconstitutional laws in other states. We are confident the Supreme Court will agree, and we look forward to fighting to keep the internet safe and free from government censorship.”
Denmark wants more ambitious EU approach to online safety laws for children
Meanwhile in the EU, Denmark is making good on its promise to prioritize online child protection over the next six months, during which it holds the rotating presidency of the Council of the EU. A draft text from the Danish government is being circulated to EU member states ahead of an October meeting of EU telecom ministers, asserting the importance of age-appropriate design.
According to a report from MLex, the text focuses on tightening age checks (and potentially raising the minimum age) for social media and other digital services, and banning addictive design practices at the EU level. It singles out features such as infinite scroll and “streaks,” which encourage users to interact daily with friends by tracking consecutive days of messaging, as well as “certain loot boxes in videogames based on addictive mechanisms.”
Effectively, Denmark argues that the EU is not yet doing enough to protect children from potentially harmful content online, and asks whether measures such as parental controls on devices should form “a part of a broader set of requirements.” It also notes the capabilities of current technical tools for age verification or biometric age estimation, which, if adopted broadly, “could minimize friction and improve consistency across services.”