
Texas puts age verification on app stores despite Apple, Google pushback

Abbott's decision follows a Utah law that also inspired North Dakota legislation
Texas Governor Greg Abbott has signed a law that requires app stores to implement age assurance measures and to obtain parental consent before minors download apps or make in-app purchases.

The law is a blow to Apple and Google, both of which have argued that their app stores are the wrong place for age checks. The Wall Street Journal has reported that Apple CEO Tim Cook personally spoke to Abbott earlier in May to try to convince him not to sign the bill.

On the other side are porn and social media platforms, neither of which wants the responsibility of implementing age assurance; both have argued that age checks belong at the app store or device level.

Apple has floated its own solution to the age assurance problem: a combination of parental consent and restrictions by age range. In February, it published a white paper outlining its stance, and introduced new safety features which require users to enter an age range on new devices. Users under 13 must obtain parental consent to use the ‌App Store‌ and the biometric Face ID feature.

Its Child Accounts feature allows parents to “share information about the age range of their kids with apps to enable developers to provide only age-appropriate content.”

A selling point for Apple is that its system does not require a specific birth date. The company contends that app store age assurance requiring the collection of personal information presents a privacy hazard.

Commenting on the Texas decision, Apple says the law “requires app marketplaces to collect and keep sensitive personal identifying information for every Texan who wants to download an app, even if it’s an app that simply provides weather updates or sports scores.”

Nonetheless, it will soon have to comply with the Texas App Store Accountability Act, as the Lone Star state follows Utah in placing age restrictions at the app store level.

North Dakota allows parents to hold noncompliant sites liable for damage to kids

Lawmakers in North Dakota have also cited Utah's age assurance legislation as a model, as the state passes expanded regulations and penalties relating to pornography and deepfake content.

The North Dakota Monitor reports that Governor Kelly Armstrong signed bills adding age verification requirements for adult content websites, and civil penalties for the creation and distribution of non-consensual deepfake porn.

Senate Bill 2380 and House Bill 1561 both require age assurance for websites containing “a substantial portion of pornographic material that could be considered harmful to minors.” Websites that fail to comply can be held liable for damages by a parent or guardian of a minor who accessed the explicit content, or by a person whose personal information was retained after age verification.

The law exempts internet service providers, search engines, cloud services and application stores from liability.

Politically, the debate is unfolding along familiar lines, as elected officials vie to save the children, while advocacy groups fearing political misuse of private data lean into the First Amendment.

Rep. Steve Swiontek, R-Fargo, says “we have a moral obligation for these kids” and “if we can prevent 90 percent of these things from happening, then it’s been a success. And then, it can be tweaked two years from now as well.”

Cody Schuler, advocacy manager for the American Civil Liberties Union of North Dakota, says age assurance requirements “put undue burden upon those individuals who are legal to access pornography.”

The U.S. Supreme Court is expected to issue its highly anticipated ruling on a similar age assurance law in Texas within the next two months.

Meanwhile, addressing a related digital harm, North Dakota's House Bill 1351 makes it a misdemeanor to “create, possess and distribute sexually expressive images, including real, altered or computer-generated deepfakes, that show nude or partially denuded figures without consent.” Victims can file civil lawsuits to recover up to $10,000 in statutory damages caused by violations of the law.
