Panel explores how to do age assurance in the face of ‘legislative tsunami’
Age assurance technologies have become an issue of global concern, with consequent regulatory disorder. In “Getting Age Assurance Right,” a BBB National Programs Live Event that is now available to view on the non-profit industry self-regulation organization’s YouTube channel, a panel of verification and privacy professionals lays out the challenges facing governments, businesses and vendors across the sector, and offers proposals on how to meet them.
Bailey Sanchez, senior counsel for youth and education privacy with Future of Privacy Forum, says the landscape has changed from a legislative and regulatory perspective, but also in how conversations about privacy and identity are framed.
“I used to say specifically ‘privacy protections’ but that’s no longer the case,” says Sanchez. “I intentionally use this very vague word or phrase of ‘creating protections for kids,’ because sometimes it’s about privacy, sometimes it’s about privacy and safety, sometimes it’s about something entirely different.”
Many people associate age assurance systems with age-restricted products like alcohol, vapes and porn. But there are other threats and factors driving increased regulation, such as the emergence of age-restricted activities like online gambling or social media. Iain Corby, executive director of the Age Verification Providers Association (AVPA), says that beyond age-restricted products, there is the matter of “wider harms protection,” or “what we call the four Cs” – content, conduct, contract and contact. Content could be X-rated or graphically violent videos. Conduct refers to bullying or harassment. Contract is what happens when kids are encouraged to spend real money on video game or in-app purchases.
The fourth C, contact, is the most insidious, says Corby. “It’s one where we’re actually often checking that people are under a certain age, not over an age, because we don’t want adults pretending to be children and then persuading other real children to do nefarious things online, creating what we call new Self-Generated Child Sexual Abuse Material.”
Regulatory complexity leads to confusion, cost and friction
Corby, whose organization includes biometrics firms Yoti, IDVerse and FaceTec among others, says regulatory disharmony creates challenges that could be overcome with better interoperability and more extensive data privacy laws, particularly in the U.S., which does not have one principal federal data protection law governing all states.
“In the UK and in Europe, there is a minimum age of consent, somewhere between 13 and 16,” says Corby, pointing to the UK GDPR and Digital Services Act. “Which is always a little bit of a watch out for my American colleagues, who just assume that COPPA is king and the U.S. Federal law applies to the entire world, and therefore 13 is the only age you need to worry about. No, you would be wrong.”
Corby refers to a “tsunami of legislation” around the world, all of which has slight (or serious) differences. He says that in the last 12 months in the U.S. alone, “there were 144 pieces of state legislation identified which all required age assurance. And the real problem is each one of those has a slightly different way that it wants to get done. Which is horrendous for international, national or global platforms to try and comply, when you’ve got different requirements in Miami and Los Angeles.”
A law such as the Arkansas Social Media Safety Act, which requires companies to use a third-party age assurance vendor, gives wide leeway to what the vendor can do, but restricts the tender to companies whose principal place of business is in the U.S., ruling out competition from international biometrics and digital ID firms.
Further complicating matters is the issue of VPNs, which mask where a user is logging on from.
“It’s very easy with the flick of a switch to transplant myself from being in Brussels to being in New Zealand 20 minutes later by turning on a VPN,” says Corby. “I’ve never seen a piece of legislation that says, ‘we wish to protect children in this particular U.S. state (unless they use your VPN, in which case it’s okay).’ That’s not how the legislation is written. It doesn’t matter whether you are reaching the internet directly, using a VPN or using a piece of string and two paper cups. You are still obliged as a platform where that law is in place to protect children and make sure they’re not on your site seeing adult content.”
Sanchez says an age appropriate design code could help organizations determine what kind and severity of age assurance they need. Both she and Corby agree that a layered approach is needed, and that compliance is both a driver and a differentiator, giving firms with solid international standards certifications an advantage.