Biometrics, age-appropriate design on the mind of UK Information Commissioner

UK Information Commissioner John Edwards is on a bit of a speaking tour, having recently addressed Privacy New Zealand – a country where he spent eight years serving as privacy commissioner – and the International Association of Privacy Professionals (IAPP)’s Data Protection Intensive UK 2025.
His message is a mix of human-centric thinking and a warning to services that flout the rules, guided by the principle that “data protection is not just about computers, numbers and legislation – it’s about people.”
Edwards tells the IAPP that “since I arrived in the UK and took up this role in 2022, my aim has been to regulate for outcomes, not outputs. I’m interested in real impact, real change, rather than products, things that might be easy to count, but that ultimately don’t improve the lives or experiences of people in the UK.”
He cites real scenarios in which irresponsible data handling leading to data breaches can have grave consequences: “Imagine a person fleeing a violent domestic relationship, only to have their new address accidentally shared with their abuser.” Rather than a mere “admin error”, Edwards wants to frame these mistakes within a wider social context, notably in how they impact people in vulnerable positions.
He believes “the public should not be expected to have to read reams of legalese in a privacy notice to understand what’s happening to their information,” that “they should not be kept in the dark about where their information’s going and what it’s being used for,” and that “they should not be coerced or forced into handing over their information without being fully informed about the consequences.”
While consumers can play a role in the larger task of data protection, Edwards says “there is a limit. It’s the responsibility of the whole organisation, from the C-suite down, to keep things simple and accessible for their users.”
ICO fires ‘warning shots’ with investigations of major social platforms
To illustrate the role of the Information Commissioner’s Office (ICO), Edwards points to its ongoing investigations into TikTok, Reddit and Imgur.
“Our role is to ask questions on behalf of young people, so they don’t have to make complex decisions about how they transact with their personal information in an asymmetric information relationship,” Edwards says.
“How do these platforms protect children’s personal information? How do their recommender systems work, how are they using a 13 year old’s preferences, viewing habits, shares and likes to serve them content, and keep them on the platform? And, in the case of Reddit and Imgur, how do they assess the age of their users and tailor their content accordingly?”
From age assurance to consent in the collection of personal data, “children shouldn’t have to figure this out by themselves. It’s not their job.”
“So we’ve stepped in, because it’s our job as the UK’s data protection regulator to hold these platforms to account. If social media and video sharing platforms want to benefit from operating in the UK, they must comply with data protection law.”
The decision to target three of the biggest social media and video sharing platforms is intentional – and definitely not a signal for smaller businesses to ignore the law.
“I want to make it very clear,” Edwards says. “By focusing our efforts on some of the largest, most well known platforms, we are not giving smaller companies a free pass to adopt or continue unlawful practices. Instead, last week’s announcement should serve as a warning shot. All organisations using children’s data, or who offer products or services aimed at young people, need to comply with the law and conform with our children’s code.”
“Get your own house in order. You shouldn’t wait for the regulator to come knocking on your door before checking your processes.”
Biometrics, AI still in regulatory spotlight
Edwards says the ICO’s focus on AI and biometrics continues. “I’ve asked my team to look at foundation models, automated decision-making in recruitment and by government and how police forces use facial recognition technology. We’ll be closely scrutinising any proposed deployment of predictive profiling that could affect people’s rights.”
“We’re also looking ahead at the technologies and innovations that are likely to burst onto the scene in the next two to seven years.”
Former NZ privacy commissioner touches on age assurance, FRT
Edwards’ address to Privacy New Zealand focuses on the ICO’s work in children’s online safety.
He says the age appropriate design code, which he “inherited,” has “directly led to a huge number of improvements in the safety of a number of the most popular platforms and digital services available to children.”
“One of the largest platforms told me that as a result of the code and our work, they made more changes to their products than they did when the GDPR came in, and that’s huge.” He cites wins like turning off geolocation for kids by default and turning off notifications after bedtime.
Edwards reiterates that the ICO’s priorities for the year are children online, AI and biometrics – regarding the latter, “there’s a real rush to market, which means that there are risks and opportunities and we need to stay ahead of those” – and, finally, online tracking and advertising.
He mentions policy work consulting on papers addressing “some of the personal information implications of generative AI and how large language models are trained.”
Edwards has fines at his disposal, if needed. But he stresses that real change takes time.
“I’ve started to use the example that one of the first decisions I took in my job here in 2022 was to confirm a fine my predecessor proposed against Clearview AI, which you know is the biometric facial recognition matching service used by law enforcement authorities based on scraped images from around the web.”
Clearview appealed that fine and won, forcing the ICO to seek its own appeal. “It took the tribunal a year to respond to our request for an appeal,” Edwards says. “We’ve now got a date for that appeal. I’m three and a half years into my term now; there is no chance that that matter will be determined before my term ends.”
Edwards also makes brief reference to Australia’s social media ban for users under 16. Once again, his human-centric perspective guides his thinking.
“I think one of the most sensible things I’ve heard on this subject came from a real advocate who’s sometimes a pretty harsh critic of my office and me personally, and the role that we play.”
Edwards is referring to Baroness Beeban Kidron, whom he calls “really the author of the kids code in many ways.”
“She was asked about this issue of a ban and her view I think was really sensible. She said it’s better to make these technologies safe for children than to try and ban them. And I think that’s right.”
That said, parental oversight only goes so far. “The thing is I wonder, you know, we expect that parents will not give their children cigarettes right? But we also make it a crime for retailers to sell children under 18 cigarettes. And I think there is an analogy there.”
Article Topics
age verification | biometric data | biometric identifiers | biometrics | children | data privacy | data protection | facial recognition | Information Commissioner’s Office (ICO)