Canada’s Privacy Commissioner says TikTok collects kids’ data without reason

It’s becoming increasingly clear that social media is at a tipping point. Having captured a huge share of the internet and people’s attention globally, with the major platforms boasting billions of accounts, these superstructures are beginning to quake under their own weight.
The cracks appeared long ago. The platforms have provided a vehicle for misinformation and disinformation that has radically polarized politics. They have pulled social interaction and public debate onto private, data-guzzling platforms. They make our kids miserable and addicted, using crafty engineering and design. They enable new forms of harassment and bullying. And they are increasingly controlled by billionaires in thrall to the most deeply partisan and authoritarian U.S. administration in living memory.
All of this has brought social media increasing attention from regulators. Australia is preparing to enforce age assurance laws for social media, with other regions, including the EU, looking to follow. Legal battles have erupted on multiple continents, as the lobby for Big Tech litigates emerging age check legislation, Whack-a-Mole style. For all of its resources, it can hardly keep up, as more and more lawmakers take a magnifying glass to social platforms’ policies, practices and values. This week, UK Liberal Democrat leader Sir Ed Davey urged British media regulator Ofcom to prosecute X owner Elon Musk, branding him a “criminal.”
TikTok not doing enough to keep kids off platform: OPC
TikTok has had its share of disruption already, as the U.S. and China barter back-and-forth over the sale of the Chinese platform to U.S. owners. It has been deemed a threat to national security, not to mention a place where youth tell other youth to eat laundry detergent, chug Benadryl and abandon footwear. (This week, TikTok had large swaths of people believing the Biblical Rapture was set to happen on Tuesday; it did not.)
Now, an investigation by the Office of the Privacy Commissioner of Canada has found that TikTok has not done enough to stop kids from using the app, or to protect their personal data.
A statement from Privacy Commissioner Philippe Dufresne lays out the findings. “Despite the fact that the application uses the information that it collects, including biometric information, to estimate users’ ages for its own business purposes, our investigation found that the measures that TikTok had in place to keep children off the popular video-sharing platform and to prevent the collection and use of their sensitive personal information for profiling and content targeting purposes were inadequate,” Dufresne says.
“This investigation also uncovered the extent to which personal information is being collected and used, often without a user’s knowledge or consent.”
The joint investigation, which also involved the Commission d’accès à l’information du Québec, the Office of the Information and Privacy Commissioner for British Columbia, and the Office of the Information and Privacy Commissioner of Alberta, set out to “examine whether TikTok Pte. Ltd.’s collection, use and disclosure of the personal information of individuals in Canada through its social media platform complied with federal and provincial private sector privacy laws.”
Collectively, the regulators say TikTok “must do more to keep underage children off its platform,” and do a better job of explaining data collection and consent to young users.
The OPC says that, in response to the findings and recommendations, TikTok has “agreed to enhance age assurance methods to keep underage users off TikTok. It has also agreed to strengthen privacy communications to ensure that users, and in particular younger users, understand how their data could be used.”
The findings have made headlines around the world. According to the BBC, TikTok has issued a statement saying that, “while we disagree with some of the findings, we remain committed to maintaining strong transparency and privacy practices.”
However, Canada’s investigation raises a noteworthy and under-discussed point in the social media debate: it’s harder to police users who are just there to look.
“The Offices determined that the tools implemented by TikTok to keep children off its platform were largely ineffective. This was particularly true in respect of the majority of users who are ‘lurkers’ or ‘passive users’, who view videos on the platform without posting video or text content.”
The discussion about age verification or estimation tools for social media tends to focus on account creation – i.e., a user must be at least 16 to create an account. But that misses a large swath of users who could still be exposed to content, even if they’re not otherwise active on the platform.
Documentation on the findings says that, “ultimately, the Offices found that TikTok was collecting and using the personal information of children with no legitimate need or bona fide interest, and that its practices were therefore inappropriate” under Canadian law.
It is now well understood that social media’s business model is built on collecting user data. That is fundamental to the architecture and purpose of the platforms as they exist. It has been nearly ten years since the Cambridge Analytica scandal, which saw the UK consulting firm collect the personal data of millions of Facebook users without consent. A 2018 piece in the Guardian explains that Facebook can access your webcam and microphone, and that it stores “every message you’ve ever sent or been sent, every file you’ve ever sent or been sent, all the contacts in your phone, and all the audio messages you’ve ever sent or been sent.” This past summer, X’s AI, Grok, declared itself “MechaHitler” and began replying to random posts with opinions about a would-be “white genocide” in South Africa.
We may soon collectively come to the question of whether social media as we know it is compatible with the values and laws that undergird our societies.
Philippe Dufresne says the ultimate goal of Canada’s investigation into TikTok is “to create a safer, more transparent online environment for children, where they feel empowered to exercise their privacy rights and where they can safely explore, learn, and grow without compromising their privacy or security.” That may not be possible without large-scale reform of social media practices and regulations – if it’s possible with social media in play at all.
To quote Sir Ed Davey on the UK situation, “Ofcom now have the powers under the Online Safety Act. I know that’s a new act and maybe we need to give them a little more time, but I personally think they need the encouragement and the support to take on powerful people and we shouldn’t just let powerful people get away with it.”
In the case of social media tycoons, they may not, for much longer.
Article Topics
age verification | biometric data | biometric identifiers | children | data privacy | social media | TikTok | X (twitter)