TikTok hit with biometric data privacy suit under GDPR
TikTok and its parent company are facing a lawsuit from a former UK regulator over allegedly intentional violations of children’s data protection laws in both the UK and EU through the app’s collection of biometrics and other data, according to a website launched in support of the filing.
Former Children’s Commissioner for England Anne Longfield OBE launched the lawsuit, claiming that TikTok and parent company ByteDance not only illegally collected data from children who used the app since Europe’s General Data Protection Regulation (GDPR) went into effect in May 2018, but did so knowingly. TikTok does not provide sufficient notice or transparency, and does not collect the necessary consent, Longfield alleges. Further, the company is deliberately opaque about who has access to the data it collects.
The biometric data at issue is facial recognition data, which, along with other personal data, was allegedly collected for the benefit of unknown third parties, potentially including advertisers, according to the suit.
A decision may be some time coming, as the case is currently stayed pending the outcome of Lloyd v Google, which is playing out in UK court and is expected to have implications for the case against TikTok.
The suit alleges that TikTok collected the biometrics and other data, which it uses to generate advertising revenue, in violation of GDPR articles 5, 12, 14, 17, 25, 35 and 44.
The social media app is currently fighting to have a class-action settlement in the U.S. held to a mere $92 million. In that case, the expected low settlement claim rate has prompted consideration of how notification should be delivered, legal services provider Epiq writes for JDSupra. An email notification, as planned, is estimated to result in only a 2 percent claims rate, while using push notifications or a direct message within the app would likely reach far more people.
ExamSoft seeks suit removal to federal court
ExamSoft has filed to have a biometric data privacy lawsuit against it removed to U.S. federal court, on the grounds that it meets the criteria set out in the Class Action Fairness Act (CAFA).
CAFA dictates that class action suits can be removed to federal court if the amount sought exceeds $5 million, the class includes more than 100 members, and some members are from out of state. In its filing, ExamSoft says the amount in controversy, based on a minimum of 2,000 putative class members, is more than double that threshold.
The suit was originally filed in March in Cook County Circuit Court, as was a separate suit against biometric proctoring provider Respondus.
Despite these concerns, and mounting evidence that implementing online proctoring is more work than many realize, institutions do not often “un-procure” technology, as a technical support officer told The Conversation about biometric proctoring services in Australia.