China signals serious intent to regulate facial recognition and other biometrics
A Chinese government spokesperson is emphasizing the need for sensitive data like biometrics to be collected and used only for specific purposes, and only when “sufficiently necessary” and supported by a risk assessment, the South China Morning Post writes as the details of China’s new data privacy law continue to emerge.
The protections bestowed by the designation of biometrics as sensitive personal information in Article 29 of the law may even be increased, according to Legislative Affairs Commission spokesperson Yue Zhongming.
“The use and development of facial recognition and other new technologies has created new challenges for the protection of personal information,” Yue said, according to the SCMP. “The Legislative Affairs Commission will listen further to a wide range of opinions on this issue, and conduct in-depth research and assessment.”
Under the law, companies found violating user data privacy could be fined up to 50 million yuan (US$7.6 million), or 5 percent of annual revenue.
The SCMP also mentions some concerns raised by observers about the law’s vague compliance details.
In contrast with the commonly held perception of lax data regulation, the Personal Information Protection Law (PIPL) is just the latest of hundreds of laws and rules passed over the past decade to boost data security and protection, McGill professor and Wilson China Fellow Xiao Liu writes for The Diplomat.
As China’s government has encouraged the development of technologies to support the digital economy, it and other stakeholders have stepped up to protect that data, according to the article.
“Personal information protection has already become a hyperactive field in China, which is continuously energized not only by national legislation and policymaking, but also by the participation of legal professionals, conscious actions taken by common citizens, as well as immense media attention and active public discourses,” Liu posits.
Law school students are also being encouraged to gain real-world experience, which many choose to do by litigating against big companies over their data-handling practices. "No actual harm" lawsuits, like one recently won by a law professor, and another in which a law professor sued a homeowners' association for installing facial recognition at its gates, show the symbolic significance of the suits.