Chinese court rules in favor of biometric data privacy complaint as regulations tighten
A court in China has ruled that the mandatory collection of face biometrics from members of a wildlife park in Hangzhou is illegal, Sixth Tone reports, amid a rapidly changing data privacy landscape in the country.
The court also ruled that the Hangzhou Safari Park must pay the plaintiff 1,038 yuan (roughly US$160), covering a partial refund of his membership fee and compensation for travel expenses.
The plaintiff, a law professor at Zhejiang Sci-Tech University, is reported to be pleased with the decision up to a point; because not all of his requests were granted, he plans to appeal. His legal representative told Sixth Tone that the plaintiff had hoped the court would deliver an opinion guiding the use of, or limitations on, facial recognition technology in general.
The court judged that the illegal policy did not constitute fraud because it did not result in any harm.
“We hope this case will push our whole society to come up with a more refined definition of the boundaries of collecting information as sensitive as fingerprints and facial features,” the plaintiff’s lawyer, Ma Ce, told Sixth Tone.
The public feedback period for China’s proposed personal data protection law ends this week.
Data protection law details emerging
A new English translation of the proposed law, which consists of eight chapters and 70 articles, has been posted by think tank New America.
Article 29 defines “sensitive personal information” as anything that could cause discrimination or harm to the individual or their property, “including information on race, ethnicity, religious beliefs, individual biometric features, medical health, financial accounts, individual location tracking, etc.,” according to the New America translation.
The other four articles in the sensitive personal information section within the chapter on handling personal information draw a distinction between consent-based handling of sensitive personal data and legally mandated uses. Consent must be specific, and written consent may be required in some cases. Individuals also have the right to be notified by those handling their sensitive personal data.
In an analysis posted to LinkedIn, Future of Privacy Forum Senior Counsel for Global Privacy Gabriela Zanfir-Fortuna notes that the definition of personal data is broad and, like the GDPR's, includes "identifiable" data. The law also applies to data collection, and therefore imposes requirements prior to biometrics enrollment. Zanfir-Fortuna notes that the law applies to "controllers, joint controllers and processors" of data, applies to the public sector, and applies to entities based outside of China that provide goods and services to people in the country.
Entities operating automated decision-making and other "major" systems must perform risk assessments similar to data protection impact assessments (DPIAs), and the law has a complex enforcement regime, which includes fines of up to 5 percent of a company's turnover, administrative actions, and a form of class action. The law does not, however, establish an independent body dedicated to its enforcement.
Hong Kong Commissioner publishes biometrics proportionality criteria
Hong Kong’s Office of the Privacy Commissioner for Personal Data has issued a guidance note on the collection and use of biometric information, offering criteria to help organizations determine if the use of biometrics is proportionate.
The criteria are based on a four-point test set down in a court ruling in the 2016 Hysan Development Co Ltd v Town Planning Board lawsuit.
Organizations should ask whether the use of biometrics pursues a legitimate aim, is rationally connected to achieving that aim, is limited to what is necessary to achieve it, and strikes a reasonable balance between societal benefits and the protection of individuals' rights. The Commissioner states that covert collection of biometrics is highly intrusive, and that privacy impact assessments should be conducted and individuals notified if their biometrics are being used in an automated decision-making system.
Regular independent audits and evaluations should also be conducted, with questions of necessity and proportionality revisited each time.
The guidance also contains sections on avoiding function creep, transparency, and staff training.