Onfido’s use of biometric data for bias research meets legal requirements — ICO report
The UK Information Commissioner’s Office (ICO) has published its report on the regulatory sandbox exercise carried out with Onfido, concluding, among other findings, that the company’s use of personal biometric data for research respects the rights and freedoms of individuals and is carried out fairly and transparently.
Broadly speaking, the report summarizes the ICO’s conclusions on the key data protection issues involved in ensuring that Onfido’s facial recognition technology is fair and inclusive for all data subjects.
As such, it asserts that, insofar as Onfido retains full control of the personal data it uses to advance its technology, biometric data is not special category personal data when the processing activity does not require individuals to be uniquely identified. This, the report states, means that the use of biometric data to train the company’s AI is not subject to the heightened restrictions the EU’s General Data Protection Regulation (GDPR) places on special category data.
Explaining why it joined the ICO’s regulatory sandbox, Onfido said it wanted to find out how it could lawfully measure racial or ethnic bias in order to train its system to reduce such bias.
The ICO also states in its report that where data revealing a person’s racial or ethnic origin is used for research, the most appropriate condition for processing special category data under the GDPR is likely to be ‘substantial public interest,’ with Onfido identifying the prevention of discrimination as the substantial public interest served by its research.
Onfido said that throughout its work with the ICO it successfully demonstrated some of the measures it takes to ensure fairness and transparency for individuals whose data is used in research-related processing, adding that it received useful advice from the ICO on building on those practices.
Onfido has pledged to continue working with the ICO to improve standards that reduce risk to individuals in the way AI and machine learning are developed and used by businesses and organisations.
It said the regulatory sandbox exercise gave its staff a better grasp of how data protection compliance and privacy law together can help avoid negative consequences for individuals.
The report ends by expressing the hope that the ICO’s engagement with Onfido will enable the regulator to further develop its thinking on data protection issues arising from complex AI supply chains. This, the report states, will help the actors in such chains ensure compliance with UK data protection regulation, with the ultimate goal of delivering privacy benefits for data subjects.
Reducing demographic differences in the accuracy of its algorithms could help Onfido scale its business, with CEO Husayn Kassai stepping down this week to transition the company to its next growth phase.
Meanwhile, Onfido is sponsoring a webinar as part of the Economist Events series on ‘A whole new (contactless) world: The rise of digital identity,’ which will be held on December 8 and 9.
Article Topics
algorithms | biometric data | biometric-bias | biometrics | biometrics research | data protection | facial recognition | Information Commissioner’s Office (ICO) | Onfido | training