What is the private sector’s role in facial recognition use by Indian law enforcement?
Private corporations’ involvement in the design and procurement of facial recognition technology for Indian law enforcement poses privacy risks for biometric data, creates opaque processes, allows private incentives to shape policy, and effectively delegates surveillance functions away from the state. These are the findings of the latest paper on facial recognition technologies from the Centre for Applied Law & Technology Research (ALTR), the interdisciplinary policy initiative of the Vidhi think tank.
But the authors are concerned that these issues are failing to gain the attention of civil society more broadly, even as biometric and surveillance technologies become more sophisticated and widespread and move into private-use areas such as real estate security.
In their summary of the working paper (available to download from the summary page), authors Ameen Jauhar and Jai Vipra note the “already common involvement” of the private sector in providing facial recognition for law enforcement, and ask how these private firms are accessing the vast datasets required to develop facial recognition algorithms.
“There are also concerns around whether the private sector should have unbridled access to vital biometric data (including facial scans) of individuals to design such technology purportedly for state security purposes, in complete opacity,” they write.
Are private firms engaging in surveillance services beyond selling technologies to law enforcement agencies? The authors are concerned that the state is delegating functions which should be exceptional even for the state itself to undertake. Back-end assistance could also give private companies far greater insight into the material gathered.
The investigation into this area, including freedom of information requests, did not yield results due to the overall opacity around the deals. The authors consider the lack of transparency itself to be one of the most concerning issues: “the conspicuous lack of any information in the public domain is not only disconcerting, but a critical red flag in the whole process. It is unclear how certain private sector companies have repeatedly (as per reportage) been offered these procurement contracts, under what terms, and what kinds of checks and balances are governing this entire cycle.”
This could mean that policy is no longer based on public priorities for technology in law enforcement, but on private sector incentives such as profit. “Venture capital funded FRT companies can discount the price of this technology and push its use in law enforcement contexts, without any public input or scrutiny,” note the authors.
Meanwhile, the state can use exceptions of national security and law and order to justify surveillance.
Ultimately, kept at arm’s length, the public would lose control over state surveillance activities: “private sector involvement can reduce public control over outcomes important to the public as a whole – in this case, opaque and unbridled use of privately provided FRT in law enforcement reduces public control over coercive state activities.”
The working paper concludes with a recommendation that “transparency in FRT agreements, algorithmic regulation, clarity on data protection responsibilities, stricter legal restrictions on surveillance and public involvement in decision making over FRT be implemented to minimise the harms caused by the private provision of FRT for law enforcement.”
There is some scrutiny already taking place. In the same week as the paper’s publication, an activist in Telangana State filed a petition with the state’s High Court to review the use of facial recognition by law enforcement. The activist had been asked to remove his COVID mask so that he could be photographed for identification.
Coverage of the paper is spreading, and Medianama carries a useful analysis examining where legal liability for facial recognition technology should lie: with the state or with the private sector?