No watchlist needed to secure public spaces: Corsight execs

Many of the threats to public safety from terrorists involve people who have not previously been identified as violent extremists by law enforcement. In these events, which include the murder of two staff members of Israel's embassy in Washington, D.C. and the Christmas market attack last year in Germany, facial recognition-powered analytics applied to security camera footage could potentially have saved lives, even without referring to a blacklist of suspects, according to Corsight CEO Shai Toren.
Toren and Corsight President and CSO Rob Watts discussed how vulnerable events and locations can be protected without violating the privacy of the people who visit them in an interview with Biometric Update.
One of the common concerns with the use of facial recognition in public spaces is who is being watched for: who is considered a suspect? In some cases, as in the UK, the reference database is composed of previous suspects, including individuals who are wanted for arrest but remain at large. However, this approach can group people who have not been found guilty of a crime with others who have, which may violate the rights of those wrongly accused.
But interventions can also be triggered without using a biometric watchlist, avoiding this risk, by analyzing the behavior of visitors.
Signs of trouble ahead
The attacker behind Germany's Christmas market attack was found after the fact to have visited the venue for five hours the day before. This unusual pattern of behavior could have been detected by Corsight's technology and prompted law enforcement to investigate or intervene.
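As a rough illustration of how such a pattern could be flagged without a watchlist, the minimal Python sketch below measures how long each anonymous face track has been present at a venue and flags those exceeding a threshold. The track identifiers, data shape and three-hour threshold are assumptions for the sake of the example, not Corsight's actual implementation.

```python
from collections import defaultdict
from datetime import timedelta

DWELL_ALERT = timedelta(hours=3)  # assumed threshold for "unusual" loitering

def dwell_times(sightings):
    """sightings: iterable of (track_id, timestamp) pairs, where track_id is an
    anonymous identifier derived from a face signature, not a name."""
    seen = defaultdict(list)
    for track_id, ts in sightings:
        seen[track_id].append(ts)
    # dwell = time between first and last sighting of the same track
    return {tid: max(times) - min(times) for tid, times in seen.items()}

def flag_loiterers(sightings):
    """Return the anonymous track ids whose dwell time exceeds the alert threshold."""
    return [tid for tid, dwell in dwell_times(sightings).items() if dwell >= DWELL_ALERT]
```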
The signals that can indicate suspicious behavior vary in terms of how long they take, Toren notes, emphasizing the importance of tuning the automated analysis process to meet the particular needs of the organization relying on it to preserve public safety.
“For that we do a lot of work,” he explains, “starting with the development of the system through the actual training and deployment with the forces including that they trust the system that they work with based on all the inputs that they are giving us and based on our experience.”
Watts underlines the need to balance the capabilities of the technology with robust and appropriate privacy measures, transparency and extensive consultation between the technology provider and security team.
Among those measures, Watts explains, all faces are redacted until they are identified as being of interest. So, the facial image of an individual who is spotted by the camera, and whose digital face signature is recorded, is not stored or revealed until it has been associated with suspicious behavior.
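The sketch below illustrates one way such a "redact until of interest" record could be modeled, assuming a simple structure that pairs a face signature with a withheld image; the field names and flagging logic are hypothetical and not drawn from Corsight's product, and a real system would also encrypt images at rest and enforce retention limits.

```python
from dataclasses import dataclass, field

@dataclass
class FaceRecord:
    signature: list                                 # digital face signature (embedding) used for analytics
    _image: bytes = field(default=b"", repr=False)  # raw facial crop, never exposed directly
    of_interest: bool = False
    reason: str = ""

    def flag(self, reason: str) -> None:
        """Associate the record with suspicious behavior, e.g. excessive dwell time."""
        self.of_interest = True
        self.reason = reason

    def image(self):
        """Return the facial image only once the record has been flagged."""
        if not self.of_interest:
            return None  # redacted: operators never see a face that is not of interest
        return self._image
```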
Law enforcement typically interacts with about three percent of the population, Watts says. That means the software must be able to deliver assessments that security operators can rely on without the benefit of knowing any subject’s history.
“What is most important for us is to be able to provide alerts or a warning of something that our users can trust. Therefore, we are looking for the patterns that we can in a very reliable way tell our users, ‘you need to pay attention to this pattern,’” Toren explains. Other signals may, in theory, help explain what a person is doing, but if they do not provide the requisite reliability in practice, Corsight does not use them.
Determining which signals provide that reliability takes “a lot of research,” Toren says, referring to the company’s team of researchers with PhDs that study “which of the capabilities are the ones that could be incorporated in a trustworthy way into the technology and which are not.”
The signals Corsight’s software considers include dwell time and linked association, but they are complex, and Watts and Toren reiterate the importance of engaging with customers during deployment. That research, plus a deep understanding of what the customer is trying to achieve, is needed to specify the best configuration and operational approach, according to Toren.
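A minimal sketch of how linked association might be scored from anonymous sightings follows; the co-occurrence window, threshold and data shape are illustrative assumptions only, not a description of Corsight's method.

```python
from collections import Counter
from datetime import timedelta

WINDOW = timedelta(seconds=30)  # assumed: seen within 30 seconds at the same camera
MIN_CO_OCCURRENCES = 3          # assumed: three separate co-sightings to count as "linked"

def linked_tracks(sightings, flagged_id):
    """sightings: list of (track_id, camera_id, timestamp) tuples.
    Returns anonymous tracks that repeatedly appear alongside a flagged track."""
    flagged = [(cam, ts) for tid, cam, ts in sightings if tid == flagged_id]
    counts = Counter()
    for tid, cam, ts in sightings:
        if tid == flagged_id:
            continue
        if any(cam == fcam and abs(ts - fts) <= WINDOW for fcam, fts in flagged):
            counts[tid] += 1
    return [tid for tid, n in counts.items() if n >= MIN_CO_OCCURRENCES]
```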
Improving technology meets a developing ecosystem
As of next year, technology providers utilizing AI in Europe will have to be certified to the ISO/IEC 42001 standard for ethical and transparent governance, and Toren says those aspects are part of Corsight’s consultation with customers. Corsight was certified for compliance with ISO 42001 earlier this year.
The configuration of the cameras themselves matters less than it used to, Watts says. Better lighting raises the likelihood of successful analysis and matching, but relatively sharp angles pose little challenge as long as the software can capture “50 pixels between the ears.”
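A rule of thumb like this could be applied as a simple quality gate before attempting a match, as in the hypothetical check below; the bounding-box interface is an assumption, since any face detector that reports face size in pixels would do.

```python
MIN_FACE_WIDTH_PX = 50  # "50 pixels between the ears" rule of thumb

def worth_matching(face_bbox):
    """face_bbox: (x, y, width, height) of a detected face, in pixel units.
    Skip matching on faces too small for a reliable comparison."""
    _, _, width, _ = face_bbox
    return width >= MIN_FACE_WIDTH_PX
```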
Police in the UK have deployed facial recognition to capture images of people as they go about their usual business, and Toren points out that transparency reports from those deployments show the technology does so effectively. A growing number of sources of this kind of information are available, he says.
At the same time, vendors can demonstrate their trustworthiness and comfort complying with regulations through commitments like the EU’s AI Pact, which Corsight is a signatory to.
The legal landscape continues to evolve as well, with Martyn’s Law forcing venues in the UK to consider how public protection can be “enhanced.”
Satisfying this mix of legal requirements, ethical commitments and international standards with facial recognition may not even have been possible just a few years ago, but Toren says the evidence of the technology fulfilling its promise in public deployments is rapidly mounting.
“With the right technology there is a lot more that you can do in order to protect people not only from the threats that you know, but also unknown threats.”
Article Topics
behavioral analysis | biometric identifiers | biometrics | Corsight | counter-terrorism | facial recognition | law enforcement | responsible biometrics