School security summit pushes object and facial recognition tech, leans on humans
On the same day that the U.S. Supreme Court made it easier to carry a concealed firearm, a panel of four insiders from education, security, law and biometrics discussed the grim business of limiting injuries when shooters walk school halls.
The discussion, billed as a school safety emergency summit and hosted by Carnegie Mellon University, which invited AI vision vendor Oosto to participate, was subdued, with moments of stridency, particularly from participants who have worked in schools.
The message was not encouraging, but the panel offered suggestions for actions that it feels could lower casualty counts.
The market for school security systems, especially systems using facial recognition and other biometrics, is virtually unregulated. There are few industry standards of any kind, making it difficult for administrators to compare vendors’ hardware and software.
It also is preyed upon by vendors hawking unproven, standalone products. For the most part, school leaders put more responsibility on technology than on people for spotting trouble before tragedy. All the while, some school districts are arming teachers while cutting the time required for training.
And efforts by human rights activists and, sometimes, community members can limit systems to object recognition rather than facial recognition software. One speaker dismissed this perspective as politicking.
No panelist objected to any of the points made during the summit. (Only one, Yale Law School Professor Shlomit Yanisky-Ravid, mentioned firearm regulation.)
The summit was moderated by The Wall Street Journal’s U.S. education reporter, who opened by recounting some of the school mass shootings he has covered on the scene.
Michael Matranga, CEO of security consulting firm M6 Global and a former Secret Service agent, commented multiple times on how often he sees people “trying to sell unproven products that have no methodology, no experience.”
That leaves districts trying to make critical decisions without even comparable products to weigh against one another, Matranga said.
At least run a pilot test to make sure a system works, and address the problems identified in a risk assessment, advised Guy Grace, an Army veteran with 34 years in school safety who had been security director at Littleton (Colo.) Public Schools.
“You don’t send a researcher or an educator to do a warrior’s job,” he said. The technology is good enough to help save lives. But because “this industry is so heavily unregulated,” buyers typically are on their own, navigating unfamiliar waters with an anxious electorate demanding action.
According to a survey conducted by Oosto, 11 percent of respondents feel their child’s school is unsafe, and 60 percent would support new security steps including real-time video surveillance.
Privacy concerns are no small part of the standards argument, said Yanisky-Ravid.
In a refrain familiar to most people in the biometrics community, she said that vendors have to live by transparency, trustworthiness and explainability. It is not uncommon for a student planning to kill to post telling remarks online, yet in the few cases where a school district has seen such pain and anger before a student acted, privacy concerns have made intervention difficult.
In the absence of thoughtful regulation, all agreed, there must be industry standards.
“It’s upsetting to me,” said Bruce Montgomery, Honeywell Integrated Security’s schools security technology expert. Montgomery also spent 25 years in law enforcement and has taught shooter response to educators for 14 years.
“It’s glaring that I’m still seeing recognition systems that are very inadequate,” he said.
Even if nothing in the industry changes, schools have to keep people in the security process. That means training for shootings, figuring out ways to tell school occupants where to run, and the like.
Keeping humans in the loop could even help with the privacy conundrum: a person sees something; the school uses technology to substantiate (or refute) what was reported; and then someone follows up with the people involved.
Assuming, as Yanisky-Ravid suggested, that a facial recognition system deletes collected data every 30 days, privacy issues are minimized.
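For readers curious what such a retention rule might look like in practice, here is a minimal, hypothetical sketch in Python of a 30-day purge over stored face-capture records. The names (FaceRecord, purge_expired) are illustrative only and do not come from any vendor’s product or from the panel.

```python
# Hypothetical sketch: enforce the 30-day retention window Yanisky-Ravid
# described by dropping face-capture records older than the cutoff.
# FaceRecord and purge_expired are illustrative names, not a real vendor API.
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

RETENTION_WINDOW = timedelta(days=30)

@dataclass
class FaceRecord:
    record_id: str
    captured_at: datetime  # UTC timestamp of when the camera collected the data

def purge_expired(records: list[FaceRecord],
                  now: datetime | None = None) -> list[FaceRecord]:
    """Keep only records still inside the retention window."""
    now = now or datetime.now(timezone.utc)
    return [r for r in records if now - r.captured_at <= RETENTION_WINDOW]

# Example: a record captured 45 days ago is purged; one from yesterday is kept.
old = FaceRecord("a1", datetime.now(timezone.utc) - timedelta(days=45))
new = FaceRecord("b2", datetime.now(timezone.utc) - timedelta(days=1))
assert purge_expired([old, new]) == [new]
```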
This post was updated at 3:43pm Eastern on June 24, 2022 to clarify that Oosto conducted the school safety survey, and was invited to participate in the event by host Carnegie Mellon University.