Bosses like to watch. Workers being biometrically surveilled want to walk
The same problems causing friction for users of AI-based proctoring software are plaguing employees who are subject to surveillance as they work remotely.
Among them are accusations (and refutations) of racial bias in facial recognition algorithms, poor software performance and resentment among those monitored.
There is an added dimension with workplace surveillance: People are reportedly telling supervisor-spies to pound sand and leaving their jobs rather than be watched continuously.
A global, cross-industry survey paid for by cloud-services vendor VMware finds that 70 percent of the 7,600 large firms that responded are using or preparing to use digital surveillance to make sure remote employees are working.
About 40 percent of organizations monitoring or planning to monitor workers are reporting “drastically increased” or “increased” attrition, according to the survey, conducted by independent research firm Vanson Bourne.
Twenty-nine percent of surveyed companies that are acting on mistrust are using video surveillance; 28 percent use attention-tracking webcams.
Other aspects of the digital office being watched are email, browsing, collaboration tools and keyboard activity.
Employees could be quitting on principle. Surveillance might just be a symptom of a company culture that they disagree with. Or the workers could be fed up with buggy, biased or time-wasting apps.
A Washington Post article focusing on contract attorneys who work remotely for the firms employing them sketches a new “underclass” of workers whose worth to companies is on par with Internet of Things devices.
In the story, a contract employee complains that aspects of her appearance that differ from those of her lighter-skinned coworkers, such as the Bantu knots in her hair, caused the face biometric algorithm to question whether she should be trusted.
Sometimes, she was forced to rescan her face from three angles 25 times or more in a day. The lawyer claimed her supervisors did not take the situation seriously, even though hers was a demanding, high-productivity job.
Not mentioned in either piece is the fact that these applications can record all of the audio and images surrounding an employee every second they are operating. Even people who are solidly part of the team likely would not want a supervisor to sit with them in their home, staring at them as they work.
Proctoring apps, which perform ID authentication, object recognition, behavior monitoring and, sometimes, so-called emotion recognition, have many of the same shortcomings.
They are more likely to misidentify people of color, particularly women of color, than their white male counterparts, and they reportedly throw time-consuming errors during timed exams.
The matter has drawn the attention of some U.S. senators, who a year ago asked the industry to account for itself.