Apple contractors hear private moments while investigating accidental Siri activations


Accidental activations of Siri on Apple devices have resulted in contract workers who review recordings being sent audio of many personal conversations and other interactions that were never meant to be shared, according to Ars Technica.

A contract worker told The Guardian that words sounding similar to the wake phrase “Hey Siri,” or even the sound of a zipper, can activate the voice assistant, and that Siri activates automatically whenever speech is detected on a raised Apple Watch.

“There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on,” the source told The Guardian. “These recordings are accompanied by user data showing location, contact details, and app data.”

Less than one percent of activations are reviewed, the audio is not linked to the individual’s Apple ID, and the company imposes confidentiality requirements on contract workers. But given that these workers’ task includes investigating accidental activations, it seems inevitable that they would hear things meant to be private.

Both Google and Amazon have been put in the spotlight by similar revelations. A Google contractor recently leaked voice data showing that reviewers heard private conversations about sensitive subjects, and some people were shocked to learn that Amazon Ring employees and others review recorded material for annotation and ongoing system improvement. Amazon was also recently hit with a biometric data privacy lawsuit in Illinois over Alexa recordings.
