Earlier this month, we found out that Google employs workers to listen in on your conversations with the Google Assistant, similar to what Amazon does with Alexa users. Now Apple has reportedly been caught listening to recordings of Siri conversations, after a whistleblower revealed that contractors hear sensitive, private and confidential information even when Siri is triggered accidentally.
Human contractors are employed by Apple to review Siri recordings that contain confidential user data, reports The Guardian. An anonymous individual told the publication that interactions with Siri are sent to Apple contractors who listen to the recordings and grade them on various factors.
These gradings are meant to improve the experience Siri offers by confirming whether the responses Siri provided were helpful to the user and how often Siri was triggered unintentionally. The news comes weeks after Apple poked fun at Google over user privacy, even though the company itself never explicitly disclosed that humans were listening in on your conversations too.
The whistleblower also mentioned that the recordings sometimes include confidential information, addresses, business arrangements and even recordings of drug deals. The contractor claims Siri sometimes activates at the sound of a zipper or when an Apple Watch is raised, which explains the accidental nature of these recordings.
In response, Apple confirmed to The Guardian that it does analyse a small number of Siri requests for the "purpose of improving Siri". The company also claimed that it stores and reviews only a small, random subset (less than 1 percent) of daily Siri interactions, and that these recordings typically last just a few seconds.
In Apple’s words: “A small portion of Siri requests are analyzed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements”.
It is worth noting that while Google and Amazon allow users to opt out of human review of their recordings, Apple does not yet offer a similar opt-out for Siri.