Tech giant Apple has been paying contractors to listen to users' private Siri recordings, with former contractors admitting they have heard personal data including private conversations, doctor's appointments, and more.
Contractors are paid to listen to Siri recordings and grade them: whether the response was relevant and useful, and whether the activation was a false trigger.
In a statement to The Guardian, the company acknowledged that “A small portion of Siri requests are analyzed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.” Apple also noted that less than 1 percent of daily activations are analyzed under this system.
It is known that other voice assistants like Google Assistant and Amazon's Alexa store, monitor, and listen to users' voice data to improve their systems, but until now there was no conclusive evidence of Siri doing the same.
According to The Guardian’s source, that proliferation has led to some very personal conversations making their way to complete strangers working for Apple: “There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing the location, contact details, and app data.”
Additionally, as The Guardian notes, while Amazon and Google allow customers to opt-out of some uses of their recordings, Apple doesn’t offer a similar privacy-protecting option, outside of disabling Siri entirely.
With internet privacy becoming ever more significant following the Facebook and Cambridge Analytica scandal and several other data leaks, it is imperative to guard one's personal data. Do you think it's acceptable for Apple to be sharing your personal data with contractors? Let us know in the comments section below.