Apple’s Contractors Are Listening To Your Siri Conversations
Over the years, Apple has been quite vocal about its commitment to privacy and to keeping its customers' data secure. The company also recently released a series of ads and billboards in the United States, Canada, and Europe touting its dedication to privacy. However, a new report from The Guardian has left some people wondering whether what happens on their iPhones actually stays on their iPhones, as the company claims.
According to the report, Apple’s contractors are being paid to listen in on recorded Siri conversations, including accidental recordings of personal moments in the lives of the company’s users. The report shares claims from a source at one of the hired firms, who explained that interactions between Siri and Apple’s users are recorded and sent to workers, who are tasked with listening to the recordings and grading them on a variety of factors. These include whether Siri was activated intentionally or the trigger was a false positive, and whether Siri was able to provide a helpful answer to the user.
Apple claims the data collected from its users “is used to help Siri and dictation…understand you better and recognize what you say.”
The company went on to provide additional clarification about just how much data is being shared with hired contractors.
“A small portion of Siri requests are analyzed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.”
Apple also noted that the data used for grading is chosen at random from 1 percent of daily Siri interactions and that the recordings are typically just a few seconds long, but the information provided by the anonymous whistleblower paints a very different picture.
“There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data,” the source detailed.
The source went on to voice concerns about contractors who might misuse the information Apple has given them access to.
“If there were someone with nefarious intentions, it wouldn’t be hard to identify [people on the recordings].”
Earlier this month, Google also faced backlash for similar practices, according to a report from 9to5Google. Roughly 0.2 percent of Google Assistant interactions are reportedly transcribed by hired humans to improve the overall performance of the voice assistant.
Amazon also uses humans to review Alexa interactions.