Apple contractors working on Siri hear recordings of private information on a regular basis, a new report says

None of the major voice assistants — including Apple’s Siri, Amazon’s Alexa, and Google’s Assistant — are as private as you might think.

In order to train and improve virtual assistants like Alexa, the companies behind those services tend to have employees or contractors manually review clips of recorded conversations for quality control.

A new report from The Guardian’s Alex Hern on Friday sheds more light on how Siri actually works. Hern spoke with an anonymous contractor who performs quality control on Siri and said they were concerned about how often Siri tends to pick up “extremely sensitive personal information.”

According to The Guardian’s source, contractors working on Siri “regularly” hear recordings of people having sex, business deals, doctors and patients having private medical discussions, and even people conducting drug deals with each other.

Whether or not Siri’s activation was intentional — and often it’s not, as the anonymous contractor said Siri can mistake the sound of a zip for a trigger — these Apple contractors are responsible for grading Siri’s responses. They note whether the activation was accidental, among other factors, like whether Siri’s answer was appropriate or helpful, or whether the request is something Siri should be expected to handle.

We reached out to Apple about the issue, but we’ve yet to hear back. The company told The Guardian that “less than 1%” of daily Siri activations are looked at for review, that no Siri requests are associated with Apple IDs, and that reviewers are “under the obligation to adhere to Apple’s strict confidentiality requirements.” But the whistleblower told The Guardian that Siri recordings “are accompanied by user data showing location, contact details, and app data,” which Apple might use to check whether a request was acted on.

The report raises significant privacy concerns about the process. According to the anonymous contractor who came forward, Siri’s quality-control workers could misuse this private information because there are “no specific procedures to deal with sensitive recordings.”

“There’s not much vetting of who works there, and the amount of data that we’re free to look through seems quite broad,” the contractor told The Guardian. “It wouldn’t be difficult to identify the person that you’re listening to, especially with accidental triggers – addresses, names, and so on … It’s not like people are being encouraged to have consideration for people’s privacy.”

According to Apple, the company saves Siri voice recordings for six months at a time; after that, Apple saves a copy of the data without any kind of identifier for up to two years, for the sake of improving Siri’s performance and accuracy. Some companies collect more identifying information — Alexa, for instance — but unlike Amazon and Google, Apple does not let users opt out of having their recordings reviewed, short of turning off Siri entirely.

You can read the full report over at The Guardian.