Apple has a privacy problem on its hands.
Contractors hired by the Cupertino-based giant regularly hear confidential medical information, drug deals, and recordings of couples having sex as part of quality control for Apple's voice assistant Siri, a report in The Guardian reveals.
Although Apple does not disclose this in its consumer-facing privacy documentation, a small portion of Siri recordings is accessible to company contractors, the report says.
The contractors grade Siri's responses, including whether the activation was deliberate or accidental.
“A small portion of Siri requests are analysed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements,” a company response to The Guardian said.
Apple says that less than one percent of Siri activations are used for grading.
The whistleblower told The Guardian that there have been countless instances where recordings featured private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on.
Moreover, these recordings are accompanied by user data showing location, contact details and app data, according to the whistleblower.
In its privacy document, however, Apple has stated that Siri data “is not linked to other data that Apple may have from your use of other Apple services”.
Although Siri is included on most Apple devices, the contractor said the Apple Watch and the HomePod smart speaker were the most frequent source of mistaken recordings.
“Sometimes, you can definitely hear a doctor and patient, talking about the medical history of the patient. Or you’d hear someone, maybe with car engine background noise – you can’t say definitely, but it’s a drug deal… you can definitely hear it happening. And you’d hear, like, people engaging in sexual acts that are accidentally recorded on the pod or the watch,” the contractor was quoted as saying.
“The only function for reporting what you’re listening to seems to be for technical problems. There’s nothing about reporting the content,” the contractor told The Guardian.
The contractor also told the daily that they were motivated to go public due to fears of user information being misused, and said Apple should disclose the human oversight to its users.
Apple is not the only company to use humans to review inputs for its assistant. Amazon admitted in April to employing staff to listen to some Alexa recordings, and earlier this month, Google workers were found to be doing the same with Google Assistant.
(With inputs from The Guardian)
(At The Quint, we question everything. Play an active role in shaping our journalism by becoming a member today.)