Apple Apologizes for Using Contractors to Listen in on Private Siri Conversations

George Herman
IT Security Expert


The company’s apology comes a month after it promised to stop using human contractors to listen in on conversations between Siri and users.

In an official statement posted on its website, Apple reiterates its goal “to provide the best experience for our customers while vigilantly protecting their privacy.”

Apple also says it has carried out a thorough review of the Siri grading program, and that as a result it has decided to make several changes:

“As a result of our review, we realize we haven’t been fully living up to our high ideals, and for that we apologize. As we previously announced, we halted the Siri grading program. We plan to resume later this fall when software updates are released to our users — but only after making the following changes:

  • First, by default, we will no longer retain audio recordings of Siri interactions. We will continue to use computer-generated transcripts to help Siri improve.
  • Second, users will be able to opt in to help Siri improve by learning from the audio samples of their requests. We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place. Those who choose to participate will be able to opt out at any time.
  • Third, when customers opt in, only Apple employees will be allowed to listen to audio samples of the Siri interactions. Our team will work to delete any recording which is determined to be an inadvertent trigger of Siri.”

U.K.-based news outlet The Guardian has revealed that Siri can easily be activated by accident, which has led to private conversations being recorded, including drug deals, sexual encounters, and people talking to their doctors.

Apple is not the only big tech company confirmed to be using external contractors. According to a report by Bloomberg, Facebook had contractors listening to audio from users’ private conversations, collected through the audio-to-text feature in its Messenger app. In that case, the affected users had no idea that the transcription work was being done by humans, and the practice is not made explicit in Facebook’s policies.

What makes Apple’s case worse is this report by the Irish Examiner:

“Contractors in Cork [Ireland] were expected to each listen to more than 1,000 recordings from Siri every shift, before Apple suspended the practice last month,” explains the Examiner, who got its information from “an employee who had their contract abruptly terminated this week.”

As a consequence, Apple has announced that it has terminated the contracts of those employed to listen to Siri recordings. The staff in question had reportedly been on paid leave since August 2, when the company paused the practice due to rising concerns over user privacy.

So, in summary: Apple addresses the issue, apologizes, and says it will stop listening to Siri recordings… unless you give it permission to do so. Interesting.
