Apple Will No Longer Record & Listen to Your Siri Conversations Unless You Opt-In



Today Apple announced that it will no longer listen to your Siri conversations on the Apple TV, iPhone, iPad, and other devices. This comes after reports of Apple employees listening to Siri recordings popped up in the news.

“We know that customers have been concerned by recent reports of people listening to audio Siri recordings as part of our Siri quality evaluation process — which we call grading. We heard their concerns, immediately suspended human grading of Siri requests and began a thorough review of our practices and policies. We’ve decided to make some changes to Siri as a result.” Apple said in a statement on its website.

Apple went on to say: “Siri has been engineered to protect user privacy from the beginning. We focus on doing as much on device as possible, minimizing the amount of data we collect with Siri. When we store Siri data on our servers, we don’t use it to build a marketing profile and we never sell it to anyone. We use Siri data only to improve Siri, and we are constantly developing technologies to make Siri even more private.”

Amazon has also announced that it now lets you opt out of people listening to your Alexa recordings, after news reports revealed that Amazon had a team of employees reviewing Alexa recordings to check whether Alexa properly answered your questions.

Here are the changes Apple is making today:

  • First, by default, we will no longer retain audio recordings of Siri interactions. We will continue to use computer-generated transcripts to help Siri improve.
  • Second, users will be able to opt in to help Siri improve by learning from the audio samples of their requests. We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place. Those who choose to participate will be able to opt out at any time.
  • Third, when customers opt in, only Apple employees will be allowed to listen to audio samples of the Siri interactions. Our team will work to delete any recording which is determined to be an inadvertent trigger of Siri.
