Apple contractors listen to private conversations of users recorded by the voice assistant Siri

Even as voice assistants grow in popularity, many people remain concerned about how much private information reaches their developers. This week it emerged that contractors evaluating the accuracy of Apple's voice assistant Siri are listening to users' private conversations.


The report also stated that in some cases Siri records user speech after erroneous activations. The assistant's wake phrase is "Hey Siri," but an anonymous source said the recording could be triggered by similar-sounding words or even the sound of thunder. The source also said that on the Apple Watch, Siri can activate automatically when it hears speech.

"Countless recordings were collected of private conversations with doctors, business deals, and so on. These recordings were accompanied by user data revealing location and contact information," the unnamed source said.

Apple representatives said the company takes steps to prevent users from being linked to the recordings shared with contractors. According to Apple, the audio recordings are not associated with an Apple ID, and less than 1% of daily Siri activations are reviewed.

Apple, Google, and Amazon have similar policies for contract workers hired to review audio recordings, and all three companies have been caught violating user data privacy in similar ways. Technology companies have also previously been accused of letting their voice assistants record user conversations when they should not.

Source: 3dnews.ru
