An investigation revealed on 29 July that Amazon staff, while monitoring thousands of Alexa recordings, had been eavesdropping on private conversations, including users having sex.
Amazon is finally providing its Alexa users with an option to block recordings of them in their private moments, including when having sex.
Amazon is reportedly updating its settings to allow users to opt out.
“For Alexa, we already offer customers the ability to opt out of having their voice recordings used to help develop new Alexa features. The voice recordings from customers who use this opt-out are also excluded from our supervised learning workflows that involve manual review of an extremely small sample of Alexa requests. We’ll also be updating information we provide to customers to make our practices more clear,” Amazon said in a statement on 2 August.
The news comes in the wake of a scandal that erupted last week when The Sun revealed in its own investigation that staff had been eavesdropping when smart speakers picked up private conversations and the more intimate activities of their users.
In order to monitor and improve the Alexa system, the Sun reported, Amazon staff listened to recordings made of British users’ private lives gathered by the Alexa-enabled virtual assistant speakers.
The tapes included recordings of “family rows and couples having sex”.
The publication cited an unnamed former analyst at the English-speaking Amazon team in Bucharest, Romania, as claiming that even though the staff were told to concentrate on Alexa commands, it was “impossible not to hear other things going on”.
Amazon says for Alexa to respond better to commands, a regular analysis of recordings is required to “improve the customer experience”, adding that employees listening to the tapes are bound by strict confidentiality rules.
"This information helps us train our speech recognition and natural language understanding systems, so Alexa can better understand your requests, and ensure the service works well for everyone. We have strict technical and operational safeguards in place to protect customer privacy, and have a zero tolerance policy for the abuse of our system, “ said the company.
Apple and Google have already suspended the practice of humans reviewing recordings captured by smart speakers and virtual assistants.
Apple’s Siri AI assistant sends audio of sexual encounters, embarrassing medical information, drug deals, and other private moments recorded without users’ knowledge to human ‘graders’ for evaluation, a whistleblower has revealed.
Google’s smart speakers are recording users when they least expect it, according to temp worker language experts hired by the company to listen to the snippets – which include some of users’ most private moments. Google is able to claim it does not listen to the recordings Google Home devices are constantly generating only because it contracts the job out to temp workers. These “language experts,” as they are called, use a collaborative system built by the company to share and analyze sound snippets, assisting Google’s AI assistant in deciphering the nuances of human speech.
Amazon last week confirmed that it keeps transcripts of interactions with Alexa, even after users have deleted the voice recordings.
Today in “Amazon’s reasons to listen to everything that’s going on inside your household” news, it’s being reported that products like the Amazon Echo could one day be used to detect signs of cardiac arrest, according to Bloomberg.
More creepy “Surveillance Capitalism” courtesy of Amazon, which isn’t hiding that it is putting millions of smart speakers in homes and hotels by “offering discounted hardware, customized software and new ways for property managers to harvest and use data.” Are there actually people willing to pay for this type of arrangement?
When Alexa runs your home, Amazon tracks you in more ways than you might want.