Amazon Alexa user receives 1,700 audio recordings of a stranger through 'human error'
When a person using Amazon.com's voice assistant in Germany requested to listen to his archive of recordings, he got much more than he was expecting.
Along with receiving his own audio history captured by a home microphone, the user also gained access to 1,700 audio files from a person he did not know.
Amazon sent the man a link that contained a stranger's recordings, allowing him to listen to another man speaking inside his home with a female companion, Reuters reported Tuesday, citing the German trade magazine c't.
"This was an unfortunate case of human error and an isolated incident," Amazon told The Washington Post in a statement Thursday. "We have resolved the issue with the two customers involved and have taken steps to further improve our processes. We were also in touch on a precautionary basis with the relevant regulatory authorities."
The first man notified Amazon of the improperly shared recordings, according to the report, and Amazon deleted the files from the link it had accidentally sent him. But the privacy violation had already occurred: after receiving the link, the user had downloaded the stranger's audio recordings to his computer.
The incident in Germany follows a widely covered Alexa privacy mishap that occurred much closer to Amazon's home. Earlier this year, a family in Portland, Oregon, discovered that their Alexa-powered Echo device had recorded a private conversation and sent it to a random person in their contacts list. The disturbing event, first reported by Washington state's KIRO 7, went viral, highlighting the risks of keeping an always-on, Internet-connected microphone in one's most intimate spaces. At the time, Amazon described the chain of events as rare and said that "we are evaluating options to make this case even less likely."
The mistake also drew attention to the uncanny readiness with which American consumers have accepted microphone-linked voice assistants into their homes, and to how the devices actually work.
As The Post has written, voice-based devices like Amazon's Echo and Google Home are always "awake," passively listening for the commands that activate them. A user can mute the devices, review past recordings and delete them. Google and Amazon keep a copy of every conversation.
Amazon has not disclosed how many times users have improperly been granted access to another person's recordings. The company said it apologized to the person who received the audio files and to the person whose conversations were accidentally shared.
Amazon's chief executive, Jeff Bezos, owns The Washington Post.