Critics of “smart assistant” devices such as the Amazon Echo and Google Home often talk about the potential risks of having a device in your house that is always listening to your conversations, and some dismiss this kind of criticism as fear-mongering or unjustified techno-panic. But Amazon has provided even skeptical users with an object lesson in the risks of this new kind of hardware: The company admitted on Thursday that one of its Echo units recorded a conversation and then sent that recording to someone on a user’s contact list.
How could this happen? According to the company, it was the result of a series of misunderstandings between the device and its owners. While the owners were having a conversation, the Echo misunderstood what they were saying and thought it heard the pre-programmed “wake word” — usually the name Alexa — followed by a command to send a message to a friend. The owners only became aware of what had happened when that friend told them he had received a copy of their conversation. Here’s how Amazon described it:
Echo woke up due to a word in background conversation sounding like “Alexa.” Then, the subsequent conversation was heard as a “send message” request. At which point, Alexa said out loud “To whom?” At which point, the background conversation was interpreted as a name in the customer’s contact list. Alexa then asked out loud, “[contact name], right?” Alexa then interpreted background conversation as “right.” As unlikely as this string of events is, we are evaluating options to make this case even less likely.
If this sounds wildly improbable to you, you’re not the only one. The company insists, however, that this is what happened, and that this kind of occurrence is extremely rare. But is it? Vanity Fair writer Maya Kosoff describes how something similar almost happened to her, when her Alexa misinterpreted audio from a TV show she was watching as a command to send a message and repeatedly asked “To whom?” Kosoff says she subsequently shut the device off. Others have reported similar incidents in which their devices appeared to misinterpret background noise as a command.
Washington Post writer Geoffrey Fowler pointed out that the Amazon Echo doesn’t have a voice command to stop listening (although both it and Google Home have buttons that can turn off the listening function). One way owners can keep track of what the device is doing, Fowler says, is to go into the settings and review what has already been recorded: the Echo logs every time it has been triggered and what it recorded as a result (this is how Amazon diagnosed what happened).
As these kinds of smart devices become more integrated into our lives, it seems likely that this kind of incident will become more commonplace, even if Amazon insists such accidents are rare. In at least one case already, alleged recordings by an Echo became an issue in a murder trial, when the prosecutor subpoenaed audio from the device in a man’s home in the hope that it might have captured something related to the alleged killing. The case was later dismissed, so the Echo recordings were never introduced.