When an Alexa speaker listens in, Amazon employees sometimes do too. But are users sufficiently informed about this?
The US corporation selects some voice recordings for transcription in order to further develop its speech recognition. The company confirmed the practice: “We only annotate an extremely small number of interactions from a random group of customers to improve the user experience.” According to Bloomberg, employees evaluate Alexa recordings at various locations around the world, including Boston, Costa Rica, India and Romania.
According to two employees in Bucharest, they each process as many as 1,000 recordings per shift there. One Boston employee said he analyzed recordings containing the words “Taylor Swift,” for example, and annotated them to indicate that users meant the singer.
“As part of this workflow, employees do not have direct access to information that can identify the individual or account,” Amazon stressed. Bloomberg reported, however, that a screenshot of one such transcription task listed an account number, the user’s first name and the device’s serial number. The company stated that all information is kept strictly confidential and that it additionally relies on access restrictions and encryption.
Customers can have Alexa recordings deleted
Amazon’s public information about Alexa does not yet explicitly state that, under certain circumstances, humans may also listen to the recordings. “For example, we use your commands to Alexa to train our speech recognition and natural language understanding systems,” reads a general statement in a Q&A on an Amazon page. At the same time, users can opt out of having their recordings used to further develop the service, and can delete previous recordings, in the settings.
A woman sings in the shower, sexual assault, a child calls for help
According to Bloomberg, employees are tasked with listening to the voice commands and checking whether Alexa recognized the words correctly. In other cases, they listen to a conversation with the software to assess how well it interacted with the user.
While transcribing Alexa recordings, employees have heard confidential information such as names or bank account details, Bloomberg reported. In those cases, they were instructed to tick a box under the “critical data” menu item and move on to the next recording. Bloomberg also cited other examples of private recordings heard by employees: a woman singing in the shower, for instance, or a child calling for help. Two employees told Bloomberg they also heard what may have been a sexual assault. According to two employees from Romania, they were urged not to intervene in such situations, Bloomberg wrote.
Accidental recordings are also transcribed
Devices running the assistant software, such as Amazon’s Echo speakers, are only supposed to start recording when they hear the specified wake word, such as “Alexa”. From time to time, however, the function is activated by mistake, because the software thinks it has heard the wake word. Employees also transcribe these accidental recordings, Bloomberg reported. According to the employees, up to 100 such recordings are worked through per day.
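The mechanism behind such accidental recordings can be illustrated with a toy simulation. This is purely a sketch: the function names, the lookalike-word rule, and the example stream are invented for illustration, and a real wake-word detector scores acoustic features rather than comparing text. The point is only that any detector with false positives will occasionally arm the microphone and capture audio the user never addressed to it.

```python
def sounds_like_wake_word(utterance: str) -> bool:
    """Toy detector: fires on the wake word or acoustically similar words.
    The set of confusable words here is hypothetical."""
    lookalikes = {"alexa", "alexia", "electra"}
    return utterance.lower() in lookalikes

def capture_sessions(utterances):
    """Record whatever follows each activation, intended or not."""
    recordings = []
    armed = False
    for word in utterances:
        if armed:
            recordings.append(word)  # audio after an activation is captured
            armed = False
        elif sounds_like_wake_word(word):
            armed = True             # wake word (or a lookalike) detected

    return recordings

# "electra" falsely triggers the device, so "private chat" is captured
stream = ["hello", "alexa", "weather", "electra", "private chat", "bye"]
print(capture_sessions(stream))  # ['weather', 'private chat']
```

In this sketch the false trigger on “electra” produces exactly the kind of unintended recording the transcription teams reportedly encounter.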
Amazon initially provided no further information on Thursday. Competitors Apple and Google also did not initially respond to requests for comment on whether they take a similar approach with their assistants Siri and Google Assistant.