Apple Inc. (NASDAQ: AAPL), Amazon.com Inc. (NASDAQ: AMZN), and Alphabet Inc.’s Google (NASDAQ: GOOG) have all decided to modify the way voice recordings from their digital assistants are reviewed. Many smart-speaker owners didn’t realize that Siri, Alexa, and Google Assistant keep recordings of everything they hear after their so-called “wake word” is uttered. Separate reports in the past month revealed that the companies were allowing humans to listen to these recordings, which sometimes included private conversations. All three companies have now changed their practices to ease users’ concerns.
Apple has gone the farthest by pausing human review of digital assistant recordings entirely across the globe. According to an Apple whistleblower, contractors hired to determine the accuracy of the digital assistant regularly overheard conversations about doctor’s appointments, drug deals, and even couples having sex. The recordings were accompanied by user data showing location, contact details, and app data.
Apple said it would suspend the global analysis of those voice recordings while it reviewed the grading system. The Apple Watch and the HomePod, a smart speaker, appeared to be especially prone to accidental activation. Cat Franklin, an Apple spokeswoman, said in an email, “We are committed to delivering a great Siri experience while protecting user privacy.” Users will be able to opt out of reviews after a future software update.
Google has paused human review of digital assistant recordings in the EU; it had already changed its defaults last year so that the Assistant no longer automatically records what it hears after the prompt “Hey, Google.” Google previously disclosed that some of its contractors listen to recordings of what people say to Google Assistant to help improve support for more languages, accents, and dialects. However, many of the recordings resulted from accidental activations and contained personally identifiable data, including addresses, names, and other private information.
Amazon has elected to provide an opt-out setting that lets Alexa users prevent their recordings from being reviewed by humans. Earlier language made it appear that unchecking a box turned off uploading of voice recordings, which wasn’t really the case. The new setting means that these recordings will not be reviewed by humans at all. Amazon will still store recordings of the user’s voice by default; to delete them, users will need to go into their Alexa settings periodically and remove the recordings manually.