Google Assistant

Google employees allegedly listening to Google Assistant conversations

Google employees are allegedly listening in on users' conversations, including ones that were never meant to be recorded. This happens via the company's AI-powered Google Assistant on phones and Google Home speakers.

Google, in its terms and conditions, states that everything users say to their Google smart speakers and to Google Assistant is recorded and stored. However, it doesn't mention that its employees can listen to excerpts from these recordings.

Google has acknowledged that it listens to these conversations and has provided an in-depth explanation of what it does.

David Monsees, Product Manager for Search at Google, said in a statement:

As part of our work to develop speech technology for more languages, we partner with language experts around the world who understand the nuances and accents of a specific language. These language experts review and transcribe a small set of queries to help us better understand those languages. This is a critical part of the process of building speech technology, and is necessary to creating products like the Google Assistant. 

The company says that Google Assistant sends audio back to Google only after the device detects the user saying "Hey Google", or after the user physically triggers the Assistant to interact with it.

When the Assistant is summoned, the device is said to provide a clear indicator (flashing dots on top of a Google Home, or an on-screen indicator on an Android device) that a conversation has started.
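In outline, this gating works like a switch: audio stays on the device until the hotword fires, and only then do the indicator and the upstream streaming begin. The sketch below illustrates that flow under stated assumptions; the names (detect_hotword, show_indicator, stream_to_server) are hypothetical stand-ins, since Google's actual on-device pipeline is not public.

```python
# A minimal sketch of hotword-gated streaming as the article describes it.
# All function names are hypothetical, not Google's implementation.

def detect_hotword(frame):
    """Stand-in for an on-device hotword model; audio stays local until it fires."""
    return "hey google" in frame.lower()

def show_indicator():
    """Stand-in for the visual cue (flashing dots / on-screen icon)."""
    print("[indicator] conversation started")

def stream_to_server(frame):
    """Only audio captured after the trigger is sent upstream."""
    print("[upload]", repr(frame))

def run(frames):
    listening = False
    for frame in frames:
        if not listening and detect_hotword(frame):
            listening = True
            show_indicator()
        elif listening:
            stream_to_server(frame)

run(["background chatter", "hey google, set a timer", "for ten minutes"])
```

In this model, the "background chatter" frame never leaves the device; only audio after the trigger is uploaded, which is exactly the guarantee the leaked recordings call into question.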

However, leaked recordings show that multiple sensitive conversations were reportedly recorded unintentionally, even though users never called up the virtual assistant. These include bedroom conversations, chats between parents and their children, and phone calls containing a slew of sensitive private information.

In addition, one recording is said to capture a woman who was apparently in evident distress.

Monsees said:

We just learned that one of these language reviewers has violated our data security policies by leaking confidential Dutch audio data. Our Security and Privacy Response teams have been activated on this issue, are investigating, and we will take action. We are conducting a full review of our safeguards in this space to prevent misconduct like this from happening again.

Google clarifies that, “rarely”, devices with Google Assistant built in may experience a “false accept”: noise or words in the background are interpreted as the hotword, invoking the Assistant. Google says it also has “a number of protections” in place to prevent false accepts.
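One common way false accepts arise in hotword systems generally is that the detector outputs a confidence score, and background audio occasionally crosses the trigger threshold. The toy example below illustrates the idea; the scores, phrases, and threshold are invented for illustration and are not Google's.

```python
# Sketch of a "false accept": a similar-sounding phrase scores just above
# the trigger threshold. Values here are invented, not Google's.

TRIGGER_THRESHOLD = 0.85  # assumed value; raising it trades missed triggers for fewer false accepts

def hotword_confidence(frame):
    """Stand-in for a model score; real detectors return a probability."""
    scores = {
        "hey google, play music": 0.97,  # true accept
        "hey, cool gadget": 0.88,        # similar-sounding phrase -> false accept
        "quiet room noise": 0.10,        # correctly ignored
    }
    return scores.get(frame, 0.0)

for frame in ["hey google, play music", "hey, cool gadget", "quiet room noise"]:
    fired = hotword_confidence(frame) >= TRIGGER_THRESHOLD
    print(repr(frame), "->", "TRIGGERED" if fired else "ignored")
```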

“We apply a wide range of safeguards to protect user privacy throughout the entire review process. Language experts only review around 0.2 percent of all audio snippets. Audio snippets are not associated with user accounts as part of the review process, and reviewers are directed not to transcribe background conversations or other noises, and only to transcribe snippets that are directed to Google.”

Google reportedly tries to ensure that voice excerpts are not linked to a user account, making it difficult to trace someone's identity. The company is said to delete the username and replace it with an anonymous serial number.
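In general terms, that is a pseudonymization step: strip the account identity from each snippet and attach an unlinkable serial number before reviewers see it. A minimal sketch of the idea follows; the record layout and function name are assumptions, not Google's internal format.

```python
# Sketch of pseudonymization as described: drop account fields, attach
# an anonymous serial number. Record layout is assumed for illustration.
import uuid

def pseudonymize(snippet):
    """Return a copy of the snippet with identity removed and an
    anonymous serial number substituted."""
    return {
        "serial": uuid.uuid4().hex,  # random ID, not derived from the account
        "audio": snippet["audio"],
        # "account" and "username" are deliberately dropped
    }

record = {"account": "user@example.com", "username": "jane", "audio": b"..."}
print(pseudonymize(record))
```

Because the serial number is random rather than derived from the account, a reviewer holding only the snippet has no direct way to work back to the user, which is the property Google is said to rely on.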