- title: Google Home Silently Captures Recordings of Domestic Violence and More
- url: https://threatpost.com/google-home-recordings-domestic-violence/146424/
- hash_url: 96ee2f98aa1d8e683d8a95ac9cc4e078
-
- <p>Google is under fire after a report found that Google Home and Google Assistant record user audio, even when no wake-up word is used.</p>
-
- <p>Google Home smart speakers and the Google Assistant virtual assistant have been caught eavesdropping without permission, capturing and recording highly personal audio of domestic violence, confidential business calls and even users asking their smart speakers to play porn on their connected mobile devices.</p>
- <p>In a Wednesday report, Dutch news outlet VRT NWS said it obtained more than one thousand recordings from a Dutch subcontractor who was hired as a “language reviewer” to transcribe audio collected by Google Home and Google Assistant and help Google better understand the accents used in the language. Of those recordings, 153 captured conversations that should never have been recorded, as the wake-up command “OK Google” was clearly not given, the report said.</p>
- <p>“VRT NWS was able to listen to more than a thousand excerpts recorded via Google Assistant,” according to the <a href="https://www.vrt.be/vrtnws/en/2019/07/10/google-employees-are-eavesdropping-even-in-flemish-living-rooms/" target="_blank" rel="noopener noreferrer">report</a>. “In these recordings we could clearly hear addresses and other sensitive information. This made it easy for us to find the people involved and confront them with the audio recordings.”</p>
- <p><a href="https://threatpost.com/newsletter-sign/"><img class="aligncenter wp-image-141989 size-full" src="https://media.threatpost.com/wp-content/uploads/sites/103/2019/02/19151457/subscribe2.jpg" alt="" width="700" height="50" /></a></p>
- <p>Google for its part on <a href="https://www.blog.google/products/assistant/more-information-about-our-processes-safeguard-speech-data/" target="_blank" rel="noopener noreferrer">Thursday</a> acknowledged that VRT NWS had obtained authentic recordings, but argued that its language experts only review around 0.2 percent of all audio snippets.</p>
- <p>“As part of our work to develop speech technology for more languages, we partner with language experts around the world who understand the nuances and accents of a specific language,” according to David Monsees, product manager at Google Search. “These language experts review and transcribe a small set of queries to help us better understand those languages. This is a critical part of the process of building speech technology, and is necessary to creating products like the Google Assistant.”</p>
- <p>Google did acknowledge that Google Home or Google Assistant may sometimes record audio in error, without the wake-up word being said.</p>
- <p>“Rarely, devices that have the Google Assistant built in may experience what we call a ‘false accept,’” said Monsees. “This means that there was some noise or words in the background that our software interpreted to be the hotword (like ‘OK Google’). We have a number of protections in place to prevent false accepts from occurring in your home.”</p>
- <p>Google also argued that audio snippets are not associated with user accounts in the review process. Despite that, VRT NWS said, “it doesn’t take a rocket scientist to recover someone’s identity; you simply have to listen carefully to what is being said.”</p>
- <p>While the incident shows that audio is being collected when users expect the devices to be dormant, it also highlights concerns around third-party security and Google’s data retention and sharing policies, given that a subcontractor leaked these recordings to a news outlet.</p>
- <p>Monsees said the subcontractor is currently under investigation.</p>
- <p>“We just learned that one of these language reviewers has violated our data security policies by leaking confidential Dutch audio data,” he said. “Our security and privacy response teams have been activated on this issue, are investigating, and we will take action. We are conducting a full review of our safeguards in this space to prevent misconduct like this from happening again.”</p>
- <h2 id="19360" class="rel_link" title="Voice Assistant Data Privacy?">Voice Assistant Data Privacy?</h2>
- <p>The incident comes as voice assistants such as Amazon Alexa and Google Home face increased scrutiny over how much data is being collected, what that data is, how long it is retained and who can access it.</p>
- <p>In April, Amazon was thrust into the spotlight for a similar reason, after a report <a href="https://threatpost.com/amazon-auditors-listen-to-echo-recordings-report-says/143696/" target="_blank" rel="noopener noreferrer">revealed</a> the company employs thousands of auditors to listen to Echo users’ voice recordings. The report found that Amazon reviewers sift through up to 1,000 Alexa audio clips per shift – listening in on everything from mundane conversations to people singing in the shower, and even recordings that are upsetting and potentially criminal, like a child screaming for help or a sexual assault.</p>
- <p>Voice assistants are increasingly being criticized for how they handle private data. In July, Amazon <a href="https://threatpost.com/amazon-admits-alexa-voice-recordings-saved-indefinitely/146225/" target="_blank" rel="noopener noreferrer">came under fire</a> again after acknowledging that it retains the voice recordings and transcripts of customers’ interactions with its Alexa voice assistant indefinitely, raising questions about how long companies should be able to save highly personal data collected from voice-assistant devices. And last year, Amazon <a href="https://threatpost.com/amazon-1700-alexa-voice-recordings/140201/">inadvertently</a> sent 1,700 audio files containing recordings of a customer’s Alexa interactions to a random person, and later characterized it as a “mishap” that came down to one employee’s mistake.</p>
- <p>Security experts for their part said that voice assistants will soon be a big focus for regulators in light of laws like the General Data Protection Regulation (GDPR).</p>
- <p>“I definitely think that we’re going to have, let’s say within the next 15 months, a GDPR ruling on the data-collection policy of home-automation devices,” Tim Mackey, principal security strategist at the cybersecurity research center at Synopsys, told Threatpost. “Voice assistants will probably be high on the list, as would things like video doorbells. And effectively it’s going to be a case of what was disclosed and how was the information processed.”</p>