Google admitted on Thursday that "language experts" hired by the firm listen to approximately 0.2% of the conversations that users have with its virtual assistant, which implies that some of those interactions are not completely private.
The usual assumption, often reiterated by the companies behind virtual assistants, such as Amazon, Samsung and Apple in addition to Google, is that conversations between a user and their assistant are entirely private and that the interaction occurs exclusively through artificial intelligence; that is, the only ones "listening" to the user are robots.
However, Google's admission on Thursday that 0.2% of these conversations are heard by human beings to ensure and improve the quality of the service sheds light on a practice that companies usually avoid publicizing, although it is known within the industry that, to a greater or lesser extent, it is commonplace.
The revelation came from David Monsees, a search product manager at the Californian company, who posted an entry on Google's official blog in response to a report published yesterday by the Belgian broadcaster VRT NWS, which had gained access to about a thousand recordings of anonymous individuals.
The recordings, in Dutch, were provided to the Belgian broadcaster by one of the "experts" Google had hired in that country to listen to segments of conversations and "understand the particularities and accents of each specific language".
We are heard from all over the world
The firm, which has already announced that it will "take action" over the leak, considering it a "violation" of its data security policies, admitted to having "experts all over the world" whose function is to listen to and transcribe "a small part of the dialogues to help us better understand those languages".
In particular, the Mountain View (California, USA) firm put the percentage of interactions analyzed by humans at 0.2% and guaranteed that these fragments are not associated with user accounts and that the experts are told not to transcribe background sounds or conversations that are not directed at Google.
However, the Belgian broadcaster was able to identify "postal addresses and other sensitive information" in the recordings, which allowed it to contact the people whose voices had been recorded and confirm that it was indeed them.
"A couple from Waasmunster (Belgium) immediately recognized the voice of their son and grandson", they gave as an example from VRT NWS.
Google indicated that the virtual assistant only sends audio recordings once it has detected that the user is interacting with it, after having said, for example, "Hey, Google", and that it has several tools to avoid "false activations", that is, cases in which the software erroneously interprets a sound as the keyword that activates it.
Despite this, VRT NWS reported that of the roughly one thousand voice fragments to which it had access (all of them in Dutch), 153 were conversations in which nobody gave the virtual assistant the activation command; instead, the assistant had erroneously interpreted a sound as one.