Panic stations, or so it seemed this week: we are being listened in on by the big three tech companies Apple, Google, and Amazon. But the truth is a little more subtle. How does it really work?
It was another blow last week: Apple admitted to listening in on Siri conversations. Earlier, Google and Amazon, the makers of Google Assistant and Amazon Alexa, were also found to be eavesdropping on users.
The three companies say they do this purely for quality purposes. Private information is masked or distorted. Yet on rare occasions, recordings capture people having sex, a medical detail being disclosed, or other sensitive information accidentally shared with the smart assistant. It is troubling but, in a way, logical: with millions of voice commands, something will occasionally go wrong. First, though, let's establish three things:
1. The assistants are always listening but do not record continuously
Siri, Alexa, and Assistant are indeed always listening, but they do not record everything you say. On the contrary: the assistants listen for only one thing: the wake word (Hey Siri, OK Google, Alexa).
When it hears one of these words, the assistant wakes up immediately, and recording begins from that moment. This is where the problems start: sometimes the assistant is woken by accident. Those are the cases Apple employees listen to in order to improve Siri's quality. Even the sound of a zipper can wake Siri up.
Either way, the recordings are sent to Apple's or Google's servers, and the assistant sends back a reply. These recordings are saved.
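The listen-then-record flow described above can be sketched roughly as follows. This is an illustrative Python example, not any vendor's actual implementation: the wake-word list, the buffer size, and the detector function are all invented for the sake of the sketch.

```python
# Illustrative sketch of a wake-word loop: the device discards audio
# continuously and starts a real recording only after the wake word is
# detected. All names here are invented for the example.
from collections import deque

WAKE_WORDS = {"hey siri", "ok google", "alexa"}

def is_wake_word(chunk: str) -> bool:
    # Stand-in for an on-device acoustic model.
    return chunk.lower() in WAKE_WORDS

def run_assistant(audio_chunks):
    """Yield only the chunks captured after a wake word was heard."""
    buffer = deque(maxlen=2)          # short rolling buffer, never persisted
    recording = False
    for chunk in audio_chunks:
        buffer.append(chunk)          # overwritten constantly, nothing kept
        if not recording and is_wake_word(chunk):
            recording = True          # wake up: start the actual recording
            continue
        if recording:
            yield chunk               # this is what gets sent to the server

stream = ["zipper noise", "hey siri", "what time is it"]
print(list(run_assistant(stream)))   # only the command after the wake word
```

Note that in this sketch the "zipper noise" is simply discarded; the accidental wake-ups described above happen when the detector mistakenly classifies such a sound as the wake word.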
2. The listening is done for quality purposes
Have you ever called customer service? There is a good chance you have heard the sentence "this conversation may be recorded for quality purposes". The same goes for the assistants. People are needed to analyze wrong answers and reduce accidental wake-ups. The assistants work mainly on the basis of self-learning artificial intelligence, but a human touch is also needed.
A small portion of the failed interactions (less than one percent) is checked by an employee, with the user's personal details blurred or removed. The employee checks what went wrong: was it an accent? Was the question phrased differently? Or was there too much background noise? The findings are then passed on to the developers.
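The review pipeline described above, sampling less than one percent of failed interactions and masking personal details before a human sees them, might look roughly like this. This is a sketch under assumed data fields (`user_id`, `location`, `transcript`); the companies' real pipelines are not public.

```python
# Illustrative sketch: sample a small fraction of failed interactions
# and anonymize them before human review. Field names are assumptions.
import random

def mask_personal_details(interaction: dict) -> dict:
    """Return a copy with identifying fields blanked out."""
    masked = dict(interaction)
    for field in ("user_id", "location"):
        masked[field] = "***"
    return masked

def sample_for_review(failed_interactions, rate=0.01, rng=None):
    """Pick roughly `rate` of failed interactions, anonymized for a reviewer."""
    rng = rng or random.Random()
    return [mask_personal_details(i)
            for i in failed_interactions
            if rng.random() < rate]

failed = [{"user_id": f"u{n}", "location": "Brussels",
           "transcript": "unclear command"} for n in range(1000)]
reviewed = sample_for_review(failed, rate=0.01, rng=random.Random(42))
# Every sampled item keeps the audio transcript but loses the identity.
```

The design point is that the reviewer only ever receives the masked copy; the transcript needed for quality analysis survives, while the fields linking it to a person do not.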
Incidentally, those employees could accidentally recognize someone they know in the recordings, as the Belgian broadcaster VRT demonstrated last month, but that is very rare. Linking that data to a real person is also illegal.
3. The tech companies can become much more transparent
The reason for the fuss is simple: the tech companies are far too secretive about these practices. It is only in the small print that you can read that audio data may be shared with, for example, Google when you use its assistant.
Not reassured? Then you can switch off this "eavesdropping" in the settings. Remarkably enough, that is not possible with Apple. The company often presents itself as a privacy advocate, yet conversations can simply be recorded. Apple does, however, let you disable all further data sharing.