Researchers from Ruhr-Universität Bochum (RUB) and the Max Planck Institute (MPI) for Security and Privacy in Bochum investigated which words inadvertently trigger voice assistants.

They found that Siri can be inadvertently activated by phrases other than its wake word, including “a city” and “Hey Jerry”.

Siri isn’t the only virtual assistant that activates on false triggers. The study compiled a list of over 1,000 words and phrases that can accidentally activate virtual assistants. In addition to Siri, Amazon’s Alexa, Google Assistant, Microsoft’s Cortana, and others are also mistakenly activated simply by “listening” to certain movies or TV shows.

Alexa, for example, is activated by the words “unacceptable” and “election”, while Google Assistant can be triggered by “OK, cool”.

Dr. Dorothea Kolossa, one of the researchers, said:

“The devices are intentionally programmed to be somewhat forgiving in accepting commands, so that they can understand their users better. As a result, they are more likely to activate once too often rather than not at all.”
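To illustrate the tradeoff Kolossa describes, here is a minimal, hypothetical sketch in Python: a wake-word detector that compares an acoustic-model confidence score against a threshold. The `score_audio` stub and the threshold values are illustrative assumptions, not part of the study or any vendor’s implementation; lowering the threshold makes the assistant “more tolerant,” trading fewer missed activations for more false triggers.

```python
import random

def score_audio(audio_chunk: str) -> float:
    """Hypothetical acoustic model: returns a confidence in [0, 1]
    that the chunk contains the wake word. Stubbed with seeded
    randomness purely for illustration."""
    random.seed(sum(map(ord, audio_chunk)))  # deterministic stub
    return random.random()

def is_wake_word(audio_chunk: str, threshold: float) -> bool:
    # A lower threshold makes the detector more "tolerant": fewer
    # missed activations, but more false triggers on similar-sounding
    # phrases -- the tradeoff described by the researchers.
    return score_audio(audio_chunk) >= threshold

# Phrases from the study that sound vaguely like real wake words.
phrases = ["Hey Siri", "Hey Jerry", "a city", "unacceptable", "OK cool"]
for phrase in phrases:
    print(f"{phrase!r}: strict={is_wake_word(phrase, 0.9)}, "
          f"tolerant={is_wake_word(phrase, 0.4)}")
```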

From a privacy standpoint, the problem with these accidental wake words is that they can result in audio being sent to the assistants’ manufacturers when users do not expect it.
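Commercial assistants generally run a lightweight on-device check first and, once that local check fires, send the following audio to the vendor’s servers for a more accurate second-stage verification; that upload is what turns a false trigger into a privacy issue. The sketch below is a hypothetical outline of such a two-stage flow, with `local_detector` and `upload_for_cloud_verification` as made-up names rather than any vendor’s actual API.

```python
def local_detector(audio: bytes) -> bool:
    """Hypothetical low-power on-device check. Tuned for high recall,
    so it errs on the side of triggering."""
    return b"wake" in audio  # stand-in for a real acoustic model

def upload_for_cloud_verification(audio: bytes) -> bool:
    """Hypothetical server-side re-check. In a real assistant, this is
    the step where recorded audio leaves the device -- even when the
    local trigger was accidental."""
    print(f"Uploading {len(audio)} bytes to the vendor's servers...")
    return False  # pretend the cloud model rejects the false trigger

def handle_microphone_chunk(audio: bytes) -> None:
    if local_detector(audio):
        # Privacy issue: by the time the cloud says "no", a snippet of
        # the surrounding conversation has already been transmitted.
        if upload_for_cloud_verification(audio):
            print("Assistant activated.")
        else:
            print("Cloud rejected the trigger, but audio was uploaded.")

handle_microphone_chunk(b"...wake-word-like private conversation...")
```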

“From a privacy perspective, false triggers are obviously alarming, because sometimes private conversations can be heard by strangers,” said Thorsten Holz, another researcher on the project. “From an engineering point of view, however, this approach is quite understandable, since systems can only be improved by using that data. Manufacturers need to strike a balance between data protection and technical optimization.”

Have you ever activated Siri with a command other than “Hey Siri”? Let us know in the comments below!
