Published: Fri, May 11, 2018
Sci-tech | By Brandy Patterson

Alexa Can Hear Commands You Can't, Which Hackers Could Exploit

A few months later, the animated series South Park followed up with an entire episode built around voice commands that caused viewers' voice-recognition assistants to parrot adolescent obscenities. The commands, undetectable to the human ear, could prove troublesome in the wrong hands, according to The New York Times. In laboratory conditions, research teams have been able to secretly activate the AI systems on smart speakers and smartphones, making them open websites or dial phone numbers.

Researchers in China last year demonstrated that ultrasonic transmissions could trigger popular voice assistants such as Siri or Alexa, in a method known as "DolphinAttack".
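
The ultrasonic trick can be illustrated with a short sketch. The snippet below is a simplified illustration, not the researchers' actual code; the 25 kHz carrier, sample rate, and stand-in tone are assumptions for the example. It amplitude-modulates an audible signal onto an inaudible carrier, which is the core idea behind DolphinAttack: nonlinearity in a microphone's hardware can demodulate such a signal back into the audible band, even though a person standing nearby hears nothing.

```python
import numpy as np

SAMPLE_RATE = 192_000  # high rate needed just to represent ultrasonic frequencies
CARRIER_HZ = 25_000    # above the ~20 kHz upper limit of human hearing

def modulate_ultrasonic(command: np.ndarray, rate: int = SAMPLE_RATE) -> np.ndarray:
    """Amplitude-modulate an audible command onto an ultrasonic carrier.

    The transmitted signal contains no energy in the audible band; a
    microphone's nonlinear response can recover the original command.
    """
    t = np.arange(len(command)) / rate
    carrier = np.cos(2 * np.pi * CARRIER_HZ * t)
    # Standard AM: shifts the command's spectrum up around the carrier.
    return (1.0 + command) * carrier

# Stand-in for a recorded voice command: one second of a 400 Hz tone.
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
command = 0.5 * np.sin(2 * np.pi * 400 * t)
signal = modulate_ultrasonic(command)

# All of the transmitted energy now sits near 25 kHz, outside the audible band.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), 1 / SAMPLE_RATE)
peak = freqs[np.argmax(spectrum)]
```

In a real DolphinAttack the "command" is recorded speech and the demodulation happens unintentionally inside the target device's microphone circuitry; the sketch only shows why the transmission itself is inaudible.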

"We wanted to see if we could make it even more stealthy", said Nicholas Carlini, a fifth-year PhD student in computer security at UC Berkeley and one of the paper's authors. "My assumption is that the malicious people already employ people to do what I do".

While the commands may go unheard by humans, the audio can be picked up, recovered and then interpreted by speech recognition systems. The researchers were able to hide the command "OK Google, browse to" in a recording of the spoken sentence "Without the dataset, the article is useless." Humans cannot detect the hidden command.

The Berkeley group also embedded the command in music files, including a four-second clip from Verdi's "Requiem".

These subliminal techniques are created to "exploit the gap" between recognizing speech from humans and machines, the Times notes.

Researchers in China and the United States have discovered that Apple's Siri, Amazon's Alexa, and Google's Assistant can be controlled by hidden commands undetectable to the human ear, The New York Times reports.

The researchers are said to have made slight changes to the original audio files, canceling out the sound that speech recognition systems (including Mozilla's open-source DeepSpeech speech-to-text engine) would normally detect and replacing it with audio that machines transcribe differently.
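
The idea can be sketched with a toy example. The code below is a conceptual illustration only, not the Berkeley team's actual method, which optimizes a perturbation against DeepSpeech's full neural network: here a trivial nearest-template "recognizer" stands in for the real model, and a benign audio vector is nudged just far enough along a straight line that the transcription flips to the attacker's command.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a speech recognizer: nearest-template matching over two
# possible "transcriptions" (a real system is a deep network, but the
# attack concept is the same).
TEMPLATES = {
    "ok google, browse to ...": rng.standard_normal(256),
    "without the dataset, the article is useless": rng.standard_normal(256),
}

def recognize(audio: np.ndarray) -> str:
    """Return whichever template transcription is closest to the audio."""
    return min(TEMPLATES, key=lambda k: np.linalg.norm(audio - TEMPLATES[k]))

def embed_command(audio: np.ndarray, target: str, step: float = 0.05) -> np.ndarray:
    """Add the smallest perturbation (along a straight line toward the
    target template) that makes the recognizer output the target command."""
    direction = TEMPLATES[target] - audio
    direction /= np.linalg.norm(direction)
    adv = audio.copy()
    while recognize(adv) != target:
        adv += step * direction
    return adv

# "Benign" audio that transcribes as the innocent sentence.
clean = TEMPLATES["without the dataset, the article is useless"] \
        + 0.1 * rng.standard_normal(256)
adv = embed_command(clean, "ok google, browse to ...")
```

The real attack additionally constrains the perturbation to stay below the threshold of human perception, which is what makes the embedded command inaudible rather than merely small.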

With this in mind, wrongdoers could potentially play music within "earshot" of a Google Home's microphone in order to command it to open your smart door lock.

Apple has additional security features to prevent the HomePod smart speaker from unlocking doors and requires users to provide extra authentication, such as unlocking their iPhone, in order to access sensitive data.

"We want to demonstrate that it's possible", he said, "and then hope that other people will say, 'O.K. this is possible, now let's try and fix it'".
