Undetectable Commands for Apple's Siri and Amazon's Alexa Raise Serious Security Risks


As reported by The New York Times, researchers have found ways to send audio commands to smart speakers that are undetectable by the human ear. The paper authored by U.C. Berkeley's Nicholas Carlini and his colleagues is the latest in a string of research papers and demos proving the efficacy of these kinds of "subliminal" attacks on smart speakers and other devices with always-listening assistants. Virtual assistants such as Alexa, Google Assistant and Siri can all receive inaudible commands hidden in music; researchers have used them to make these systems dial phone numbers, open websites, and more. The idea has already creeped a lot of people out, and in the wrong hands the technique could be used to unlock doors, wire money or buy things online, simply with music playing over the radio.
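The Berkeley attack works, roughly, by adding a small, carefully optimized perturbation to an ordinary audio clip so that a speech-recognition model transcribes a hidden command while a listener hears only the music. The sketch below is a toy illustration of that projected-gradient-descent idea, not the paper's actual method: the linear "model", the target, and every parameter value are stand-ins, since the real attack optimizes against a full speech-recognition system.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "model": a fixed random linear map from audio samples to
# transcription logits. The real attack targets a full ASR network;
# this toy only demonstrates the optimization loop.
N_SAMPLES, N_LOGITS = 16_000, 64            # 1 s of 16 kHz audio (illustrative)
W = rng.normal(scale=0.01, size=(N_LOGITS, N_SAMPLES))

def model(audio: np.ndarray) -> np.ndarray:
    return W @ audio                        # toy differentiable "recognizer"

music = rng.normal(scale=0.1, size=N_SAMPLES)   # carrier clip (e.g. a song)
target = rng.normal(size=N_LOGITS)              # logits of the hidden command

delta = np.zeros(N_SAMPLES)                 # adversarial perturbation
EPS, LR = 0.02, 0.1                         # perturbation bound, step size

for _ in range(500):
    residual = model(music + delta) - target
    grad = W.T @ residual                   # gradient of 0.5 * ||residual||^2
    delta -= LR * grad
    delta = np.clip(delta, -EPS, EPS)       # keep the change small enough
                                            # that a person hears only music

adversarial = music + delta                 # music to a human; the hidden
                                            # command to the (toy) model
```

The projection step is the key design choice: the perturbation is pushed toward whatever makes the model output the target, while being clamped to a magnitude small enough to stay unobtrusive to human ears.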

The method is dubbed a "dolphin attack", because the marine mammals can hear sounds that humans cannot. In both of Carlini's demonstrations, Mozilla's open-source DeepSpeech speech-recognition software was fooled.
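A dolphin attack is typically described as amplitude-modulating a recorded voice command onto an ultrasonic carrier: the transmitted signal sits above human hearing, but nonlinearities in a device's microphone circuitry demodulate the envelope back into the audible band, where the assistant's recognizer picks it up. Below is a minimal numpy sketch of just that modulation step; the 25 kHz carrier, 192 kHz sample rate, and function name are illustrative assumptions, not values from the published attack.

```python
import numpy as np

FS = 192_000         # sample rate high enough to represent ultrasound (assumed)
CARRIER_HZ = 25_000  # above human hearing (~20 kHz), illustrative choice

def dolphin_modulate(command: np.ndarray, fs: int = FS,
                     carrier_hz: float = CARRIER_HZ,
                     depth: float = 0.9) -> np.ndarray:
    """Amplitude-modulate a baseband voice command onto an ultrasonic carrier.

    A person hears nothing (or a faint hiss); a microphone's nonlinear
    response recovers the low-frequency envelope, i.e. the command.
    """
    t = np.arange(len(command)) / fs
    carrier = np.sin(2 * np.pi * carrier_hz * t)
    # Normalize the command into [-1, 1] and ride it on a DC offset so the
    # envelope never inverts.
    envelope = 1.0 + depth * command / (np.max(np.abs(command)) + 1e-12)
    return envelope * carrier

# Example with a placeholder waveform standing in for recorded speech.
t = np.arange(int(0.5 * FS)) / FS
command = np.sin(2 * np.pi * 440 * t)       # hypothetical "command" audio
ultrasonic = dolphin_modulate(command)
```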

Carlini went on to note: "We want to demonstrate that it's possible, and then hope that other people will say, 'Okay this is possible, now let's try and fix it'".


Hackers might not care about your shopping list, but considering 41.4 percent of smart speakers are in the kitchen, it's worth asking whether they could be used to turn on an oven while you're out, or secretly start up a video call. Researchers at the University of Illinois demonstrated that ultrasound attacks are possible from 25 feet away. The attacks exploit the gap between what the human ear perceives and what a device's microphone and speech-recognition software pick up.

In a February email to Global News, Amazon explained that purchases made through its smart speakers must be confirmed by customers before being processed. The company designed its ad so that Alexa speakers in viewers' homes wouldn't respond, even though "Alexa" was uttered almost 10 times during the spot. Apple says the HomePod is programmed not to perform certain tasks, such as unlocking a door, and insists Siri on the iPhone and iPad is safe since the device has to be unlocked to execute such commands. In fact, Amazon filed a patent back in 2014 called "Audible Command Filtering" that describes techniques to prevent Alexa from waking up unintentionally.

One of the first things Google showed off at I/O this year was a handful of new voices for Google Assistant, and they're now available.
