Attackers can use audio frequencies beyond human hearing to exploit smart speakers
May 11, 2018, 19:02 IST
- Smart speakers can hear commands inaudible to humans.
- Secret messages can be used to make calls and unlock doors.
- Researchers in China call the technique 'DolphinAttack'.
Research teams in China as well as the US have been able to secretly activate the AI systems on smartphones and smart speakers. Once activated, the university labs have dialed phone numbers and opened websites. However, the same technique could be put to darker uses, such as unlocking doors, wiring money, or shopping online.
Researchers Nicholas Carlini and David Wagner spoke to The New York Times about how they embedded the secret message, “OK Google, browse to evil.com,” in a seemingly harmless sentence as well as in a short clip of Verdi’s ‘Requiem’. Both times, Mozilla’s open-source speech-recognition software was fooled.
Carlini stated, “We want to demonstrate that it’s possible, and then hope that other people will say, ‘Okay this is possible, now let’s try and fix it.’”
Researchers in China call the technique ‘DolphinAttack’. They’ve used it to instruct smart devices to visit malicious sites, initiate calls, take pictures and send messages. The scope of DolphinAttack is currently limited, but researchers at the University of Illinois at Urbana-Champaign have already demonstrated what ultrasound attacks from 25 feet away are capable of. They showed that although the commands could not yet penetrate walls, they could still control smart devices through open windows in buildings.
In response to the article by the American publication, all three companies have assured users that their smart speakers are secure. Specifically with respect to the Apple HomePod, the device has been designed to “prevent commands from doing things like unlocking doors.”
If anyone remembers the Amazon Super Bowl ad from earlier this year, it illustrates that Amazon is already aware of such a capability. The ad was designed so that Alexa speakers in viewers’ homes wouldn't respond, even though “Alexa” was uttered nearly 10 different times during the spot. In fact, Amazon filed a patent back in 2014 called "Audible Command Filtering" that describes different techniques to prevent Alexa from waking up.
Security risks have been a continuous concern when it comes to the Internet of Things (IoT), and this research highlights an issue that could pose a real obstacle in the future. It’s a fair warning to companies designing digital assistants to get out in front of the problem rather than react after the fact.