- in Acoustics, Gear by Bobby Owsinski
Beware: Alexa And Siri Can Hear Inaudible Commands
Just about everyone has a smartphone these days, and more and more households are adding smart speakers. While many love the convenience of the voice commands these smart devices respond to, they also open up a security hole that the general public is only now becoming aware of. It turns out that both Alexa and Siri are susceptible to inaudible commands outside the range of human hearing.
By now everyone has heard about the family in Portland whose conversation was recorded and sent to someone on their contact list without their knowledge. That's scary enough on its own; Amazon has since pointed to a series of unintentional commands in the conversation that Alexa misinterpreted. But what happens if it's intentional?
It turns out that researchers in both the U.S. and China have been looking into this for some time. A group of students from the University of California, Berkeley, and Georgetown University showed in 2016 that they could hide commands in white noise played over loudspeakers and through YouTube videos to get smart devices to turn on airplane mode or open a website. Some of those same researchers recently published a research paper describing how they could embed commands directly into recordings of music or spoken text.
Researchers at Princeton University and China's Zhejiang University recently showed that voice-recognition systems can be activated with high-frequency audio beyond human hearing. The technique, called DolphinAttack (see the video below), first mutes the phone so the owner can't hear the system's responses, then instructs the smart device to do anything from visiting a malicious website to making a phone call, taking a picture, or sending a text message.
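The post doesn't go into the mechanics, but the core idea reported for DolphinAttack is amplitude-modulating an ordinary voice command onto an ultrasonic carrier; the microphone's own nonlinearity then demodulates it back into the audible band, where the assistant's recognizer picks it up. Below is a minimal Python sketch of that modulation step only. The file names, the 25 kHz carrier, and the 96 kHz output rate are illustrative assumptions, not details from the research.

```python
import numpy as np
from scipy.io import wavfile

# Sketch: amplitude-modulate a recorded voice command onto an ultrasonic
# carrier. The result is inaudible to people, but a microphone's
# nonlinearity can shift the command back into the audible band.
# File names and rates below are assumptions for illustration.

fs_out = 96_000       # output rate high enough to represent a 25 kHz carrier
carrier_hz = 25_000   # ultrasonic carrier frequency (above human hearing)

# Load a recorded voice command (assumed to be a mono WAV file).
fs_cmd, command = wavfile.read("voice_command.wav")
command = command.astype(np.float64)
command /= np.max(np.abs(command))            # normalize to [-1, 1]

# Resample the command to the output rate (simple linear interpolation).
n_out = int(len(command) * fs_out / fs_cmd)
t = np.linspace(0, len(command) / fs_cmd, n_out, endpoint=False)
command_up = np.interp(t, np.arange(len(command)) / fs_cmd, command)

# Standard AM: carrier plus command-modulated sidebands.
carrier = np.cos(2 * np.pi * carrier_hz * t)
modulated = (1.0 + 0.8 * command_up) * carrier
modulated /= np.max(np.abs(modulated))

wavfile.write("ultrasonic_command.wav", fs_out, (modulated * 32767).astype(np.int16))
```

Playing the resulting file obviously requires a speaker and playback chain that can actually reproduce frequencies that high, which is part of why the attack has to happen at close range.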
The good news is that because ultrasonic frequencies are being used, the attack only works when the transmitting device is close to the receiving device. Still, it's pretty scary. Check it out below.