As if listening to you have sex wasn’t bad enough, Siri and Alexa can also be hijacked by LASERS, researchers find
Voice-activated digital assistants can be remotely hijacked by lasers as far as 350 feet away and made to order products, start cars, and otherwise drive a smart-home owner crazy, researchers have discovered.
Google Home, Amazon’s Alexa, and Apple’s Siri can be remotely hijacked from hundreds of feet away with lasers pointed at their microphones, researchers at the University of Michigan and the University of Electro-Communications in Tokyo found. The takeover is instantaneous and silent – a well-placed command to turn the device’s volume down to zero would ensure that even its spoken responses could go unnoticed by its hapless owner.
Researchers were able to open garage doors, crack “smart” locks, make online purchases, and even unlock and start vehicles using carefully aimed lasers. Any system connected to the device can be controlled through this relatively simple mode of attack. Because the microphones on voice assistants work by converting sound into electrical signals, modulating a laser’s intensity with the waveform of a voice command produces the same electrical signal – and triggers the same response – as if the command had been spoken aloud.
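The encoding described above is essentially amplitude modulation. A minimal conceptual sketch (not the researchers’ actual tooling – the function name, bias, and depth values here are illustrative assumptions) of mapping a command waveform onto a laser drive signal might look like this:

```python
import numpy as np

SAMPLE_RATE = 44_100  # Hz, a typical audio sampling rate

def amplitude_modulate(command_audio, bias=0.5, depth=0.4):
    """Map a voice-command waveform (values in [-1, 1]) onto a laser
    intensity signal: a constant DC bias keeps the diode lit, and the
    audio fluctuates the intensity around that bias. The microphone's
    diaphragm responds to the flickering light as if it were sound."""
    audio = np.clip(np.asarray(command_audio, dtype=float), -1.0, 1.0)
    intensity = bias + depth * audio
    return np.clip(intensity, 0.0, 1.0)  # normalized diode drive, 0..1

# Illustrative stand-in for a spoken command: one second of a 440 Hz tone.
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
fake_command = np.sin(2 * np.pi * 440 * t)
drive = amplitude_modulate(fake_command)
```

The drive signal never goes negative – a laser can only shine brighter or dimmer, never “anti-shine” – which is why the DC bias is needed.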
Using a telephoto lens to focus the laser, they were able to shanghai devices in other buildings – a Google Home coughed up the time from about 250 feet (75 meters) away with a laser projected diagonally downward at a 21-degree angle. Even with a low-power five-milliwatt laser, a Google Home and early-model Amazon Echo could be ordered around from nearly 360 feet (110 meters) away. The researchers emphasized the accessibility of the setup – any determined device-hacker could throw something together for a few hundred dollars with commercially available parts.
As the high-tech “smart home” is increasingly controlled by voice commands issued through devices like Google Home or Alexa, it becomes enormously susceptible to outside attacks – to say nothing of the surveillance possibilities. The researchers did the ethical thing and warned device manufacturers including Amazon, Apple, Google – even Tesla and Ford, whose cars could be remotely controlled – of the vulnerability, but it’s not the first flaw in these supposedly smart assistants, and it surely won’t be the last.
In 2017, Chinese researchers found that every voice assistant they tested could be hijacked with ultrasonic commands pitched above 20,000 Hz – frequencies inaudible to humans. It’s not clear whether that vulnerability was ever fixed – filtering out those frequencies may not be possible with current microphone design – but unlike the laser hack, the ultrasonic hack only worked in close proximity to the device. Another attack hides commands inside other sounds that are incomprehensible to humans but easily understood by the devices.
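The ultrasonic trick works by shifting an audible command up onto an inaudible carrier; nonlinearity in the microphone’s hardware then demodulates the envelope inside the device. A hedged sketch of that modulation step (the carrier frequency and sample rate are illustrative assumptions, and a real attack needs ultrasonic transducers to play it):

```python
import numpy as np

SAMPLE_RATE = 192_000  # Hz; high enough to represent an ultrasonic carrier
CARRIER_HZ = 25_000    # above the ~20,000 Hz limit of human hearing

def ultrasonic_am(command_audio, carrier_hz=CARRIER_HZ, rate=SAMPLE_RATE):
    """Classic amplitude modulation: ride the audible command on an
    inaudible carrier. Nonlinearity in the microphone's amplifier
    recovers the envelope - the command - while humans hear nothing."""
    audio = np.clip(np.asarray(command_audio, dtype=float), -1.0, 1.0)
    t = np.arange(audio.size) / rate
    carrier = np.sin(2 * np.pi * carrier_hz * t)
    return (1.0 + audio) / 2.0 * carrier  # envelope carries the command

# Stand-in for speech: one second of a 300 Hz tone.
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
command = np.sin(2 * np.pi * 300 * t)
signal = ultrasonic_am(command)
```

All of the transmitted energy sits around 25 kHz (the carrier plus sidebands at ±300 Hz), which is why a person standing next to the speaker hears nothing at all.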
Meanwhile, even when the devices are working perfectly without interference by sound- and light-hackers, they are still piping owners’ intimate moments to teams of humans, whose official purpose is to evaluate the performance of the artificial intelligence powering the smart speakers. Those teams have infamously been caught swapping entertaining recordings among themselves. Amazon began letting users opt out of human review of Alexa recordings earlier this year, though Apple recently reinstated human review of Siri recordings as the default setting and Google only paused human review in some European countries.