Whether it’s an iPhone, a Samsung device or a Google smart speaker, they all now share a common vulnerability: cybersecurity researchers have managed to control them remotely with the help of a laser.
Apple, Samsung and Google have one thing in common: digital assistants. Siri, Bixby and Assistant, respectively, take orders from you and carry out various instructions. But what if someone else could control those assistants? That is exactly what a Japanese-American team of cybersecurity researchers has managed to do, using a laser.
Their starting point was the smart home: with enough connected gadgets and sensors, an entire house can be controlled through a digital assistant.
The researchers used a laser modulated to reproduce the vibrations that sound would naturally generate. With it, they tricked Google Assistant, Siri, Bixby and Amazon Alexa into taking orders, and from a considerable distance. When the laser beam hits the microphone on a device, the microphone picks up the modulated signal and interprets it as a voice command.

They demonstrated a worst-case scenario: the digital assistant was convinced to open a garage door. The laser beam was modulated with a recording of the spoken command "OK Google, open the garage door," and the device obeyed. And the problem is not limited to smart speakers such as Google Home, Facebook's Portal, Apple's HomePod or Amazon's Echo.
The researchers also reproduced the attack on an iPhone XR, a Google Pixel 2, a Samsung Galaxy S9 and a sixth-generation iPad. The upside, if it can be called that, is that phones have smaller microphones, so the attack distance shrinks to between five and 20 meters.
The vulnerability lies both in the assistants' software and in the hardware itself, namely the microphones. The laser makes the microphone's diaphragm vibrate, and that vibration is interpreted as a voice command. Following the experiment, Tesla, Ford, Amazon, Apple and Google were informed of the problem.
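The core trick can be illustrated with a toy simulation. This is not the researchers' actual code, and the microphone model here is a deliberate oversimplification (all names and parameters are illustrative): the attacker amplitude-modulates the laser's intensity with the voice-command waveform, and the microphone's diaphragm responds roughly in proportion to the incident light, so the "recorded" signal ends up tracking the original command.

```python
import numpy as np

# Simplified sketch of the laser injection idea:
# 1) take a voice-command waveform,
# 2) amplitude-modulate a laser's intensity with it,
# 3) model the microphone diaphragm as responding proportionally
#    to intensity, with the DC offset removed by AC coupling.

fs = 16_000                                  # sample rate in Hz (illustrative)
t = np.arange(0, 0.05, 1 / fs)               # 50 ms of signal
voice = 0.6 * np.sin(2 * np.pi * 440 * t)    # stand-in for a spoken command

# Amplitude modulation: bias the laser so intensity never goes negative.
bias = 1.0
laser_intensity = bias * (1 + voice)         # normalized optical power

# Diaphragm response (toy model): proportional to intensity,
# minus the DC component the microphone's coupling filters out.
recovered = laser_intensity - laser_intensity.mean()

# The recovered signal is essentially a scaled copy of the injected command.
correlation = np.corrcoef(voice, recovered)[0, 1]
```

In this toy model the correlation between the injected command and the recovered signal is essentially 1.0, which is why the assistant cannot tell light-induced vibration apart from real speech.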
Even worse, in a separate experiment, researchers at the University of California, Berkeley managed to embed commands for digital assistants inside music or inside what sounds like ordinary conversation. Those commands are inaudible to the human ear, yet the devices' microphones pick them up.