The vulnerability affects all voice-activated devices, including Amazon Echo and Google Home speakers, the Facebook Portal Mini, some smartphones, and the sixth-generation iPad. Researchers warn that the real danger lies in the linked data, such as a bank card number.


Researchers found that smart speakers could be hacked with a laser.


A group of researchers discovered a new security risk in smart speakers: they can be manipulated with a laser from up to 50 meters away instead of by the human voice.


In the past, Google, Apple, and Amazon admitted that they listened to users' private conversations through their smart speakers.



Although it does not sell such a device, Facebook also acknowledged that it hired external staff to listen to fragments of recordings sent through Messenger.


Now, researchers have discovered another danger of owning a smart speaker. If a person points a laser at a device that responds to voice commands, they can issue orders such as opening a garage door or making an online purchase.


Wired, a technology magazine, released the information on November 4.




It was discovered when Takeshi Sugawara, a cybersecurity researcher, pointed a high-power laser at the microphone of his iPad and noticed that sound came through the connected headphones. He showed this experiment to Kevin Fu, a professor at the University of Michigan, in the spring of 2018.


Both joined a group of researchers at the University of Michigan and the University of Electro-Communications in Tokyo, Japan, where they perfected the procedure and discovered how susceptible the devices were to manipulation using the laser.


This way of giving orders to a device is being called "light commands," and it works at a distance of up to 50 meters on smart speakers, a maximum of 10 meters on an iPhone, and four meters on an Android phone.


For this reason, phones would be harder to hack than smart speakers, because the attacker would need to be close to the device. Also, assistants such as Siri only respond when the phone is unlocked or when they recognize the owner's voice.


Smart speakers, which are more exposed, would have to be near a window through which the laser can easily pass. And even in those cases, deactivating an alarm system or remotely starting a vehicle requires a spoken PIN.


The risk is that the device owner may not notice the attack: the attacker can use an infrared laser, which is invisible to the naked eye, and can avoid being overheard by first asking the speaker to lower its volume to zero or activate "whisper mode."



The discovery itself is that the microphones of voice-controlled devices interpret the light as a sound command. The researchers spent several months tuning the laser so that its intensity modulation matched the frequencies of a human voice.
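The idea of matching laser intensity to voice frequencies is essentially amplitude modulation. The sketch below illustrates that concept only; the sample rate, bias, and modulation depth are illustrative assumptions, not the researchers' actual setup.

```python
import math

# Illustrative sketch of the "light commands" idea: a voice-band signal
# is encoded as variations in laser intensity (amplitude modulation),
# which a microphone can misinterpret as sound. Values are assumptions.

SAMPLE_RATE = 8_000        # samples per second, enough for the voice band
LASER_BIAS = 0.5           # steady-state laser intensity (scale 0..1)
MODULATION_DEPTH = 0.4     # how strongly the signal sways the intensity

def voice_sample(t: float, freq: float = 200.0) -> float:
    """A stand-in 'voice' signal: a 200 Hz tone in the range [-1, 1]."""
    return math.sin(2 * math.pi * freq * t)

def laser_intensity(t: float) -> float:
    """Map the voice signal onto laser intensity around a bias point."""
    level = LASER_BIAS + MODULATION_DEPTH * voice_sample(t)
    return min(1.0, max(0.0, level))   # clamp to the laser's valid range

# One 10-millisecond frame of the modulated intensity signal
frame = [laser_intensity(n / SAMPLE_RATE) for n in range(SAMPLE_RATE // 100)]
```

Because the intensity stays within the laser's physical range while oscillating at a voice frequency, a microphone diaphragm (or its circuitry) that responds to the light ends up producing the same electrical signal a spoken tone would.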


The experts consulted by Wired were clear: even if someone used this technique to hack a smart speaker, the danger lies in the linked data, such as a bank card number.


Google and Amazon told the magazine that they would review the research and that protecting their users is a priority. Apple declined to comment, and Facebook did not respond in time for the publication of the article.
