Amazon, Google smart speakers can be hacked by laser 'light commands': Report

Amazon and Google spokespersons reportedly said that they are reviewing this research paper.


Recent research revealed that Alexa- and Google Assistant-powered smart speakers can be abused to trick users into giving away personal data and passwords. Now another piece of research has come to light suggesting these speakers can be hacked with "laser light commands".

As per a report by Wired, by modulating the intensity of a laser beam at audio frequencies, hackers can send commands to a smart speaker's microphone. The microphone interprets these signals as normal voice commands, leaving the devices vulnerable to attackers. The researchers call these laser signals "light commands".
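The core idea, as the report describes it, is that the attacker's audio command rides on the laser's intensity. A minimal sketch of that kind of amplitude modulation is below; the function name, bias, and depth values are illustrative assumptions, not details from the paper.

```python
import math

def amplitude_modulate(audio, dc_bias=0.5, depth=0.4):
    """Map an audio waveform (samples in [-1, 1]) onto a laser
    intensity signal. Hypothetical illustration: a DC bias keeps the
    intensity non-negative, and the audio samples modulate the
    intensity around that bias at audio frequencies."""
    return [dc_bias + depth * s for s in audio]

# A toy 1 kHz tone standing in for a spoken command, sampled at 16 kHz
sample_rate = 16_000
tone = [math.sin(2 * math.pi * 1000 * n / sample_rate) for n in range(160)]

intensity = amplitude_modulate(tone)

# The modulated intensity stays physically valid (never negative)
assert all(0.0 <= i <= 1.0 for i in intensity)
```

A MEMS microphone struck by such a beam would, per the researchers, respond to these intensity variations as if they were sound pressure waves.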

Researchers define light commands as "a vulnerability of MEMS microphones that allows attackers to remotely inject inaudible and invisible commands into voice assistants, such as Google Assistant, Amazon Alexa, Facebook Portal, and Apple Siri using light."

The study found that these signals can work even when the attacker is in a different building.

The researchers demonstrated the attack in a video.

The researchers suggest that smart speaker manufacturers should put a light shield in front of the microphone to prevent this kind of attack.

As per the report, Amazon and Google spokespersons said they are reviewing the research paper. Apple reportedly declined to comment, and Facebook did not respond before the report was published.

Although setting up specialised equipment to modulate the laser takes effort, and a potential attacker would need technical expertise to orchestrate the entire attack, it is nevertheless possible.

In short, smart speakers can be exposed to hackers by nothing more than a laser beam.