
Recent research has revealed a serious weakness in the security of smart assistants such as Siri, Google Assistant, and Amazon Alexa. It turns out these devices can accept hidden commands their owners may never notice, which raises serious concerns about the sensitive data these assistants have access to.
Smart speakers respond to commands we can’t hear
Researchers set out to test the security of several smart assistants and found that the devices can pick up commands that humans are incapable of detecting. These results are, of course, worrying for our privacy.
An initial study came from student researchers at UC Berkeley and Georgetown University. They hid spoken commands inside white noise and used them to communicate with the assistants, getting the devices to open websites and switch to airplane mode.
How can white noise hide a spoken command?
White noise contains energy across the whole range of frequencies our ears can perceive, so it can mask other sounds mixed into it. When such a recording is played near a smart speaker, the device can still pick out the hidden commands even though we hear only static. The next step of the research, however, was even more worrying.
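To see why a louder broadband signal can drown out a quieter one for human listeners, here is a minimal Python sketch. It is only an illustration of the masking idea, not the researchers’ method: the 440 Hz tone standing in for a spoken command, the amplitudes, and the sample rate are all assumed values. It mixes a quiet “command” under much louder white noise and reports how far below the noise floor the command sits.

```python
import numpy as np

# Illustrative sketch of audio masking: a quiet "command" signal is mixed
# under much louder white noise. All parameters below (sample rate,
# amplitudes, the 440 Hz tone standing in for a spoken command) are
# assumptions for demonstration, not details from the studies.

SAMPLE_RATE = 16_000   # samples per second
DURATION = 2.0         # seconds
t = np.linspace(0.0, DURATION, int(SAMPLE_RATE * DURATION), endpoint=False)

# Stand-in for a spoken command: a low-amplitude 440 Hz tone.
command = 0.05 * np.sin(2 * np.pi * 440 * t)

# White noise with roughly equal average power across frequencies,
# much louder than the embedded command, so a listener mostly hears static.
noise = 0.5 * np.random.randn(len(t))

mixture = command + noise

# Command-to-noise ratio in dB: strongly negative here, meaning the
# command is buried well below the noise floor.
snr_db = 10 * np.log10(np.mean(command**2) / np.mean(noise**2))
print(f"Command-to-noise ratio: {snr_db:.1f} dB")
```

The point the studies make is that a machine listener can still react to content a person perceives only as static.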
In a follow-up study, the same researchers showed they could camouflage the hidden commands inside ordinary audio files. This raises even bigger security concerns: while you’re listening to a song or a podcast, your smart assistant could be instructed to make purchases or change security settings.
So far, there is no evidence that hidden commands have been used against smart assistants outside the laboratory, but the technique clearly works. It would take little more for an attacker to take over a smart speaker and make it perform malicious actions, and these studies underline the need for stronger security on these devices.