A team of academic researchers has tested how reliably the Amazon Alexa and Google Home smart-home assistants distinguish similar-sounding phrases, finding it possible to closely mimic legitimate voice commands in order to carry out nefarious actions.
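
The attack hinges on phrases that sound alike to a speech recognizer. As a rough, hedged illustration of that idea (not the researchers' actual method), the sketch below scores how confusable two invocation phrases are using a simplified Soundex encoding plus plain string similarity; the helper names and example phrases are hypothetical.

```python
# Hypothetical sketch: score how phonetically confusable two voice
# commands are. Uses only the Python standard library; the scoring
# heuristic is illustrative, not the researchers' model.
import difflib

def soundex(word: str) -> str:
    """Simplified Soundex: first letter plus digits grouping similar consonants."""
    codes = {c: str(d) for d, group in enumerate(
        ["BFPV", "CGJKQSXZ", "DT", "L", "MN", "R"], start=1) for c in group}
    word = word.upper()
    out, prev = [], codes.get(word[0], "0")
    for c in word[1:]:
        d = codes.get(c, "0")
        if d != "0" and d != prev:   # skip vowels and collapse repeats
            out.append(d)
        prev = d
    return (word[0] + "".join(out) + "000")[:4]

def confusability(a: str, b: str) -> float:
    """Blend per-word Soundex agreement with raw string similarity."""
    sa, sb = [soundex(w) for w in a.split()], [soundex(w) for w in b.split()]
    phonetic = sum(x == y for x, y in zip(sa, sb)) / max(len(sa), len(sb))
    textual = difflib.SequenceMatcher(None, a.lower(), b.lower()).ratio()
    return 0.5 * phonetic + 0.5 * textual

# A legitimate command versus a hypothetical sound-alike an attacker might register.
print(confusability("open capital one", "open capitol one"))  # close to 1.0
```

A phrase that scores near the top of such a ranking could be triggered when the user intends the legitimate command.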

Prior research also shows that adversaries can generate obfuscated voice commands to spy on users or gather data. DolphinAttack, for instance, is a method that uses completely inaudible ultrasound signals to attack speech-recognition systems and transmit harmful instructions to popular voice assistants such as Siri, Google Assistant, Cortana, and Alexa. And in November, security firm Armis disclosed that Amazon Echo and Google Home devices are vulnerable to attacks through the over-the-air BlueBorne Bluetooth vulnerability.
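
For readers curious about the signal-processing trick behind DolphinAttack, the sketch below amplitude-modulates a recorded command onto an ultrasonic carrier; a microphone's nonlinear front end then demodulates the baseband speech even though the transmitted sound is inaudible. The filenames, the 25 kHz carrier, and the 192 kHz sample rate are illustrative assumptions, and actually delivering such a signal requires an ultrasonic transducer that ordinary speakers cannot drive.

```python
# Illustrative sketch of DolphinAttack-style amplitude modulation.
# Assumes a mono WAV recording of a voice command; all parameters
# are hypothetical choices, not values from the original paper.
import numpy as np
from scipy.io import wavfile
from scipy.signal import resample

OUT_RATE = 192_000   # high sample rate needed to represent ultrasound
CARRIER_HZ = 25_000  # above the ~20 kHz limit of human hearing

rate, voice = wavfile.read("command.wav")   # e.g. a recorded wake phrase
voice = voice.astype(np.float64)
if voice.ndim > 1:                          # downmix stereo to mono
    voice = voice.mean(axis=1)
voice /= np.max(np.abs(voice))              # normalize to [-1, 1]

# Upsample the baseband command to the ultrasonic output rate.
baseband = resample(voice, int(len(voice) * OUT_RATE / rate))

# Classic AM: (1 + m * message) * carrier, with modulation depth m.
t = np.arange(len(baseband)) / OUT_RATE
modulated = (1.0 + 0.8 * baseband) * np.cos(2 * np.pi * CARRIER_HZ * t)

modulated /= np.max(np.abs(modulated))      # prevent clipping in int16 output
wavfile.write("ultrasound.wav", OUT_RATE, (modulated * 32767).astype(np.int16))
```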

Silent Attacks Against Voice Assistants