DolphinAttack: The Most Famous Voice Assistant Siri And Alexa Vulnerability (Can Be Hacked)
A group of security researchers from Zhejiang University in China has demonstrated how several popular speech recognition systems can be controlled using ultrasound, via an attack method they call “DolphinAttack.”
Voice assistant apps are not as reliable as we thought. According to the Zhejiang University researchers, the most popular voice assistants, including Siri and Alexa, share a critical flaw. These assistants cover almost every mobile platform, including iOS and Android, so whether you carry an iPhone or a Nexus, your device is at risk.
Using the DolphinAttack method, the team translated ordinary audio commands into ultrasonic frequencies. These frequencies sit well above the range audible to the human ear, yet microphones and the software powering voice assistants can still pick them up. The translation process itself is remarkably simple, and through this technique it is possible to control a gadget with words spoken at frequencies we cannot hear.
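To make the mechanics concrete, here is a minimal sketch of the core idea: amplitude-modulating a recorded voice command onto an ultrasonic carrier, so the transmitted tone is inaudible while a microphone's nonlinearity can demodulate the envelope back into speech. This is an illustration under stated assumptions, not the researchers' actual tooling; the file names, carrier frequency, and modulation depth below are all hypothetical.

```python
# Hypothetical sketch of the amplitude-modulation step described above.
# Assumes a mono voice recording "command.wav"; the carrier frequency and
# modulation depth are illustrative values, not taken from the paper.
import numpy as np
from scipy.io import wavfile

CARRIER_HZ = 25_000   # ultrasonic carrier, above human hearing (~20 kHz)
MOD_DEPTH = 0.8       # modulation depth (assumed value)
OUT_RATE = 192_000    # output rate must comfortably exceed 2 * carrier

rate, voice = wavfile.read("command.wav")      # baseband voice command
voice = voice.astype(np.float64)
if voice.ndim > 1:                             # collapse stereo to mono
    voice = voice.mean(axis=1)
voice /= np.max(np.abs(voice))                 # normalize to [-1, 1]

# Upsample the voice signal to the high output rate (linear interpolation).
n_out = int(len(voice) * OUT_RATE / rate)
t = np.arange(n_out) / OUT_RATE
voice_hi = np.interp(t, np.arange(len(voice)) / rate, voice)

# Classic AM: the envelope carries the voice; the carrier tone is inaudible.
# A microphone's nonlinearity can demodulate this back to audible speech.
carrier = np.cos(2 * np.pi * CARRIER_HZ * t)
am = (1 + MOD_DEPTH * voice_hi) * carrier
am /= np.max(np.abs(am))

wavfile.write("ultrasonic_command.wav", OUT_RATE, (am * 32767).astype(np.int16))
```

The interesting design point is that the attack needs no custom receiver: the victim's own microphone hardware performs the demodulation as an unintended side effect.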
According to researchers:
“This paper aims at examining the feasibility of the attacks that are difficult to detect, and the paper is driven by the following key questions: Can voice commands be inaudible to human while still being audible to devices and intelligible to speech recognition systems? Can injecting a sequence of inaudible voice commands lead to unnoticed security breaches to the voice controllable systems? To answer these questions, we designed DolphinAttack.”
The researchers tested DolphinAttack on 16 devices running seven different speech recognition systems, and it succeeded in every case, at varying distances.
“By injecting a sequence of inaudible voice commands, we show a few proof-of-concept attacks, which include activating Siri to initiate a FaceTime call on iPhone, activating Google Now to switch the phone to the airplane mode, and even manipulating the navigation system in an Audi automobile. We propose hardware and software defense solutions,” the researchers write.
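The article does not spell out those defenses, but one simple software-side idea can be sketched: reject captures whose spectrum holds suspicious energy above the normal voice band, assuming the microphone path samples fast enough to see it. The cutoff frequency, energy threshold, and file name below are assumptions for illustration, not the researchers' actual method.

```python
# A minimal illustrative detector, not the paper's actual defense:
# flag a capture with unusual energy above the audible band, assuming
# the capture path samples fast enough to observe it (e.g. 96 kHz).
import numpy as np
from scipy.io import wavfile

CUTOFF_HZ = 20_000    # upper edge of normal human hearing (assumed threshold)
RATIO_LIMIT = 0.05    # fraction of total energy tolerated above the cutoff

def looks_ultrasonic(path: str) -> bool:
    rate, samples = wavfile.read(path)
    samples = samples.astype(np.float64)
    if samples.ndim > 1:                       # collapse stereo to mono
        samples = samples.mean(axis=1)
    spectrum = np.abs(np.fft.rfft(samples)) ** 2          # power spectrum
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / rate)
    high = spectrum[freqs >= CUTOFF_HZ].sum()
    total = spectrum.sum()
    return total > 0 and high / total > RATIO_LIMIT

if __name__ == "__main__":
    print(looks_ultrasonic("capture.wav"))
```

In practice many phone microphones low-pass the signal before software ever sees it, which is why the researchers also point to hardware-level fixes.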
The attack may not be entirely realistic, but it illustrates one of the many avenues attackers can use to try to compromise our devices.
What makes the attack scary is that it works on just about everything: Siri, Google Assistant, Samsung S Voice, and Alexa, on gadgets ranging from smartphones, iPads, and MacBooks to an Amazon Echo and even an Audi Q3 (16 devices and seven systems in total). Worse, “the inaudible voice commands can be correctly interpreted by the SR [speech recognition] systems on all the tested hardware.” Suffice to say, it works even if the attacker has no physical access to the device and the owner has taken basic security precautions.
The research team states that the commands could be anything from a simple “Hey Siri” to forcing an iPhone to open a malicious website, asking a Nexus 7 to call “1234567890”, or telling an Amazon Echo to “open the backdoor.” Attackers could even push an Audi Q3 to change its navigation system settings and add a new location. The consequences are diverse and genuinely dangerous, because adversaries can manipulate the voice assistant software in any way it supports. It is safe to say that these seemingly harmless, human-friendly interfaces have serious security repercussions.
So, for a hack to work, your assistant would have to be unlocked while you were not paying attention, a fairly unlikely scenario. However, if you are in a public place with your phone unlocked, a nearby attacker could plausibly gain access.