What if your smartphone started making calls, sending text messages, and browsing malicious websites on the Internet, all without ever asking you?

This is no imagination: hackers can make it happen by hijacking your smartphone's personal assistant, like Siri or Google Now.

A team of security researchers from China's Zhejiang University has discovered a clever way to activate voice recognition systems without speaking a word, by exploiting a security weakness that is apparently common across all major voice assistants.

DolphinAttack (Demo): How It Works

Dubbed DolphinAttack, the attack technique works by feeding the AI assistants commands in ultrasonic frequencies, which are too high for humans to hear but are perfectly audible to the microphones on your smart devices.

With this technique, cybercriminals can "silently" whisper commands into your smartphone to hijack Siri or Alexa, forcing them to open malicious websites or even unlock your door if a smart lock is connected.

The attack works on every major voice recognition platform and affects every mobile operating system, including iOS and Android. So whether you own an iPhone, a Nexus, or a Samsung, your device is at risk.

The attack takes advantage of the fact that human ears generally can't hear sounds above 20 kHz, while the microphones on smart devices still pick up signals above that frequency.

So, to demonstrate DolphinAttack, the team first translated human voice commands into ultrasonic frequencies (over 20 kHz) and then simply played them back from a regular smartphone fitted with an amplifier, an ultrasonic transducer, and a battery, extra hardware that costs less than $3.
"DolphinAttack voice commands, though totally inaudible and therefore imperceptible to [a] human, can be received by the audio hardware of devices, and correctly understood by speech recognition systems," the researchers explain in their research paper [PDF].

DolphinAttack Makes Hacking Siri, Alexa & Google Now Easy

Since smartphones allow users to perform a broad range of operations via voice commands, such as dialling a phone number, sending short messages, opening a web page, and setting the phone to airplane mode, the researchers were able to order an iPhone to dial a specific number.

However, according to the researchers, an attacker can send inaudible voice commands to instruct a device to perform several malicious tasks including:

  • Visiting a malicious website—which can launch a drive-by-download attack or exploit the victim's device with 0-day vulnerabilities.
  • Spying—the attacker can instruct the victim's device to initiate outgoing video or phone calls, thereby getting access to the image and sound of device surroundings.
  • Injecting fake information—the attacker can instruct the victim's device to send fake text messages or emails, publish fake online posts, or add fake events to a calendar.
  • Denial of Service—the attacker can inject commands to turn on the 'airplane mode,' thereby disconnecting all wireless communications and taking the device offline.
  • Concealing attacks—since the screen display and voice feedback could expose the attacks, the attacker can decrease the odds by dimming the screen and lowering the volume to hide the attack.

Typically, the signal sent out by the researchers was between 25 and 39 kHz. As for range, the team managed to make the attack work from as far as 175 cm away, which is certainly practical.

What's scary? DolphinAttack works on just about everything, including Siri, Google Assistant, Samsung S Voice, Huawei HiVoice, Cortana, and Alexa, on devices such as smartphones, iPads, MacBooks, the Amazon Echo, and even an Audi Q3: 16 devices and 7 systems in total.

What's even worse? The inaudible voice commands can be accurately "interpreted by the SR [speech recognition] systems on all the tested hardware" and work even if the attacker has no direct access to your device and you have taken all the necessary security precautions.

How to Prevent DolphinAttack?

The team goes on to suggest that device manufacturers address this vulnerability with hardware alterations, such as microphones that suppress signals above 20 kHz, or by filtering out any voice command arriving at inaudible frequencies.
"A microphone shall be enhanced and designed to suppress any acoustic signals whose frequencies are in the ultrasound range. For instance, the microphone of iPhone 6 Plus can resist to inaudible voice commands well," the researchers say.
For end users, a quick way to prevent such attacks is to turn off voice assistant apps in settings until an official patch lands for your device.

How to disable Siri on iPhone, iPad, or iPod touch: Go to your iOS device's Settings → General → Accessibility → Home Button → Siri and then toggle Allow "Hey Siri" to off.

How to turn off Cortana: Open Cortana on your Windows PC, select the Notebook icon on the right side, click on Settings and then toggle "Hey Cortana" to off.

How to turn off Alexa on Amazon Echo: Simply press the microphone on/off button on the top of the unit. When off, the light will turn red and Echo will stop responding to your wake word until you turn it back on.

How to turn off Google Home: To mute Google Home's mics, press and hold its physical mute button located at the back of the unit.

The team will present their full research at the ACM Conference on Computer and Communications Security in Dallas, Texas next month.
