SafeUM
Home Blog Services Download Help About Recharge
EN
RU

Axarhöfði 14, 110 Reykjavik, Iceland

Iceland - 2015
TOP Security!
15 Oct 2015

Hackers can silently control Siri and Google Now

Siri may be your personal assistant. But your voice is not the only one she listens to. As a group of French researchers have discovered, Siri also helpfully obeys the orders of any hacker who talks to her—even, in some cases, one who’s silently transmitting those commands via radio from as far as 16 feet away.

A pair of researchers at ANSSI, a French government agency devoted to information security, have shown that they can use radio waves to silently trigger voice commands on any Android phone or iPhone that has Google Now or Siri enabled, if it also has a pair of headphones with a microphone plugged into its jack.

Their clever hack uses those headphones’ cord as an antenna, exploiting its wire to convert surreptitious electromagnetic waves into electrical signals that appear to the phone’s operating system to be audio coming from the user’s microphone. Without speaking a word, a hacker could use that radio attack to tell Siri or Google Now to make calls and send texts, dial the hacker’s number to turn the phone into an eavesdropping device, send the phone’s browser to a malware site, or send spam and phishing messages via email, Facebook, or Twitter.
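At a high level, the mechanism the researchers exploit is classic AM envelope demodulation: a command modulated onto a radio carrier is picked up by the headphone cord and recovered as baseband audio by the phone's front end. The sketch below is illustrative only; the frequencies, modulation depth, and moving-average filter are arbitrary choices for the simulation, not values from the ANSSI paper:

```python
import numpy as np

fs = 200_000  # simulation sample rate, Hz
t = np.arange(0, 0.01, 1 / fs)

# stand-in for a voice command: a 1 kHz audio tone
audio = np.sin(2 * np.pi * 1_000 * t)

# amplitude-modulate it onto a carrier (scaled down to 50 kHz so
# the discrete-time simulation stays manageable)
carrier = np.sin(2 * np.pi * 50_000 * t)
transmitted = (1 + 0.5 * audio) * carrier

# crude envelope detection, standing in for the nonlinearity and
# filtering of an audio front end: rectify, then low-pass filter
rectified = np.abs(transmitted)
kernel = np.ones(50) / 50            # moving-average low-pass
recovered = np.convolve(rectified, kernel, mode="same")
recovered -= recovered.mean()        # drop the DC offset

# the recovered envelope tracks the original audio closely
corr = np.corrcoef(audio[100:-100], recovered[100:-100])[0, 1]
```

In this toy setup the correlation between the original tone and the recovered envelope comes out close to 1, which is the whole point of the attack: what arrives as radio leaves the front end looking like microphone audio.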

“The possibility of inducing parasitic signals on the audio front-end of voice-command-capable devices could raise critical security impacts,” the two French researchers, José Lopes Esteves and Chaouki Kasmi, write in a paper published by the IEEE. Or as Vincent Strubel, the director of their research group at ANSSI puts it more simply, “The sky is the limit here. Everything you can do through the voice interface you can do remotely and discreetly through electromagnetic waves.”

The researchers’ work, which was first presented at the Hack in Paris conference over the summer but received little notice outside of a few French websites, uses a relatively simple collection of equipment: It generates its electromagnetic waves with a laptop running the open-source software GNU Radio, a USRP software-defined radio, an amplifier, and an antenna. In its smallest form, which the researchers say could fit inside a backpack, their setup has a range of around six and a half feet. In a more powerful form that requires larger batteries and could only practically fit inside a car or van, the researchers say they could extend the attack’s range to more than 16 feet.

In a video demonstration of the attack, the researchers commandeer Google Now via radio on an Android smartphone and force the phone’s browser to visit the ANSSI website. (That experiment was performed inside a radio-wave-blocking Faraday cage, the researchers say, to abide by French regulations that forbid broadcasting certain electromagnetic frequencies. But Kasmi and Esteves say that the Faraday cage wasn’t necessary for the attack to work.)

The researchers’ silent voice command hack has some serious limitations: It only works on phones that have microphone-enabled headphones or earbuds plugged into them. Many Android phones don’t have Google Now enabled from their lockscreen, or have it set to only respond to commands when it recognizes the user’s voice. (On iPhones, however, Siri is enabled from the lockscreen by default, with no such voice identity feature.) Another limitation is that attentive victims would likely be able to see that the phone was receiving mysterious voice commands and cancel them before their mischief was complete.

Then again, the researchers contend that a hacker could hide the radio device inside a backpack in a crowded area and use it to transmit voice commands to all the surrounding phones, many of which might be vulnerable and hidden in victims’ pockets or purses. “You could imagine a bar or an airport where there are lots of people,” says Strubel. “Sending out some electromagnetic waves could cause a lot of smartphones to call a paid number and generate cash.”

Although the latest version of iOS now has a hands-free feature that allows iPhone owners to send voice commands merely by saying “Hey Siri,” Kasmi and Esteves say that their attack works on older versions of the operating system, too. iPhone headphones have long had a button on their cord that allows the user to enable Siri with a long press. By reverse engineering and spoofing the electrical signal of that button press, their radio attack can trigger Siri from the lockscreen without any interaction from the user. “It’s not mandatory to have an always-on voice interface,” says Kasmi. “It doesn’t make the phone more vulnerable, it just makes the attack less complex.”
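The triggering step described above can be thought of as a long-press classifier on the headset control line: the phone distinguishes a brief press (play/pause) from a sustained one (wake the assistant), and the spoofed electrical signal imitates the sustained press. A toy sketch of that distinction, where the 500 ms threshold is an illustrative assumption rather than Apple's documented timing:

```python
def classify_press(duration_ms, long_press_ms=500):
    """Toy model of headset-button handling. The long_press_ms
    threshold is an assumption for illustration, not Apple's spec:
    a brief press toggles playback, a sustained press wakes the
    voice assistant (the behavior the radio attack spoofs)."""
    if duration_ms >= long_press_ms:
        return "wake_assistant"
    return "play_pause"
```

Under this model, the attacker only needs to hold the spoofed button signal past the threshold to reach the voice interface, no matter what the user is doing with the phone.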

Of course, security-conscious smartphone users probably already know that leaving Siri or Google Now enabled on their phone’s lock screen represents a security risk. At least in Apple’s case, anyone who gets hands-on access to the device has long been able to use those voice command features to squeeze sensitive information out of the phone—from contacts to recent calls—or even hijack social media accounts. But the radio attack extends the range and stealth of that intrusion, making it all the more important for users to disable the voice command functions from their lock screen.

The ANSSI researchers say they’ve contacted Apple and Google about their work and recommended other fixes, too: They advise that better shielding on headphone cords would force attackers to use a higher-power radio signal, for instance, or that an electromagnetic sensor in the phone could block the attack. But they note that the attack could also be prevented in software, by letting users create their own custom “wake” words that launch Siri or Google Now, or by using voice recognition to block out strangers’ commands. Neither Google nor Apple has yet responded to inquiries about the ANSSI research.

Without the security features Kasmi and Esteves recommend, any smartphone’s voice features could represent a security liability—whether from an attacker with the phone in hand or one that’s hidden in the next room. “To use a phone’s keyboard you need to enter a PIN code. But the voice interface is listening all the time with no authentication,” says Strubel. “That’s the main issue here and the goal of this paper: to point out these failings in the security model.”

Tags:
hackers, Siri, information leaks, Google Now
Source:
Wired