Voice assistants could be fooled by commands you can’t even hear

11:15, 14 May 2018. Source: msn.com



Voice-controlled assistants from Amazon, Apple and Google could be hijacked by ultrasonic audio commands that humans cannot hear.

Many people already consider voice assistants to be too invasive to let them listen in on conversations in their homes — but that’s not the only thing they should worry about. Researchers from the University of California, Berkeley, want you to know that they might also be vulnerable to attacks that you’ll never hear coming.

In a new paper (PDF), Nicholas Carlini and David Wagner describe a method to imperceptibly modify an audio file so as to deliver a secret command; the embedded instruction is inaudible to the human ear, so there’s no easy way of telling when Alexa might be asked by a hacker to add an item to your Amazon shopping cart, or worse.


It’s the audio equivalent of a known weakness in image recognition: computers can be fooled into identifying an airplane as a cat just by changing a few pixels of an image. Here, the imperceptible changes are made to sound instead, and humans cannot discern the hidden command.


To demonstrate this, Carlini hid the message, “OK Google, browse to evil.com,” in a seemingly innocuous sentence, as well as in a short clip of Verdi’s ‘Requiem,’ which fooled Mozilla’s open-source DeepSpeech transcription software.
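The core idea is that of an adversarial example: a perturbation small enough to be inaudible is added to an ordinary waveform, yet it changes what a speech-recognition model transcribes. The sketch below is not Carlini and Wagner’s actual attack (which optimizes the perturbation using gradients through the target model); it only illustrates, with an assumed toy signal and an assumed inaudibility budget `epsilon`, how such a perturbation is kept below human hearing:

```python
import numpy as np

def add_bounded_perturbation(x, delta, epsilon):
    """Clip a candidate perturbation to the inaudibility budget epsilon,
    then add it to the original waveform x (a conceptual sketch only)."""
    delta = np.clip(delta, -epsilon, epsilon)
    return x + delta, delta

# Toy 1-second "audio" signal at 16 kHz, sample values in [-1, 1].
sr = 16000
t = np.linspace(0, 1, sr, endpoint=False)
x = 0.5 * np.sin(2 * np.pi * 440 * t)       # a plain 440 Hz tone

rng = np.random.default_rng(0)
candidate = 0.1 * rng.standard_normal(sr)   # raw (far too loud) perturbation
epsilon = 0.002                             # assumed inaudibility budget

x_adv, delta = add_bounded_perturbation(x, candidate, epsilon)

# The perturbation never exceeds the budget, so the distortion relative
# to the original signal stays tiny (tens of decibels below the tone):
distortion_db = 20 * np.log10(np.max(np.abs(delta)) / np.max(np.abs(x)))
print(f"peak distortion: {distortion_db:.1f} dB")
```

In the real attack, `candidate` would be iteratively optimized so that `x_adv` transcribes as the attacker’s chosen phrase; the clipping step is what keeps the result indistinguishable from the original audio to a listener.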

Speaking to The New York Times, Carlini – who, in 2016, demonstrated how he and his team could embed commands in white noise played along with other audio to get voice-activated devices to do things like turn on airplane mode – said that while such attacks haven’t yet been reported, it’s possible that “malicious people already employ people to do what I do.”

Thanks for the cheerful thought, Nicholas.

There have been other (unfortunately successful) attempts to fool voice assistants, and there aren’t many ways to prevent such audio from being broadcast at people’s ‘smart’ devices. One method called DolphinAttack even muted the target phone before issuing inaudible commands, so the owner wouldn’t hear the device’s responses.

We need hardware makers and AI developers to tackle such subliminal messages, particularly for devices that don’t have screens to give users visual feedback and warnings about having received secret commands. In demonstrating what’s possible with this method, Carlini’s goal is to encourage companies to secure their products and services so users are protected from inaudible attacks.

Let’s hope Google, Amazon, Apple, and Microsoft are listening.

