Undetectable Commands for Apple’s Siri and Amazon’s Alexa Raise Serious Security Risks

Researchers in the U.S. and China have discovered ways to send hidden commands to digital assistants—including Apple’s Siri, Amazon’s Alexa, and Google’s Assistant—that could have massive security implications.

Over the last two years, researchers in China and the United States have begun demonstrating that they can send hidden commands that are undetectable to the human ear to Apple’s Siri, Amazon’s Alexa and Google’s Assistant. Inside university labs, the researchers have been able to secretly activate the artificial intelligence systems on smartphones and smart speakers, making them dial phone numbers or open websites. In the wrong hands, the technology could be used to unlock doors, wire money or buy things online — simply with music playing over the radio.

This month, researchers at the University of California, Berkeley published a paper that went further, saying they could embed commands directly into recordings of music or spoken text. So while a human listener hears someone talking or an orchestra playing, Amazon’s Echo speaker might hear an instruction to add something to your shopping list.

“My assumption is that the malicious people already employ people to do what I do,” said Nicholas Carlini, a fifth-year Ph.D. student in computer security at U.C. Berkeley and one of the paper’s authors.

Last year, researchers at Princeton University and China’s Zhejiang University also found that voice-activated devices could be given orders using inaudible frequencies. The Zhejiang team dubbed the technique DolphinAttack.
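DolphinAttack works by amplitude-modulating an audible voice command onto an ultrasonic carrier; nonlinearities in a device’s microphone then demodulate the envelope back into the audible band, so the assistant “hears” the command while humans hear little or nothing. Below is a minimal pure-Python sketch of the modulation step only. The sample rate, carrier frequency, and modulation depth are illustrative assumptions, not values from the paper, and real attacks require ultrasonic-capable playback hardware.

```python
import math

SAMPLE_RATE = 192_000  # Hz; assumed high-rate hardware capable of ultrasonic playback
CARRIER_HZ = 25_000    # above the ~20 kHz ceiling of typical human hearing

def ultrasonic_am(voice_samples, sample_rate=SAMPLE_RATE, carrier_hz=CARRIER_HZ):
    """Amplitude-modulate a voice command onto an ultrasonic carrier.

    Standard AM: a DC offset keeps the envelope non-negative, so the
    microphone's nonlinear response can recover the original waveform.
    """
    out = []
    for n, s in enumerate(voice_samples):
        carrier = math.cos(2 * math.pi * carrier_hz * n / sample_rate)
        out.append((1.0 + 0.5 * s) * carrier * 0.5)  # 50% modulation depth (assumed)
    return out

# Toy "voice": a 400 Hz tone standing in for a spoken command.
voice = [math.sin(2 * math.pi * 400 * n / SAMPLE_RATE) for n in range(1000)]
modulated = ultrasonic_am(voice)
```

The output signal’s energy sits around 25 kHz, which is why a bystander hears nothing even though the envelope still carries the command.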

Amazon told The New York Times it has taken steps to ensure its speaker is secure. Google said its platform has features that mitigate such commands. And Apple noted an iPhone or iPad must be unlocked before Siri will open an app.

Still, there are several examples of companies taking advantage of weaknesses in the devices, from Burger King’s Google Home commercial to South Park‘s stunt with Alexa.

And the number of devices in consumers’ homes is on the rise. Digital assistants have been among the hottest gifts of the past two holiday seasons, and Amazon alone is expected to sell $10 billion worth of the devices by 2020.

Source: NY Times and Fortune

Why It’s Hot

It seems like every week we are posting something else about voice (Alexa, Google Home) and its emerging capabilities, or how brands are using them. As with any tech, there are concerns about how it will be used. I do wonder, though, whether there’s something positive here rather than just scary.

Spotify is testing a new voice search feature

Spotify is testing a voice search feature that lets users more quickly access their favorite artists, tracks, albums, and playlists. The feature, which appears to be based on a 2017 experiment involving a “driving mode,” has begun appearing inside the iOS app for a small number of users.

To access the new voice search feature, tap the magnifying glass icon at the center of the bottom row of tabs. If you’re part of the test, you’ll see a microphone icon inside a white bubble in the lower-right corner of the screen.

So far, voice control appears limited to finding music inside Spotify’s vast catalog. Ask it “Who are the Beatles?” and it will start playing a Beatles playlist without telling you anything about the band.

Why It’s Hot: This is a great step forward for navigation in an app that sometimes requires too much tapping and typing to get where you’re going.

Source: The Verge

dragon drive: jarvis for your car…

The wave of magical CES 2018 innovations has begun to roll in, and among those already announced is Nuance Communications’ “Dragon Drive” – an (extremely) artificially intelligent assistant for your car.

According to Digital Trends:

“By combining conversational artificial intelligence with a number of nonverbal cues, Dragon Drive helps you talk to your car as though you were talking to a person. For example, the AI platform now boasts gaze detection, which allows drivers to get information about and interact with objects and places outside of the car simply by looking at them and asking Dragon Drive for details. If you drive past a restaurant, you can simply focus your gaze at said establishment and say, “Call that restaurant,” or “How is that restaurant rated?” Dragon Drive provides a “meaningful, human-like response.”

Moreover, the platform enables better communication with a whole host of virtual assistants, including smart home devices and other popular AI platforms. In this way, Dragon Drive claims, drivers will be able to manage a host of tasks all from their cars, whether it’s setting their home heating system or transferring money between bank accounts.

Dragon Drive’s AI integration does not only apply to external factors, but to components within the car as well. For instance, if you ask the AI platform to find parking, Dragon Drive will take into consideration whether or not your windshield wipers are on to determine whether it ought to direct you to a covered parking area to avoid the rain. And if you tell Dragon Drive you’re cold, the system will automatically adjust the car’s climate (but only in your area, keeping other passengers comfortable).

Why It’s Hot:

Putting aside the question of how many AI assistants we might have in our connected future, what was really interesting to see was the integration of voice and eye-tracking biometrics. Between using your voice as your key (and to personalize settings for you and your passengers), the car reminding you of memories from locations you’re passing, and identifying stores, buildings, restaurants and other landmarks along your route with just a gaze, it’s amazing to think what the future holds as all the technologies we’ve only just seen emerging in recent years converge.

[More info]

Amazon’s Alexa may eventually serve up ads…maybe, maybe not?

It was only a matter of time, folks.

According to a report from CNBC, Amazon is in talks with brands and advertisers to include ads on the Echo via Alexa. The report says that Amazon is discussing these opportunities with Procter & Gamble and Clorox.

Just as ads found their way to the newspaper, the radio, the television, the internet, and even to our inbox and inside our apps, it only makes sense for advertisers to follow us to the next frontier of voice-powered AI.

There are two obvious paths to potentially advertising on Alexa.

The first is to let brands pay for placement when users are shopping through Alexa. For example, Procter & Gamble could pay for Bounty to be the first brand recommended when a user asks Alexa to purchase paper towels. Of course, these ads could be ultra-smart given the data Amazon already has about each individual user’s buying history.
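To make that first path concrete, here is a hypothetical sketch of how sponsored placement might reorder voice shopping results. The function name and brand list are invented for illustration; nothing here reflects Amazon’s actual ranking logic.

```python
def rank_recommendations(category_results, sponsored=None):
    """Order product recommendations for a voice response.

    A sponsored brand (if present in the results) is promoted to the
    first slot, which on a voice interface is often the only slot the
    user ever hears.
    """
    if sponsored and sponsored in category_results:
        rest = [b for b in category_results if b != sponsored]
        return [sponsored] + rest
    return list(category_results)

# Hypothetical example mirroring the article's Bounty scenario.
brands = ["Brawny", "Bounty", "Sparkle"]
ranked = rank_recommendations(brands, sponsored="Bounty")  # Bounty comes first
```

The point of the sketch is the single-slot dynamic: unlike a web results page, a voice answer surfaces one item, so whoever holds that slot effectively holds the whole impression.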

The second channel for advertising could come via Alexa Skills. For example, a skill that tells users movie showtimes could suggest buying tickets through Fandango.

Paid search ads via voice could be much more effective than the paid search ads you see on the web, such as Google’s. On the web, many have grown numb to sponsored results and can easily scroll past them to the organic ones. On a voice platform, it takes far more work to ‘scroll past’ the first result presented. Plus, depending on how Amazon presents paid results, it may be more difficult to distinguish them from organic results.

Amazon, however, has pushed back: a spokesperson responded to a request for comment with the same statement the company gave CNBC, that “there are no plans to add advertising to Alexa.” So this is just a rumor at the moment, but it would be far from shocking if ads eventually hit the Alexa platform.

Source: TechCrunch

Why It’s Hot

Regardless of whether this is real news right now, it’s still interesting to consider, and potentially inevitable. Brands are bound to want in on this expanding space — can the Amazons and Googles of the world hold them back? Should they?