Activate This ‘Bracelet of Silence,’ and Alexa Can’t Eavesdrop

Last year, Ben Zhao decided to buy an Alexa-enabled Echo speaker for his Chicago home. Mr. Zhao just wanted a digital assistant to play music, but his wife, Heather Zheng, was not enthused. “She freaked out,” he said.

Ms. Zheng characterized her reaction differently. First she objected to having the device in their house, she said. Then, when Mr. Zhao put the Echo in a work space they shared, she made her position perfectly clear: “I said, ‘I don’t want that in the office. Please unplug it. I know the microphone is constantly on.’”

Mr. Zhao and Ms. Zheng are computer science professors at the University of Chicago, and they decided to channel their disagreement into something productive. With the help of an assistant professor, Pedro Lopes, they designed a piece of digital armor: a “bracelet of silence” that will jam the Echo or any other microphones in the vicinity from listening in on the wearer’s conversations.

The bracelet is like an anti-smartwatch, both in its cyberpunk aesthetic and in its purpose of defeating technology. A large, somewhat ungainly white cuff with spiky transducers, the bracelet has 24 speakers that emit ultrasonic signals when the wearer turns it on. The sound is imperceptible to most ears, with the possible exception of young people and dogs, but nearby microphones will detect the high-frequency sound instead of other noises.
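For intuition only: the researchers haven’t published their exact waveform here, and the real-world effect depends on microphone hardware demodulating strong ultrasound into audible noise. Still, the basic idea of a jamming signal confined to an ultrasonic band can be sketched in a few lines (the band, sample rate, and function names below are illustrative assumptions, not the team’s actual design):

```python
import numpy as np

SAMPLE_RATE = 96_000     # Hz; high enough to represent ultrasound cleanly
BAND = (24_000, 26_000)  # assumed ultrasonic band, above most human hearing

def jam_signal(duration_s: float, seed: int = 0) -> np.ndarray:
    """Synthesize a randomized sweep whose instantaneous frequency stays
    inside the ultrasonic band. Inaudible to people, but loud to mics."""
    rng = np.random.default_rng(seed)
    n = int(duration_s * SAMPLE_RATE)
    # Pick a random in-band frequency at every sample, then integrate
    # frequency into phase so the waveform is continuous.
    freqs = rng.uniform(*BAND, size=n)
    phase = 2 * np.pi * np.cumsum(freqs) / SAMPLE_RATE
    return np.sin(phase)

sig = jam_signal(0.5)  # half a second of "silence" for nearby microphones
```

Randomizing the frequency (rather than emitting a single pure tone) matters in practice: a fixed tone would be easy to filter back out of a recording.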

“It’s so easy to record these days,” Mr. Lopes said. “This is a useful defense. When you have something private to say, you can activate it in real time. When they play back the recording, the sound is going to be gone.”

During a phone interview, Mr. Lopes turned on the bracelet, resulting in static-like white noise for the listener on the other end.

As American homes are steadily outfitted with recording equipment, the surveillance state has taken on an air of domesticity. Google and Amazon have sold millions of Nest and Ring security cameras, while an estimated one in five American adults now owns a smart speaker. Knocking on someone’s door or chatting in someone’s kitchen now involves the distinct possibility of being recorded.

It all presents new questions of etiquette about whether and how to warn guests that their faces and words could end up on a tech company’s servers, or even in the hands of strangers.

By design, smart speakers have microphones that are always on, listening for so-called wake words like “Alexa,” “Hey, Siri,” or “O.K., Google.” Only after hearing that cue are they supposed to start recording. But contractors hired by device makers to review recordings for quality reasons report hearing clips that were most likely captured unintentionally, including drug deals and sex.

Two Northeastern University researchers, David Choffnes and Daniel Dubois, recently played 120 hours of television for an audience of smart speakers to see what activates the devices. They found that the machines woke up dozens of times and started recording after hearing phrases similar to their wake words.
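The failure mode the researchers observed — phrases that merely resemble a wake word still triggering a recording — can be illustrated with a toy text-similarity check. This is purely a sketch of the concept; real devices match acoustic features, not spellings, and the threshold and function names here are invented for illustration:

```python
import difflib

WAKE_WORDS = ["alexa", "hey siri", "ok google"]

def might_wake(phrase: str, threshold: float = 0.7) -> bool:
    """Return True if a phrase is close enough to a wake word.

    String similarity is a crude stand-in for the acoustic similarity a
    smart speaker actually computes, but the tradeoff is the same: a loose
    threshold catches sloppy pronunciations *and* innocent near-misses.
    """
    phrase = phrase.lower()
    return any(
        difflib.SequenceMatcher(None, phrase, w).ratio() >= threshold
        for w in WAKE_WORDS
    )
```

With a threshold loose enough to tolerate real-world speech, a name like “Alexis” lands close enough to “Alexa” to trigger — exactly the kind of near-miss that woke the speakers dozens of times during 120 hours of television.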

“People fear that these devices are constantly listening and recording you. They’re not,” Mr. Choffnes said. “But they do wake up and record you at times when they shouldn’t.”

Rick Osterloh, Google’s head of hardware, recently said homeowners should disclose the presence of smart speakers to their guests. “I would, and do, when someone enters into my home, and it’s probably something that the products themselves should try to indicate,” he told the BBC last year.

The “bracelet of silence” is not the first device invented by researchers to stuff up digital assistants’ ears. In 2018, two designers created Project Alias, an appendage that can be placed over a smart speaker to deafen it. But Ms. Zheng argues that a jammer should be portable to protect people as they move through different environments, given that you don’t always know where a microphone is lurking.

At this point, the bracelet is just a prototype. The researchers say that they could manufacture it for as little as $20, and that a handful of investors have asked them about commercializing it.

Source: NY Times

Why It’s Hot

Voice tech spawns voice protection tech. We can assume innovation to protect us from innovation is a trend worth following.

Should you get paid for your data? How much?

In a New York Times opinion piece, Jaron Lanier, a computer programmer and futurist, argues that our data is being robbed from us by social media companies and used for algorithmic advertisements, in what amounts to a “crazy behavioral manipulation scheme.”

His proposed solution is that we should be paid for our data. Services like social media would no longer be free, but individuals would be compensated by commissions on any purchases their data influences.

Why It’s Hot

As digital advertisers, we have a special window into the power (or powerlessness) of data in influencing behavior. Knowing what you know, which do you think is worth more: $5,000 or the value of all of the data you have accumulated up to this point?

Facebook’s New Facial Recognition Update + Facial Recognition Blocking Tech

Facebook is updating how users can opt in and out of facial recognition. The practice has been a hot topic online for a few years, and the change comes as Facebook faces a multimillion-dollar lawsuit over its facial recognition practices.

Mashable notes “The lawsuit dates back to 2015, but has been slowly progressing — and so far not in Facebook’s favor. The company recently lost an appeal in which it attempted to have the suit dismissed.”

Consumers were allowed to opt in and out of “tagging suggestions” in 2017, but were not told that the feature relied on facial recognition. Meanwhile, facial recognition is being used to target protesters in Hong Kong (and protesters have been attacking facial recognition cameras).

So what do you do in a world where facial recognition is no longer opt-in?

THESE SUPER COOL SHADES

The “phantom” shades reflect light back from infrared cameras but not normal visible light. From Mashable: “The frames are specifically designed to defeat 3D dot matrix face-mapping systems, which is basically what makes Apple’s Face ID work. They bounce infrared light back at its source, with the goal of preventing IR video cameras from getting a good image of your face — or potentially even registering your face as a face at all.”

Why It’s Hot

We are living in a facial recognition world and you are automatically opted into being a facial recognition girl… We’re seeing how facial recognition can be used maliciously in other countries, and simply by being online and on the streets, we are opted into a system we never agreed to. The glasses tech is cool, but it doesn’t speak to the greater issue of what is going on around us and how AI technology might affect us in the coming years.

The Foul Stench of Possible Identity Theft…

Smell of Data from Leanne Wijnsma on Vimeo.

Some obviously creative innovators have recently created the “Smell of Data” to alert you instantly when your personal information is at risk of being compromised while adventuring around the internet.

Per these geniuses –

“Smell of Data aims to give internet users moment-to-moment updates on whether their private information is at risk of being leaked…The Smell of Data is a new scent developed as an alert mechanism for a more instinctive data…Smell data? Beware of data leaks. They can lead to privacy violation, behavior control, and identity theft.”

“To utilize the Smell of Data, a scent dispenser is charged with the specially developed fragrance, and then connected to a smartphone, tablet, or computer via Wi-Fi. The device is able to detect when a paired system attempts to access an unprotected website on an unsecured network and will emit a pungent puff of the Smell of Data as a warning signal.”
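The creators haven’t published their detection logic, but the trigger they describe — a paired device visiting an unprotected site — boils down to a simple check. A minimal sketch, assuming the hypothetical function names below (in a real dispenser, the “emit scent” branch would fire the fragrance pump):

```python
from urllib.parse import urlparse

def is_unprotected(url: str) -> bool:
    """Flag URLs fetched without TLS: the 'unprotected website'
    condition the Smell of Data dispenser is said to warn about."""
    return urlparse(url).scheme != "https"

def check(url: str) -> str:
    # Stand-in for hardware: return a label instead of puffing fragrance.
    return "emit scent" if is_unprotected(url) else "ok"
```

The interesting part isn’t the check itself — any browser padlock does the same — but routing the warning through your nose instead of your eyes.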

Why It’s Pungent Hot:

It’s an interesting play on an old method of engaging people’s senses in order to condition behavior. While it’s obviously a bit silly, the point stands: very often we’re not thinking about how what we do digitally could lead to trouble later. Now there’s a Pavlovian way to get us to stop and think.