Turning the camera on Big Brother

The New York Times is analyzing the music candidates use at their rallies. Music is a powerful emotional signal, carrying a message deep into the emotional brain where we feel connection. So it is helpful to know how candidates are using this psychological messaging tool to reach voters.

Extrapolating meaning from data sets, such as the song tracks used in political campaigns, can bring to light information that would otherwise not emerge. The growing practice of using big data to understand (and manipulate) the world may be moving further into the hands of the public.

Why it’s hot: Knowledge is power. Whoever has the data and the processing power, has the knowledge and can learn things about the world that no one would have discovered otherwise. Primarily this power has been with brands and governments. But what if more of that power came into the hands of the people? This article points to a possible future where open-source data mining could help us learn things about governments and companies that could level the playing field in the war over territory in our collective consciousness.

If big data were in the hands of the people, what would we do with it? How would it affect our relationship with brands and products?

Source: NYTimes

Service connects tourists with locals for real-time advice in South Korea

Launched in July 2019, the Sidekick platform lets tourists visiting South Korea chat with locals and receive help and recommendations in real-time. It works with a user’s live chat platform of choice (LINE, WhatsApp, Messenger and WeChat) and provides access to local ‘sidekicks’, who provide tips on restaurants, shopping, etiquette and culture. Tourists are connected with either Korean, English or Japanese speakers who are available from 7 am to 5 pm. The service can be purchased as a one-day, three-day or five-day pass and prices start at USD 20.

Why it’s hot:

  • This points to the future of travel moving away from typical “touristy” things to more bespoke and personalized experiences.
  • It’s a media or channel evolution: from travel guides to channels to blogs to YouTube videos (10 best things to do in X city) to social media travel influencers – to a real-time chat service that lets you live like a local.
  • It also lets you navigate and move around confidently in a country where you aren’t familiar with the language and/or culture.

Source: Trendwatching

Good deals come to those who haggle with a bot


Flipkart, India’s biggest ecommerce retailer, created a voice-based experience enabling customers to haggle for a better deal.

Flipkart gave its online shopping experience a more traditional touch with Hagglebot, which used Google Assistant’s voice technology. When Flipkart shoppers used Google Assistant, it encouraged them to haggle down the prices of products using their voice.

Flipkart launched several limited-edition products available exclusively via the Hagglebot during its sales promotion. Each day, it released two new products during the sale and crowned the shopper who drove the hardest bargain the ‘Boss’. Whatever deal the ‘Boss’ secured then became the official Flipkart price of that product.

The Hagglebot was created with Google Zoo, the creative think-tank for agencies and brands. Before building the experience the team travelled to thirty bazaars across three cities to identify different bargaining strategies that were commonly used and then simulated them on Hagglebot. The Hagglebot worked with all devices that support Google Assistant, including Android and iOS phones, as well as the Google Home speaker.
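
For a sense of how a bargaining bot might respond turn by turn, here is a minimal, hypothetical concession strategy sketched in Python. The prices, thresholds, and decay rate are invented for illustration; Flipkart and Google Zoo have not published the Hagglebot’s actual logic.

```python
# Hypothetical sketch of a rule-based haggling strategy, loosely inspired by the
# bazaar tactics described above. All prices, thresholds, and the decay factor are
# invented for illustration; this is not Flipkart's actual Hagglebot logic.

def counter_offer(current_ask: float, user_offer: float, floor_price: float, round_num: int) -> float:
    """Return the bot's counter-offer for this round of haggling."""
    if user_offer >= current_ask:
        return user_offer  # no need to haggle, accept immediately
    if user_offer >= floor_price and round_num >= 3:
        return user_offer  # reward persistence: accept any offer above the floor after a few rounds
    # Concede a shrinking share of the gap between the current ask and the user's offer
    concession = (current_ask - max(user_offer, floor_price)) * (0.5 ** round_num)
    return max(floor_price, current_ask - concession)

if __name__ == "__main__":
    ask = 1000.0
    for rnd, offer in enumerate([400.0, 600.0, 750.0], start=1):
        ask = counter_offer(ask, offer, floor_price=700.0, round_num=rnd)
        print(f"Round {rnd}: user offered {offer:.0f}, bot counters at {ask:.0f}")
```

The point of the sketch is simply that the bot concedes a shrinking share of the gap each round and rewards persistence, which is roughly how the observed bazaar tactics were described.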

Flipkart’s total sales revenue through products offered on Hagglebot reached $12.23m. The experience also had an average engagement time of 6 minutes and 5 seconds, 200 times the Google Assistant average, making it Google Assistant’s most engaging experience to date.

Why it’s hot?
A great way to drive adoption of voice technology by merging it with a deeply rooted cultural behaviour.
Bargaining is a deep-rooted part of Indian culture, and the Hagglebot builds on that existing behaviour. It humanised transactions to make Indian consumers feel more at home when purchasing online and, in doing so, bridged the divide between old traditions and new digital experiences.


Source: Contagious

Move Over, Alexa

Voice command devices like Alexa and Siri enable humans to engage, operate, and interact with technology through the power of voice, but these technologies fail to account for the voiceless among us. Many people — including those suffering from neurodegenerative diseases, paralysis, or traumatic brain injuries — are unable to take advantage of such voice-user interface (VUI) devices. That’s where Facebook Reality Labs (FRL) comes in.


FRL has partnered with neuroscience professionals at UCSF to give a voice back to the voiceless by attempting to create the first non-invasive, wearable brain-computer interface (BCI) device for speech. This device would marry “the hands-free convenience and speed of voice with the discreteness of typing.” Although BCI technology is not new, the creation of BCI technology capable of converting imagined speech into text, without requiring implanted electrodes, would be.


In a recently successful—albeit limited—study, UCSF researchers demonstrated that brain activity (recorded while people speak) could be used to decode what people were saying into text on a computer screen in real-time. However, at this time, the algorithm can only decode a small set of words.

Although promising, such results are preliminary, and researchers have a long way to go until the power of this silent speech interface technology can be harnessed non-invasively and in wearable form. What is more, researchers believe this BCI technology “could one day be a powerful input for all-day wearable [augmented reality (AR)] glasses.”

Why it’s hot

Such a radical innovation would not only help those who can’t speak, it could alter how all people interact with today’s digital devices.

Source: https://tech.fb.com

Print Gets Targeted

Hearst is beginning to roll out MagMatch, a service that translates online behaviors into targeted print ads.

By tracking what readers are doing online, MagMatch will allow Hearst to create personalized ads for subscribers of its magazines. For example, if a Marie Claire subscriber’s online behaviors (searching, reading, clicking on ‘buy’ buttons) suggest an interest in a particular beauty product, Hearst could work with the brand of that product to deliver a targeted ad that appears in that reader’s next issue of Marie Claire.

For subscribed readers, that could look like an ad that addresses them by their name, which is the route that skincare company StriVectin took in the latest issue of Elle as the first brand to buy into the ad offering. The ad (shown above) includes a brief message from Elle and is addressed to the magazine subscriber alongside a picture of StriVectin spokeswoman Lauren Hutton.

Subscribers don’t even need to be logged onto the magazines’ sites for Hearst to capture their first-party data: the company anonymously matches their behavior using third parties.

Why it’s Hot:

Print has come under fire for its lack of targeting capabilities for decades, but will this new tool be enough? While it’s undeniably a step up from the print of the past, these new ads can only be targeted to subscribers, which is a dwindling community. Plus, the extensive lead time for print means people could be served ads for a product they were looking at months prior. For new product releases in specific categories (ex. beauty), however, this tool could be helpful.

Source: Contagious, Adweek


The Ethical Tightrope of Targeted Ads

I’m one of three people on MRM’s Programmatic team. Not a lot of people understand exactly what we do or how we do it, but usually what I tell people who ask is:

You know how you’ll have a dream about a specific product and then you wake up and see an ad for it? It’s my job to show you that ad.

On the Programmatic team we build campaigns with very specific parameters set to buy ad space on websites all across the internet, in real-time, without the need for setting up  private deals with those publishers. We utilize audience data (which we get from Google and about a million other data providers) to find the people most likely to engage with our clients’ ads. So, for the GRE we’ll target recent grads, webpages that contain the word “MBA” or “Career” (and thousands of related keywords), etc. For Latuda, we may want to target people who are up late at night playing video games and eating food, as these activities can go hand-in-hand with bipolar depression.

It gets much more granular than that “in the field.” This infinite complexity creates a gap between the Programmatic traders and the people seeing the ads. The general public doesn’t fully understand why they’re seeing ads that are so damn relevant to them, and that’s no accident. While we on the Programmatic team use our powers for good, the system is easy to abuse. As uncovered in the investigation into Russian election interference, Russia used programmatic channels to target very specific demographics and show them what was essentially propaganda disguised as ads and Facebook posts. This helped weaken some centrists’ more liberal views and harden conservative viewpoints, swaying voters toward the right. This isn’t all they did, but it’s what’s pertinent to this article.

To shrink this gap in understanding between advertiser and consumer (and thereby increase people’s awareness of what data is collected), the New York Times bought its own audience lists and served ads to people all over the internet. The goal wasn’t to get them to subscribe to NYTimes.com, but to show them what kind of data is out there on you and me. The Times’ ads looked like this:

[Image: New York Times “this ad thinks…” targeted ad example]

The live ads don’t have the blue text at the bottom; it’s included here to show how we use disparate data points to uncover larger psychological and sociological trends. The ads can get even more drilled down, like this one:

[Image: another New York Times targeted ad example]

Each of these blue explanations is derived from audience data that’s bought and sold billions of times every day on the Open Exchange, which can be thought of as an eBay for ads all over the internet. The New York Times bought ad space and audience data from data providers like Acxiom (owned by our own parent company, IPG). Here’s an example of what Acxiom (and your friendly neighborhood Programmatic team) does:

Acxiom…makes predictions by comparing specific people to a much larger group. For example, it might predict whether you’ll buy an S.U.V. by looking at people who did buy S.U.V.s, then checking whether you live in a similar region, or whether you’re around the same age and share similar interests. If you’re similar, they’ll put you in a bucket that says you’re highly likely to buy an S.U.V.

For us, a lot of that highly personal audience data is anonymized (so no, I don’t know if you personally are about to buy an SUV even though I can show you SUV ads).
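
To make that bucketing idea concrete, here’s a tiny, hypothetical sketch of lookalike scoring in Python. The attributes, weights, and threshold are made up; Acxiom’s real models are far larger and proprietary.

```python
# A minimal sketch of lookalike "bucketing" along the lines described above: compare an
# anonymized profile to the average profile of known S.U.V. buyers and flag it if it is
# similar enough. The features, values, and threshold are all invented for illustration.

from math import sqrt

def cosine_similarity(a: dict, b: dict) -> float:
    keys = set(a) | set(b)
    dot = sum(a.get(k, 0.0) * b.get(k, 0.0) for k in keys)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# Average (normalized) profile of people who actually bought S.U.V.s (stand-in values)
suv_buyer_profile = {"age_35_50": 0.8, "suburban_zip": 0.9, "outdoor_interest": 0.7, "has_kids": 0.6}

def in_suv_bucket(profile: dict, threshold: float = 0.85) -> bool:
    """Put a profile in the 'likely S.U.V. buyer' bucket if it looks like past buyers."""
    return cosine_similarity(profile, suv_buyer_profile) >= threshold

if __name__ == "__main__":
    me = {"age_35_50": 1.0, "suburban_zip": 1.0, "outdoor_interest": 0.5, "has_kids": 1.0}
    print("Likely S.U.V. buyer:", in_suv_bucket(me))
```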

In the title, I refer to my own industry, and indeed my own job, as an “ethical tightrope.” The reason behind this is what makes this subject so hot. At MRM we use our data to inspire people to get their Master’s degrees (GRE), come to America (TOEFL), get life insurance (AAA), and improve their mental health (Latuda). We target the people who can be helped most by our ads not just because it makes our KPIs look better, but because that’s the entire point of advertising: showing people how they can improve their lives. What the NYTimes did that’s so hot is show the public how easily we can find out personal information about them based on seemingly innocuous points of data. And this is just the beginning.

You might not be able to fully stop publishers from getting your data, but at least you now know a little more about how it works, and why Programmatic is set to become a $69 billion industry by next year.

If that’s not hot as hell, I don’t know what is!


Source

Google launches ‘Live View’ AR walking directions for Google Maps

Google is launching a beta of its augmented reality walking directions feature for Google Maps.

Originally revealed earlier this year, Google Maps’ augmented reality feature has been available in an early alpha to both Google Pixel users and Google Maps Local Guides, but starting today it’s rolling out to everyone.

Just tap on any location nearby in Maps, tap the “Directions” button, navigate to “Walking,” then tap “Live View,” which should appear near the bottom of the screen.

The Live View feature isn’t designed with the idea that you’ll hold up your phone continually as you walk — instead, it provides quick, easy and super-useful orientation by showing you arrows and big, readable street markers overlaid on the real scene in front of you. That makes it much, much easier to orient yourself when traveling in unfamiliar territory.

Source: TechCrunch

Why It’s Hot

Seems like a good use of AR for actual utility and building on existing ecosystem.


Your Brain, and how we will tap into it.

How easy will journey/relationship maps be with this new Facebook mind reader?

Facebook is funding and assisting in a University of California San Francisco study aimed at giving people with serious brain injuries the chance to type words with only their minds. The research is showing real results and promise. Facebook hopes to use the underlying technology as a basis for a new brain-computer input mode for augmented-reality glasses that you wear all day.

If Facebook’s researchers can get their headset to accurately detect which neurons are firing, they may be able to use algorithms based on the UCSF research to map neurons to specific letters, and then to the words the user is thinking of.

Facebook envisions AR glasses that read your thoughts, and it isn’t just a dream.

This is not the first time someone has tried to map the brain and how we process speech, though! Here is a podcast about it -> Freakonomics podcast on how your brain tells stories.

And here is a cool interactive 3-D semantic map that the Gallant Lab created from the brain map the podcast above details. Gallant Lab 3D Brain

Why it’s hot.

Besides being a cool look into what your brain has been doing in your skull, effortlessly, ever since you were born?

It brings the cliché “living in your own world” to reality. And it raises so many questions about the future, about what this means for our brands, and about language and communication in general.

Weightless living, only connecting.

Questions: How much more personal does our messaging have to be to match this sort of personal 24/7 virtual reality? In how many ways do (or can) we understand language and meaning? What comes first, the language or the meaning? If you think it’s meaning, then what happens when you hear a word you don’t know? If you think it’s language, what did you naturally feel for a caretaker before you knew the word for it? What would life be like if we could communicate via telepathy? Can we communicate solely via meaning? If we could, would there ever be arguments? Will there be group telepathy meetings?

And many more.


Apple and New Museum launch AR art tours

A new way for people to experience the city!

A new way for artists to engage the public!

A new way to think about experiencing space!

Brought to you by Apple! Apple’s brand and value proposition permeates this entire experience.

Why it’s hot

Apple is positioning itself as a brand that can bring a new magical realm to life. As we work out the ways in which AR will play a role in our lives, this project sells AR in a surprising and fun way, perhaps warming people up to the idea that a life lived with a layer of AR mapped over the physical world would be desirable.

Source: Dezeen

Make getting drunk great again

British data science company DataSparQ has developed facial recognition-based AI technology to prevent entitled bros from cutting the line at bars. This “technology puts customers in an ‘intelligently virtual’ queue, letting bar staff know who really was next” and who’s cutting the line.

“The system works by displaying a live video of everyone queuing on a screen above the bar. A number appears above each customer’s head — which represents their place in the queue — and gives them an estimated wait time until they get served. Bar staff will know exactly who’s next, helping bars and pubs to maximise their ordering efficiency and to keep the drinks flowing.”

Story on Engadget

Why it’s Hot

Using AI to help solve these types of trifling irritations is better than having to tolerate other people’s sense of entitlement, though it also highlights the need to police rude behavior through something other than raising your kids well.

The Closest Drone to Bike Delivery is Here

The newest entrant into the drone food delivery space is using slower speed to gain a competitive edge. The Rev-1 by Refraction AI is unique in that, operating at a maximum of 15 miles per hour, it can travel both on the road and in bike lanes, giving it more flexibility to avoid causing traffic. The company’s vision is for the drone to emulate the patterns of a cyclist.

The vehicle is currently being tested in Ann Arbor, Michigan, where it delivers food from local restaurants in a range of 0.5 to 2.5 miles. Its 6 cubic feet of space is enough to hold around four grocery bags.

Another competitive advantage is the Rev-1’s lower cost of $5,000. Instead of using lidar sensors like most other drones, it uses multiple cameras in combination with ultrasound to see and navigate.

Why It’s Hot

The smaller, slower drone is well-positioned for local deliveries, with a high potential to help solve the last-mile delivery challenge in a cost-efficient manner.

Source

Don’t hold the phone

Soli Pixel 4 Sensors

For the past five years, our Advanced Technology and Projects team (ATAP) has been working on Soli, a motion-sensing radar. Radar, of course, is the same technology that has been used for decades to detect planes and other large objects. We’ve developed a miniature version located at the top of Pixel 4 that senses small motions around the phone, combining unique software algorithms with the advanced hardware sensor, so it can recognize gestures and detect when you’re nearby.

Pixel 4 will be the first device with Soli, powering our new Motion Sense features to allow you to skip songs, snooze alarms, and silence phone calls, just by waving your hand. These capabilities are just the start, and just as Pixels get better over time, Motion Sense will evolve as well.

Why it’s hot?
The beginning of the end of touchy-feely devices.
How can we bring the insights that inspire our teams to create ideas using Project Soli?


Smart Diapers – it’s about more than just poop!

Pampers has announced a new product called Lumi by Pampers, a “connected care system” to monitor your baby. The package includes a special “smart” diaper, which tracks your baby’s pee and sleep patterns, a mobile app, and a Logitech video monitor. The one thing it doesn’t track? Poop.


Pricing has yet to be announced, but since the diapers are disposable, the system is likely to get expensive. The bigger question is why, especially since this tracker tracks everything except your child’s poop patterns. This reflects a bigger trend in the diaper and baby industry overall: getting “smart” keeps companies and products relevant, and as people start families later and have fewer babies, Pampers and other big diaper brands (Huggies) are trying to maintain their bottom lines.

Why it’s hot:

In addition to the “smart” revolution we’re currently in the midst of, these types of innovations and new utilities don’t always come naturally to every brand. It’s interesting to see how the diaper industry is trying to find its way. We’re also seeing this challenge with Enfamil, which is trying to partner with companies to show its commitment to both babies and moms — while not every baby needs this type of monitoring, it could be an interesting partnership opportunity for the brand.

Article source: Mashable
Additional product links: Pampers

Retail wants a Minority Report for returns

In what now seems inevitable, an online fashion retailer in India owned by an e-commerce startup that’s backed by Walmart is doing research with Deep Neural Networks to predict which items a buyer will return before they buy the item.

With this knowledge, they’ll be better able to predict their returns costs, but more interestingly, they’ll be able to incentivize shoppers to NOT return as much, using both loss and gain offers related to items in one’s cart.

The nuts and bolts of it: the AI will assign you a score based on what it determines your risk of returning a specific item to be. This could draw on your returns history, as well as less obvious data points, such as your search and shopping patterns elsewhere online, your credit score, and predictions about your size and fit based on aggregated data about other people.

Then it will treat you differently based on that assessment. If you’re put in a high risk category, you may pay more for shipping, or you may be offered a discount in order to accept a no-returns policy tailored just for you. It’s like car insurance for those under 25, but on hyper-drive. If you fit a certain demo, you may start paying more for everything.
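
As a rough illustration of how a score-then-treat system like this could work, here’s a hedged Python sketch. The signals, weights, and offer tiers are invented; the retailer reportedly uses deep neural networks trained on real data, not hand-tuned rules like these.

```python
# Hypothetical sketch of return-risk scoring and tiered treatment. The signals, weights,
# and offers below are invented for illustration; a production system would learn these
# from data (per the article, with deep neural networks) rather than hard-code them.

def return_risk(shopper: dict, item: dict) -> float:
    """Score between 0 and 1: estimated likelihood this shopper returns this item."""
    score = 0.3 * shopper.get("past_return_rate", 0.0)
    score += 0.3 * (1.0 if item.get("fit_sensitive") else 0.0)
    score += 0.2 * shopper.get("size_uncertainty", 0.0)      # e.g. inferred from browsing patterns
    score += 0.2 * (1.0 - shopper.get("brand_familiarity", 0.0))
    return min(score, 1.0)

def offer_for(risk: float) -> str:
    """Map the risk score to a differential offer at checkout."""
    if risk > 0.7:
        return "5% discount if you accept a no-returns policy"
    if risk > 0.4:
        return "standard shipping fee applies"
    return "free shipping and free returns"

if __name__ == "__main__":
    shopper = {"past_return_rate": 0.6, "size_uncertainty": 0.8, "brand_familiarity": 0.2}
    item = {"fit_sensitive": True}
    r = return_risk(shopper, item)
    print(f"risk={r:.2f} -> {offer_for(r)}")
```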

Preliminary tests have shown promise in reducing return rates.

So many questions:

Is this a good idea from a brand perspective? If this becomes a trend, will retailers with cheap capital that can afford high-returns volume smear this practice as a way to gain market share?

Will this drive more people to better protect their data and “hide” themselves online? We might be OK with being fed targeted ads based on our data, but what happens when your data footprint and demo makes that jacket you wanted cost more?

Will this encourage more people to shop at brick and mortar stores to sidestep retail’s big brother? Or will brick and mortar stores find a way to follow suit?

How much might this information flow back up the supply chain, to product design, even?

Why it’s hot

Returns are expensive for retailers. They’re also bad for the environment, as many returns are just sent to the landfill, not to mention the carbon emissions from sending them back.

So, many retailers are scrambling to find the balance between reducing friction in the buying process by offering easy returns, on the one hand, and reducing the amount of actual returns, on the other.

There’s been talk of Amazon using predictive models to ship you stuff without you ever “buying” it. You return what you don’t want, and it eventually learns your preferences to the point where you just receive a box of stuff at intervals and money is extracted from your bank account. This might also reduce fossil fuel use.

How precise can these predictive models get? And how might people be able to thwart them? Is there a non-dystopian way to reduce returns?

Source: ZDNet

A monkey has been able to control a computer with his brain


[Images: Neuralink graphic; the N1 sensor; the N1 array in action]

Neuralink, the startup Elon Musk founded in 2017, is working on technology based around “threads,” which it says can be implanted in human brains with much less potential impact on the surrounding brain tissue than what’s currently used for today’s brain-computer interfaces. “Most people don’t realize, we can solve that with a chip,” Musk said to kick off Neuralink’s event, talking about some of the brain disorders and issues the company hopes to solve.

Musk also said that, long-term, Neuralink really is about figuring out a way to “achieve a sort of symbiosis with artificial intelligence.” He went on to say, “This is not a mandatory thing. This is something you can choose to have if you want.”

For now, however, the aim is medical, and the plan is to use a robot Neuralink has created, which operates somewhat like a “sewing machine,” to implant these threads, which are incredibly thin (between 4 and 6 μm, about one-third the diameter of the thinnest human hair), deep within a person’s brain tissue, where they will be capable of performing both read and write operations at very high data volume.

These probes are incredibly fine, and far too small to insert by human hand. Neuralink has developed a robot that can stitch the probes in through an incision. It’s initially cut to two millimeters, then dilated to eight millimeters, placed in and then glued shut. The surgery can take less than an hour.

No wires poking out of your head
It uses an iPhone app to interface with the neural link, with a simple interface that trains people how to use it. “It basically bluetooths to your phone,” Musk said.

Is there going to be a brain app store? Will we have ads in our brain?
“Conceivably there could be some kind of app store thing in the future,” Musk said. While ads on phones are mildly annoying, ads in the brain could be a disaster waiting to happen.

Why it’s hot?
A.I.: you won’t be able to beat it, so join it
Interfacing our brains with machines may save us from an artificial intelligence doomsday scenario. According to Elon Musk, if we want to avoid becoming the equivalent of primates in an AI-dominated world, connecting our minds to computing capabilities is a solution that needs to be explored.

“This is going to sound pretty weird, but [we want to] achieve a symbiosis with artificial intelligence,” Musk said. “This is not a mandatory thing! This is a thing that you can choose to have if you want. I think this is going to be something really important at a civilization-scale level. I’ve said a lot about A.I. over the years, but I think even in a benign A.I. scenario we will be left behind.”

Think about the kind of “straight from the brain data” we would have at our disposal and how will we use it?


Is FaceApp a Giant Russian Conspiracy… or Just an App with a Really Severe Privacy Policy?

FaceApp has had its time in the sun over the past two years. Don’t you remember famous hits like “what you look like as a woman/man” or the very un-PC “what you look like as another race”?

This week our entire feeds are filled with FaceApp photos of what our friends would look like old, as well as warnings about how the tech is Russian. In our friends’ defense, the tech is VERY eerie.

https://twitter.com/yashar/status/1151311662592532480

Yes, the FaceApp tech has a very scary privacy/end user agreement.

But the best hot take of all comes from VICE… 

It’s not that the app being Russian is bad; it’s that we allow apps to have this much data about us at all…

“Extracting data from unsuspecting users, selling and sharing that data god-knows-where, and justifying it by providing users unreadable privacy policies is a near-universal practice. It transcends Cold War phobias. It’s not Russian. It’s not American. It’s a fundamentally capitalist practice. Companies can only provide free apps and profit if they scrape and share data from the people that use it.”

WHY IT’S HOT

We allow apps like Facebook, Instagram and Snapchat to access much deeper levels of our data than they need, but it takes a true Russian scare to call us to action on this issue.

Veloretti Bikes courting car owners in Paris

Paris is Europe’s most polluted capital city. To prevent people from dying of particulate pollution, 2.7 million high-emissions cars are restricted from entering the city on weekdays — with hefty fines for noncompliance. If you work in the city, but can’t afford a new low-emissions car, this is a huge problem. You need to get into Paris, and may in theory also want to curb your emissions, but that’s not your main concern — you need to get to work! So what can you do? You’ll ride the train even though it’s a serious downgrade from your car. You might consider a bike, but making the switch to commuting by bike would require more of a nudge because it entails a bigger change in your lifestyle.

Amsterdam-based Veloretti bikes saw this as an opportunity to give car owners the nudge they needed to make that lifestyle change. They rode the wave of interest in clean mobility and sustainable urban transport during European Mobility Week 2018 by offering personalized bike discounts to 5 million Parisian car owners based on their car’s emissions ratings. This positioned the brand as not only helping car-owners, but helping the city itself solve its pollution problems.

The brand plugged the public database of license plates into a Shopify script, converting plates into coupon codes, which users could enter on Veloretti’s site. This gave Veloretti emissions information on a prospective bike-buyer’s car, which was used to automatically calculate a personalized discount at the POS. The worse the emissions score of your car, the deeper discount you got for a new Veloretti bike.
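
Mechanically, the whole thing can be as simple as a lookup plus a discount table. Here’s a hypothetical Python sketch of that plate-to-discount step; the registry, emissions classes, and tiers are stand-ins, not Veloretti’s actual Shopify script.

```python
# Sketch of the plate-to-discount mechanic described above, assuming a lookup that returns
# an emissions class for a license plate. The lookup table, class values, and discount tiers
# are illustrative assumptions only.

# Worse emissions class -> bigger discount on a new bike
DISCOUNT_BY_EMISSIONS_CLASS = {1: 0.05, 2: 0.10, 3: 0.15, 4: 0.20, 5: 0.25}

# Stand-in for the public registration database keyed by license plate
PLATE_REGISTRY = {"AB-123-CD": 4, "EF-456-GH": 1}

def discount_for_plate(plate: str) -> float:
    """Convert a license plate into a personalized discount based on the car's emissions class."""
    emissions_class = PLATE_REGISTRY.get(plate.upper())
    if emissions_class is None:
        return 0.0  # unknown plate: no personalized discount
    return DISCOUNT_BY_EMISSIONS_CLASS.get(emissions_class, 0.0)

if __name__ == "__main__":
    print(f"AB-123-CD gets {discount_for_plate('ab-123-cd'):.0%} off")
```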

At a moment when both pollution and awareness of the need for clean mobility were at their peak in the city, drivers saw their car’s negative environmental impact paired with a commensurate discount on a more sustainable transportation option.

Why it’s hot:

1. The license plate discount is only revealed after the user has placed a bike in their online cart. Commitment to purchase is strengthened as the user sees their emissions score and the resulting discount.

2. Positioning their brand as a solution to pressures from macro forces and social trends (climate change, pollution, fines for driving in Paris, Mobility Week) at the time when awareness of these pressures was at its peak.

3. Highlighting a pain point with a competing product and immediately flipping it into a tangible financial benefit for their product — at the POS.

Read more: Contagious I/O

From Virtual Trees to Living Forests

Ant Forest is an app-based game that is sweeping across China. The game rewards users with green energy points for choosing low-carbon activities like taking public transportation or using less plastic. Once players have earned enough green energy, they can plant a virtual tree in Ant Forest. For every tree planted in the virtual game, a real tree is planted in rural China. The game’s creator says one hundred million live trees have been planted so far.

Link to video

Why it’s hot: An idea that turns purpose-driven initiatives and social change into real-life, game-like gratification, the kind of mechanic that should resonate with new generations of consumers.

Source

Can ‘Big Data’ Help Fight Big Fires? Firefighters Are Betting on It

As out-of-control wildfires in the West grow more frequent and more intense, fire departments in Southern California are looking to big data and artificial intelligence to enhance the way they respond to these disasters.

The marriage of computing, brawn and speed, they hope, may help save lives.

For about 18 months the Los Angeles fire department has been testing a program developed by the WiFire Lab at the San Diego Supercomputer Center that makes fast predictions about where active fires will spread next. The program, known as FireMap, pulls together real-time information about topography, flammable materials and weather conditions, among other variables, from giant government data sets and on-the-ground sensors.

When firefighters across the city are dispatched to respond to brush fires, the department’s leaders at headquarters now run the WiFire program as part of their initial protocol. Then, WiFire’s servers at the San Diego Supercomputer Center in La Jolla crunch the numbers, and the program turns out a predictive map of the fire’s expected trajectory. Those maps can then be transmitted electronically from headquarters to incident commanders on the ground.

The program can make sophisticated calculations in minutes that would take hours to run manually, said Ilkay Altintas, the chief data science officer at the San Diego Supercomputer Center.
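
To picture what “predicting where the fire spreads next” means computationally, here’s a deliberately toy Python sketch: a tiny cellular automaton that spreads fire into fueled cells, nudged by wind. WiFire’s actual model is far more sophisticated, so treat everything here (the grid, the wind handling, the step count) as an illustrative assumption.

```python
# Toy illustration of the kinds of inputs FireMap combines (fuel/terrain, wind, an active
# fire perimeter) and the kind of output it produces (cells predicted to burn next).
# This is a simple cellular-automaton sketch, not the WiFire Lab's actual model.

def predict_spread(fuel, burning, wind=(0, 1), steps=3):
    """Spread fire step by step into fuel cells adjacent to burning cells, biased by wind."""
    rows, cols = len(fuel), len(fuel[0])
    burning = {tuple(cell) for cell in burning}
    for _ in range(steps):
        new_burning = set(burning)
        for r, c in burning:
            neighbors = [(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1),
                         (r + wind[0], c + wind[1])]  # extra reach downwind
            for nr, nc in neighbors:
                if 0 <= nr < rows and 0 <= nc < cols and fuel[nr][nc] > 0:
                    new_burning.add((nr, nc))
        burning = new_burning
    return burning

if __name__ == "__main__":
    fuel = [[1, 1, 0, 1],
            [1, 1, 1, 1],
            [0, 1, 1, 1]]          # 0 = bare ground, 1 = flammable vegetation
    perimeter = [(1, 0)]           # currently burning cell
    print(sorted(predict_spread(fuel, perimeter, wind=(0, 1), steps=2)))
```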

Source: NYTimes

Why It’s Hot

Good example of data being put to life-saving use.

Try On a New Lipstick… On YouTube

For those of us who go down YouTube makeup tutorial rabbit holes (like myself), it’s easy to get discouraged when you don’t have the color to try the look for yourself… and that’s half the point of watching the video.

Well, YouTube has a solve for that (and for makeup brands who want to sell product). Try on while you watch!

“The feature is currently in the very early stages of development — alpha testing — and is being offered to YouTube creators through Google’s  in-house branded content program, FameBit. Through this program, brands are connected with YouTube influencers who market their products through paid sponsorships.”

YouTube has already found that 30% of viewers chose to try the experience when it was available on the iOS app, and those who tried it spent an average of more than 80 seconds engaging with the tool.

YouTube’s new AR Beauty Try-On lets viewers virtually try on makeup while watching video reviews

Why it’s hot?

A mix of AR and ecommerce! This beauty try-on gets over a big makeup hurdle. It’s not totally new, though: Sephora has done something similar on its website, but with a much harder UI. This new YouTube integration should score Google some extra referral cash, and users entering buy pages will be much more primed to purchase.

A New Purpose for the Mannequin Challenge

Back in 2016, the “Mannequin Challenge” took over the Internet, with everyone from Hillary Clinton to Beyoncé posting videos of themselves standing still in various scenarios. Google is now using these videos to train their AI.

One of the top challenges with AI is teaching machines how to move through unfamiliar surroundings. Training machines to interpret 2D videos as 3D scenes helps them understand the depth of a 2D image. Google has decided to leverage the thousands of YouTube videos of the Mannequin Challenge as a data source for teaching machines depth, since the videos show people posing from all different angles.

Why It’s Hot

These learnings will be particularly useful in AI for self-driving cars.

Source: https://www.technologyreview.com/f/613888/if-you-did-the-mannequin-challenge-you-are-now-advancing-robotics-research/

Facebook announces new cryptocurrency

This week, Facebook revealed their plan to create Calibra, an alternative financial services system that will rely on Libra, its own cryptocurrency powered by blockchain technology. Facebook is planning to launch Calibra’s first product by the first half of 2020 – a digital wallet app that will also be built into WhatsApp and Messenger, allowing users to buy things and send money.

But how will this work? In a nutshell, people will be able to cash in local currency at local exchange points, get Libra, spend it like normal money (without high transaction fees and without revealing their identity), and then cash out whenever they want.

To protect users’ privacy, Calibra will handle all crypto dealings and store payments data. As a result, users’ data from Libra payments will never mix with their Facebook data and will not be used for ad targeting.

According to Facebook, Libra is meant to address the challenges of global financial services and promote financial inclusion. For example, today about 1.7 billion adults remain without access to a bank account, and $50 billion is lost annually to exploitative remittance service charges. With Libra, people will be able to send and receive money at low to no cost, small businesses will be able to accept digital payments without credit card fees, and financial services overall will be more accessible.

However, despite these potential benefits, Facebook’s venture into the financial services industry has raised some concerns. People are questioning Facebook’s motives as well as the usefulness, stability and transparency of cryptocurrencies. Furthermore, given Facebook’s troubled history with privacy breaches, its commitment to protecting user-data and privacy is under scrutiny.

Why it’s hot: 

This is the first time a “mainstream” company has attempted to get involved in the world of cryptocurrencies and, if all goes to plan, this new digital currency could fundamentally change global financial systems forever.

Sources: Facebook, TechCrunch

“Alexa, am I having a heart attack?”

Almost 500,000 Americans die each year from cardiac arrest, but now an unlikely new tool may help cut that number. Researchers at the University of Washington have figured out how to turn a smart speaker into a cardiac monitoring system. That’s right, in the not-too-distant future you may be able to ask Siri if you’re having a heart attack—even if you’re not touching the device.

Because smart speakers are always passively listening, anticipating being called into action with a “Hey Google” or “Alexa!”, they are the perfect device for listening for changes in breathing. So if someone starts gasping and making so-called “agonal breathing” sounds (add that to your Scrabble repertoire), the smart speaker can call for help. Agonal breathing is described by co-author Dr. Jacob Sunshine as “a sort of a guttural gasping noise” that is so unique to cardiac arrest that it makes “a good audio biomarker.” According to a press release, about 50% of people who experience cardiac arrest have agonal breathing, and since Alexa and Google are always listening, they can be taught to monitor for its distinctive sound.

On average, the proof-of-concept tool detected agonal breathing events 97% of the time from up to 20 feet (or 6 meters) away. The findings were published today in npj Digital Medicine. Why is it so good at detecting agonal breathing? Because the team created it using a dataset of agonal breathing captured from real 911 calls.
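
A plausible shape for the monitoring loop is sketched in Python below: slide a window over incoming audio, score each window with a classifier trained on agonal-breathing clips, and only raise an alert after consecutive positives to limit false alarms. The window logic, threshold, and stand-in classifier are assumptions, not the UW team’s published implementation.

```python
# Hedged sketch of a contactless monitoring loop. The window handling, threshold, and the
# stand-in classifier below are illustrative assumptions; a real system would use a model
# trained on agonal-breathing audio (the researchers used clips from actual 911 calls).

from typing import Callable, Iterable, List

def monitor(audio_windows: Iterable[List[float]],
            classify: Callable[[List[float]], float],
            threshold: float = 0.9,
            consecutive_needed: int = 2) -> bool:
    """Return True (i.e., raise an alert) if agonal breathing is detected."""
    streak = 0
    for window in audio_windows:
        streak = streak + 1 if classify(window) >= threshold else 0
        if streak >= consecutive_needed:
            return True  # e.g. alert anyone nearby to start CPR, then escalate to 911
    return False

if __name__ == "__main__":
    # Stand-in classifier: pretend loud windows look like agonal breathing
    fake_classifier = lambda w: 0.95 if max(w) > 0.8 else 0.1
    windows = [[0.1, 0.2], [0.9, 0.85], [0.95, 0.7], [0.2, 0.1]]
    print("Alert:", monitor(windows, fake_classifier))
```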

“A lot of people have smart speakers in their homes, and these devices have amazing capabilities that we can take advantage of,” said co-author Shyam Gollakota. “We envision a contactless system that works by continuously and passively monitoring the bedroom for an agonal breathing event, and alerts anyone nearby to come provide CPR. And then if there’s no response, the device can automatically call 911.”

Why It’s Hot

Despite the rather creepy notion that Amazon is always listening, this innovation is rather cool. What other kinds of health issues could this predict? As a parent, having a speaker able to tell whether a cough is run-of-the-mill or of the scary croup variety would be invaluable. For health events that announce themselves aurally, this is one application headed in the right direction.

Source: Fast Company

Forget a Thousand Words. Pictures Could Be Worth Big Bucks for Amazon Fashion – Adweek

Amazon is rolling out StyleSnap, its AI-enabled shopping feature that helps you shop from a photograph or snapshot. Consumers upload images to the Amazon app and it considers factors like brand, price and reviews to recommend similar items.

Amazon has been able to leverage data from brands sold on its site to develop products that are good enough or close enough to the originals, usually at lower price points, and thereby gain an edge, but it’s still only a destination for basics like T-shirts and socks. With StyleSnap, Amazon is hoping to further crack the online retailing sector with this new offering.
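
Conceptually, “shop from a photo” is an image-similarity search blended with ranking signals like reviews and price. Here’s a toy Python sketch of that idea; the embeddings, catalog, and weights are invented, and Amazon’s actual StyleSnap model is not public.

```python
# Toy sketch of photo-based product matching: embed the uploaded image, find catalog items
# with similar embeddings, then re-rank using signals like review score. All embeddings,
# items, and weights here are stand-in values for illustration only.

from math import sqrt

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b)))

CATALOG = [
    {"name": "floral midi dress", "embedding": [0.9, 0.1, 0.2], "stars": 4.5, "price": 39.0},
    {"name": "striped tee",       "embedding": [0.1, 0.8, 0.3], "stars": 4.0, "price": 15.0},
    {"name": "floral maxi dress", "embedding": [0.8, 0.2, 0.3], "stars": 3.5, "price": 59.0},
]

def recommend(photo_embedding, top_k=2):
    def score(item):
        visual = cosine(photo_embedding, item["embedding"])
        return 0.8 * visual + 0.2 * (item["stars"] / 5.0)  # blend look-alike score with reviews
    return sorted(CATALOG, key=score, reverse=True)[:top_k]

if __name__ == "__main__":
    uploaded = [0.85, 0.15, 0.25]   # embedding of the shopper's snapshot (stand-in values)
    for item in recommend(uploaded):
        print(item["name"], item["price"])
```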

Why It’s Hot

Snapping and sharing is already part of retail culture, and now Amazon is creating a simple and seamless way of adding shopping and purchasing to this ubiquitous habit. The combination of AI and user reviews in its algorithm could change the way we shop, when recommendations aren’t based only on the look of an item but also on how customers experience it.


Source: Forget a Thousand Words. Pictures Could Be Worth Big Bucks for Amazon Fashion – Adweek

Other sources: https://www.cnet.com/news/amazon-stylesnap-uses-ai-to-help-you-shop-for-clothes/

Uber is Ready to Make Drone Food Delivery a Reality

Uber Elevate is betting on drones as the future of food delivery. And the future is coming as soon as this summer, with drone service set to launch in San Diego.


At the launch of the program, drones will not deliver food directly to customers’ homes due to safety and noise concerns. Instead, they’ll land in designated zones for pickup by couriers, or directly on the roofs of Uber vehicles, for drivers to complete the delivery.

Reaching speeds up to 70 mph, Uber Elevate’s drones could deliver significant time and cost savings in a food delivery market projected to grow 12% a year over the next four years. For a delivery 1.5 miles away, a drone can make the trip in 7 minutes, as opposed to 21 minutes by car.

Why It’s Hot

As more and more companies are looking to make use of drones as soon as possible, it’s significant that a car service company is leading the way. Beyond revolutionizing food delivery, Uber Elevate can help pave the way for how drones can solve other problems including last-mile delivery.

Source: https://www.bloomberg.com/news/articles/2019-06-12/uber-announces-plans-to-deliver-big-macs-by-drone-this-summer

Can you own your data?

The question is interesting and one that carries a number of issues and questions. In the U.S. marketers are hungry for the best data – the mantra of delivering the best message to the right target at the right time is contingent on what data we can leverage at the individual level. Today we have very good access to data for targeting and messaging, and it’s getting better every day. Companies like Facebook, Google and Amazon are at the forefront (or rather the store front) of that data revolution and are commanding robust ad revenue as a result.

Everything from what you…

  • Like on Instagram
  • Search on Google
  • Check in on Facebook
  • Post on Twitter
  • Tell Alexa
  • Buy on Amazon
  • And more!

…can be used to ascertain your tastes and interest in products and services.

But with the privacy and trust concerns from recent snafus (i.e. Facebook), consumers are wondering how best to protect their data. Some are even pondering whether there’s value in “owning” their personal data. But how? We don’t own the databases. We’ve already checked off on the privacy policy and kissed our data away… right?

At least one company is tackling the issue: https://hu-manity.co/

Hu-manity is hoping for a critical mass of users to claim their data and set choices. If users are willing to share data, the app would let them reap monetary rewards.

“With the #My31 App consumers can claim a property interest on inherent human data, consent for privacy, authorize for permitted use, and elect for compensation if desired.”

Why it’s hot

If this catches on, it will undoubtedly do two main things: 1. Make data less accessible (and less timely), and 2. Make data more expensive. Marketers may end up depending more on statistical models based on limited data to reach the target audiences.

In an era where marketers have great access to data and are becoming even more data-driven, we need to account for a future where greater restrictions and limitations may be put in place.


Other Companies:

https://datum.org/
https://wibson.org/

Further Reading:

https://bryanjohnson.co/your-data-is-your-property/

https://medium.com/hub-of-all-things/can-you-own-your-data-8a185976ea7d

https://www.technologyreview.com/s/612588/its-time-for-a-bill-of-data-rights/

http://www.bbc.com/capital/story/20180921-can-you-make-money-selling-your-data

Microsoft’s New Surface Hub 2 Will Revolutionize Agency Work

Last week Microsoft announced their latest attempt to change agency teamwork forever, and they call this attempt the Surface Hub 2.

They’re branding it as an “interactive whiteboard for business.”

Imagine you’re about to present in a meeting, and in the room with you are ten people and a 50-inch easel with a touchscreen. You scan your fingerprint on the side of the Hub, it recognizes you and logs you in. Suddenly, everything you just saw at your desk pops up right there on the Hub. You go through your presentation, and someone from the Creative team walks in with their Hub and pushes it next to yours. Now you have an even bigger screen, where you can collaborate seamlessly. Something like this…

[GIF: Microsoft Surface Hub 2 screens tiled together for collaboration]

This is Microsoft’s dream. They want collaboration in environments like ours to be seamless. The Hub 2S (which launches in June) is aimed at businesses, not end consumers like you and me. This thing is meant to be bought in bulk by agencies like MRM to help disparate departments come together and collaborate more easily than ever before.

Why It’s Hot

This is hot because it innovates on something we all take for granted. It shows that hardware like the Hub 2 can disrupt the agency status quo and bring a sense of experimentation and exploration into agency culture. And that’s something we at MRM can’t get enough of.


If done correctly, Microsoft could be onto something. These devices could do wonders for the work we do. Or, if Microsoft half-asses the experience, or agencies can’t justify owning ten of these at $9,000 per unit, this could flop. Hard.

But their intention, to upgrade and streamline how we all do what we all do, is a damn good one.

Source: https://www.windowscentral.com/microsoft-surface-hub-2

#PRN

DeepMind? Pffft! More like “dumb as a bag of rocks.”

Google’s DeepMind AI project, self-described as “the world leader in artificial intelligence research,” was recently tested against the type of math test that 16-year-olds take in the UK. The result? It scored only 14 out of 40. Womp womp!

“The researchers tested several types of AI and found that algorithms struggle to translate a question as it appears on a test, full of words and symbols and functions, into the actual operations needed to solve it.” (Medium)


Why It’s Hot

There is no shortage of angst by humans worried about losing their jobs to AI. Instead of feeling a reprieve, humans should take this as a sign that AI might just be best designed to complement human judgements and not to replace them.

At Unilever, Resumes are Out – Algorithms are In

The traditional hiring process, especially at large organizations, can be exhausting and often ineffective, with 83% of candidates rating their experience as “poor” and 30-50% of the candidates companies choose ending up failing.

Unilever recruits more than 30,000 people a year and processes around 1.8 million job applications. As you can imagine, this takes a tremendous amount of time and resources and too often talented candidates are overlooked just because they’re buried at the bottom of a pile of CVs. To tackle this problem, Unilever partnered with Pymetrics, an online platform on a mission to make the recruiting process more predictive and less biased than traditional methods.

Candidates start the interview process by accessing the platform at home from a computer or mobile-screen, and playing a selection of games that test their aptitude, logic and reasoning, and appetite for risk. Machine learning algorithms are then used to assess their suitability for whatever role they have applied for, by matching their profiles against those of previously successful employees.
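
As a rough picture of that matching step, here’s a hedged Python sketch that compares a candidate’s trait vector (derived from the games) against the trait vectors of employees who previously succeeded in the role. The traits, numbers, and threshold are illustrative assumptions, not Pymetrics’ model.

```python
# Hypothetical sketch of matching a candidate's game-derived traits against the traits of
# previously successful employees in a role. Trait names, values, and the distance
# threshold are invented for illustration; this is not Pymetrics' actual algorithm.

TRAITS = ["risk_appetite", "planning", "attention", "learning_speed"]

SUCCESSFUL_IN_ROLE = [   # trait vectors of employees who succeeded in the role (stand-ins)
    {"risk_appetite": 0.6, "planning": 0.8, "attention": 0.7, "learning_speed": 0.9},
    {"risk_appetite": 0.5, "planning": 0.9, "attention": 0.8, "learning_speed": 0.8},
]

def distance(a: dict, b: dict) -> float:
    """Average absolute difference across traits (smaller means more similar)."""
    return sum(abs(a[t] - b[t]) for t in TRAITS) / len(TRAITS)

def is_good_fit(candidate: dict, max_distance: float = 0.15):
    """A candidate fits if they are close, on average, to employees who succeeded in the role."""
    avg = sum(distance(candidate, emp) for emp in SUCCESSFUL_IN_ROLE) / len(SUCCESSFUL_IN_ROLE)
    return avg <= max_distance, avg

if __name__ == "__main__":
    candidate = {"risk_appetite": 0.55, "planning": 0.85, "attention": 0.75, "learning_speed": 0.85}
    fit, avg = is_good_fit(candidate)
    print(f"average distance {avg:.3f}; good fit: {fit}")
```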

The second stage of the process involves submitting a video interview that is reviewed not by a human, but a machine learning algorithm. The algorithm examines the videos of candidates who answer various questions, and through a mixture of natural language processing and body language analysis, determines who is likely to be a good fit.

One of the most nerve-wracking aspects of the job interview process can be anticipation of the feedback loop, or lack thereof – around 45% of job candidates claim they never hear back from a prospective employer. But with the AI-powered platform, all applicants get a couple of pages of feedback, including how they did in the game, how they did in the video interviews, what characteristics they have that fit, and if they don’t fit, the reason why they didn’t, and what they believe they should do to be successful in a future application.


Why it’s hot:
Making experiences, even hiring experiences, feel more human with AI – The existing hiring process can leave candidates feeling confused, abandoned, and disadvantaged. Using AI and deep analysis helps hiring managers see candidates for who they are, outside of their age, gender, race, education, and socioeconomic status. Companies like Unilever aren’t just reducing their recruiting costs and time to hire; they’re setting an industry precedent that a candidate’s potential to succeed doesn’t lie in who they know, where they came from or how they appear on paper. [Source: Pymetrics]

a new magic leap for the nba, or vice versa…


This week, notorious mixed reality company Magic Leap announced a new NBA “app” built on its platform.

Per Magic Leap, “Using Magic Leap’s Screens framework, fans can pull up multiple virtual screens to watch live games, full game replays, and highlights playing all at the same time. Only on Magic Leap’s spatial computing platform can these screens be independently scaled to any size and placed in any location. But the really cool stuff? The NBA App on Magic Leap introduces team -vs- team and player -vs- player season-long table top stats comparisons. And while live games are exclusively available for NBA League Pass and NBA Single-Game subscribers, a massive catalog of on-demand content is free for anyone using Magic Leap One.”

Why it’s hot:

Any new platform’s success ultimately depends on people using it. And in order to be useful, it must offer utility. It seems Magic Leap is starting to get into the first of what it believes to be many applications of adding mixed reality layers to our physical world. For several years, they had talked about the device which would enable this. Now, they’ve finally turned to the platform on which to develop experiences. Could this be what the app store was to smart phones? Only time will tell, but it will be exciting to see how Magic Leap and its brand partners develop new ways to experience content and the world with an added immersive layer.

[Source]