You can now practice firing someone in virtual reality

Meet Barry. You are about to ruin his day, over and over and over again. 'His sole purpose in life is to listen patiently, and then protest or sob a little as you fire him from an imaginary job in virtual reality.'

Barry was created by Talespin, a VR workplace training company, so that managers can practice the 'soft skill' of firing someone in virtual reality.

He will protest, put his head in his hands, and sob; depending on how you handle the conversation, he may even shout and yell.

Many people are using VR tools like this for training. Imagine a world of perfectly rehearsed interactions: you want to ask someone out on a date, and there is a chatbot tuned to their personality type that lets you practice the conversation first. What if it went as far as making a whole room laugh, simply because you had rehearsed it? It could work out well.

In what ways could this become an opportunity for us?

What if ADT had a VR experience where you watch someone ransack your house and can't stop them? Or a kind of meditation built around your home, where safety and comfort are the focus and you dim the lights to a shade of blue that promotes a sense of security?

What if we created something like what MIT did with designers?

So maybe not 100 percent, but what if our maps were in VR? Each decision point would pop up, and you would choose which way to go, like an adventure game.


Come Fly with Me, Let's Fly, Let's Fly to JFK

Have a flight coming up? Need to get to JFK? Forget about giving yourself extra time; just take Uber Copter, which launched today for all Uber users. (It soft-launched over the summer for premium members.)

You can only take it from one spot in Lower Manhattan near the Staten Island Ferry. A seat costs $200-225, and you have to watch a safety video before takeoff. The helicopters are operated by HeliFlite Shares, a licensed company.

Also, the Copter option only shows up in the app if you're in the geofenced Lower Manhattan area.

Uber says the price is comparable to other companies doing the same thing, but Uber does include ground transportation after your flight.

It's not hot because of the helicopters; they don't actually solve a problem beyond procrastination, and for $200-225 you could hire an assistant for a day instead. It IS hot because this is a stepping stone for Uber Air, Uber's all-electric aerial ride-sharing network planned for 2023. But they have to burn through a little more fossil fuel before they can go electric.

https://www.reuters.com/article/us-uber-copter/uber-makes-jfk-airport-helicopter-taxis-available-to-all-users-idUSKBN1WI13N?feedType=RSS&feedName=technologyNews

https://www.engadget.com/2019/10/03/uber-copter-jfk-trips-october-7/

Leggo!

During the Lego Ideas team's first-ever live stream, they announced that a 123 Sesame Street Lego set is in development.

Lego Ideas is where AFOLs (Adult Fans of Lego) can have their (and your) dreams come true.

Fans can submit ideas, and any submission that gets backed by 10,000 supporters goes to the board for review.

This is how the Women of NASA set, among many other successful products, originated on Lego Ideas.

123 Sesame Street was 1 of 10 fan pitches that the Lego Ideas team announced as being in development.

Among them:

- Playable Lego Piano

- The Office (NBC) Lego Set

- Machu Picchu

all being developed alongside our beloved Sesame Street.

The 123 Sesame Street set won't be for the usual preschool target; it will require more skill and is aimed at bigger kids (or millennials).

10,000 Votes for Leggo On Sesame Street

A community has built up on Lego Ideas where you can talk about your love of Lego, along with contests and activities to keep you engaged.

Currently, if you want to vote, you can do so on the Lego Ideas site.

I can’t see it.

The non-profit Braille Institute wants to reach more people, especially when there are millions of people who are blind and a growing population with worsening vision impairment.

Which leads us to Atkinson Hyperlegible.

At first, this typeface looks like any other, just like the one you're reading right now. But its characters are distinctly different where confusion matters most: 0 versus O, or 1 versus l.

I think this is incredibly helpful, but is it something that will catch on in a world where websites are increasingly geared toward accessibility? I think it's a clever way to raise awareness. But will it go unseen?

This typeface hides a secret in plain sight. And that's the point.

 

OK, Focus!

  • Concentration with the world at your fingertips is hard.
    • It's harder when texts and emails are constantly pinging.
  • Cambridge researchers say they have a solution!
      • A GAME that helps you stay focused! (What an oxymoron, right?)

      (Also, a fun fact: any time you get sidetracked, it takes an average of 28 minutes to get back on task.)

      The game is called Decoder.

    • You watch a stream of digits and detect target sequences like (3,5,7), (2,4,6), and (4,6,8).
    • These sequences are the codes that direct you to clues for solving missions (see the sketch after this list).
  • This works the frontoparietal network (involved in concentration). The idea is that the more you work it, the stronger it gets.
  • That's brain plasticity: learning new things and strengthening existing connections.
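
To make that mechanic concrete, here is a minimal Python sketch of the kind of task described above: a stream of digits goes by, and a detection is logged whenever the last three digits form one of the target sequences. The digit range, stream length, and scoring here are illustrative assumptions, not details from the Cambridge study.

```python
import random

# Target sequences the player is watching for (from the write-up above).
TARGETS = {(3, 5, 7), (2, 4, 6), (4, 6, 8)}

def play_decoder(n_digits=200, seed=None):
    """Simulate a Decoder-style vigilance task: digits arrive one at a time,
    and a detection is counted whenever the last three digits match a target."""
    rng = random.Random(seed)
    window = []        # sliding window of the last three digits seen
    detections = 0

    for _ in range(n_digits):
        digit = rng.randint(1, 9)          # next digit in the stream
        window = (window + [digit])[-3:]   # keep only the last three
        if tuple(window) in TARGETS:
            detections += 1                # the player should respond here

    return detections

if __name__ == "__main__":
    print(play_decoder(seed=42), "target sequences appeared in the stream")
```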

FINDINGS

  • After users played Decoder for eight hours over the course of a month, they showed significantly better attention than people who didn't play the game.
  • SOME say it is comparable to the effects of Ritalin (an ADHD drug); other researchers are skeptical.

Why is this hot?

  1. This feels like a big trend. With take-a-break timers on YouTube and Apple's Screen Time reminders nudging us away from our phones, this game helps us focus.
  2. Attention is a serious thing to think about, especially as younger and younger kids are using tech. What is it doing to their attention spans?

Ending Q’s:

  • Is there a way for a brand to either use these techniques to harness attention, or get behind this and help push the 'put your phone down' trend?
  • Which brands do you think should help you put your phone down?
  • Should this be something companies offer their employees? A 20-minute Decoder break?
  • Would you use this?

Brain Training App

I can’t read you like a cake.

Amazon’s algorithm is reading a new emotion on faces: fear

Amazon's Rekognition is face-detection software that is supposed to know what emotion you are feeling.

Technically, the algorithm works by learning how people’s faces usually look when they express fear. Then, when you show it a new image, it can tell you with a certain probability whether that person’s face is communicating the emotion of fear, leaving the human to then decide what to do with the information. The company’s descriptions of the product claim that “Amazon Rekognition detects emotions such as happy, sad, or surprise, and demographic information such as gender from facial images.”
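
For context, this is roughly what that probability output looks like when you call Rekognition's DetectFaces API through boto3. The bucket and file names below are placeholders, and the snippet is only a minimal sketch of reading back per-emotion confidence scores, not a full application.

```python
import boto3

# Minimal sketch: ask Rekognition for face attributes (including emotions)
# on an image stored in S3, then print each emotion with its confidence.
# Bucket and object names are placeholders, not real resources.
rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "my-example-bucket", "Name": "face.jpg"}},
    Attributes=["ALL"],  # "ALL" is required to get emotion estimates back
)

for face in response["FaceDetails"]:
    for emotion in face["Emotions"]:
        # Each entry is a label plus a confidence score, e.g. FEAR at some percentage.
        print(f'{emotion["Type"]}: {emotion["Confidence"]:.1f}%')
    # It is still up to the human to decide what, if anything,
    # to do with these probabilities.
```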

Can we build a machine to do something that humans don't even know how to do themselves?

For example, we built calculators because we had figured out math and needed a quicker way to get past tedious calculations so we could tackle harder problems.

We rarely understand each other, and emotions vary so much between people. What looks like a flash of anger might be disgust in one person or just a twitch in another.

Answer: it's not working, but Amazon is still selling it. It falsely matched 20% of California state legislators' photos. 20%! They've sold it to marketing companies, no surprise. But they are also marketing it to police forces and immigration agencies, where there could be a real problem.

“In the past, civil rights groups, AI experts, and even some of Amazon’s own investors have asked the company to stop deploying its facial recognition technology given industry-wide issues with accuracy”

Ethically, how can someone buy half-baked technology? Why would you want a half-baked pie?

 

 

Your Brain, and how we will tap into it.

How easy will journey/relationship maps be with this new Facebook mind reader?

Facebook is funding and assisting in a University of California San Francisco study aimed at giving people with serious brain injuries the chance to type words with only their minds. The research is showing real results and promise. Facebook hopes to use the underlying technology as a basis for a new brain-computer input mode for augmented-reality glasses that you wear all day.

If Facebook's researchers can get their headset to accurately detect which neurons are firing, they may be able to use algorithms based on the UCSF research to map neurons to specific letters, and then to the words the user is thinking of.

Facebook's vision of AR glasses that read your thoughts isn't just a dream.

This is not the first time someone has tried to map the brain and how we process speech, though! Here is a podcast about it: the Freakonomics episode on how your brain tells stories.

And here is a cool interactive 3-D semantic map that the Gallant Lab created after building the brain map the podcast above details: Gallant Lab 3D Brain.

Why it’s hot.

Besides being a cool look into what your brain has been doing inside your skull all the time, ever since you were born, without you putting an ounce of energy into it?

It brings the cliché "living in your own world" to reality. And it raises so many questions about the future: what this means for our brands, and for language and communication in general.

Weightless living, only connecting.

Questions:

  • How much more personal does our messaging have to be to match this sort of personal, 24/7 virtual reality?
  • In how many ways do (or can) we understand language and meaning?
  • What comes first, the language or the meaning?
    • If you think it's meaning, then what happens when you hear a word you don't know?
    • If you think it's language, what did you naturally feel for a caretaker before you knew the word for it?
  • What would life be like if we could communicate via telepathy? Could we communicate solely via meaning?
  • If we could communicate only via meaning, would there ever be arguments?
  • Will there be a group telepathy meeting?

And many more.