I can’t read you like a cake.

Amazon’s algorithm is reading a new emotion on faces: fear

Amazon’s Rekognition is face-detection software that is supposed to recognize what emotion you are feeling.

Technically, the algorithm works by learning how people’s faces usually look when they express fear. Then, when you show it a new image, it can tell you with a certain probability whether that person’s face is communicating the emotion of fear, leaving the human to then decide what to do with the information. The company’s descriptions of the product claim that “Amazon Rekognition detects emotions such as happy, sad, or surprise, and demographic information such as gender from facial images.”
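The flow described above can be sketched in Python. The actual boto3 `detect_faces` call is left as a comment because it requires AWS credentials and a real image; the response shape (an `Emotions` list with `Type` and `Confidence` fields) follows Rekognition's documented output, while the `min_confidence` threshold is an illustrative choice by the caller, not part of the API.

```python
# Sketch of how a caller might interpret Rekognition's emotion output.
# The real call would look roughly like:
#   import boto3
#   client = boto3.client("rekognition")
#   response = client.detect_faces(
#       Image={"S3Object": {"Bucket": "my-bucket", "Name": "photo.jpg"}},
#       Attributes=["ALL"],
#   )
#   face_detail = response["FaceDetails"][0]

def top_emotion(face_detail, min_confidence=50.0):
    """Return (emotion, confidence) for the most probable emotion,
    or None if no emotion clears the caller's confidence threshold.
    The human still decides what, if anything, to do with the label."""
    emotions = face_detail.get("Emotions", [])
    best = max(emotions, key=lambda e: e["Confidence"], default=None)
    if best is None or best["Confidence"] < min_confidence:
        return None
    return best["Type"], best["Confidence"]

# Sample fragment shaped like Rekognition's documented response:
sample = {
    "Emotions": [
        {"Type": "FEAR", "Confidence": 62.1},
        {"Type": "CALM", "Confidence": 30.4},
    ]
}
print(top_emotion(sample))  # ('FEAR', 62.1)
```

Note that the service only ever returns a probability; deciding that 62% confidence "means" the person is afraid is a judgment the software cannot make for you.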

Can we build a machine to do something that humans themselves don’t know how to do?

For example, we built calculators because we had already figured out the math and needed a quicker way to get past tedious calculations so we could tackle harder ones.

We rarely understand each other as it is, and emotional expression varies widely from person to person. What reads as a flash of anger might be disgust on one face or a nervous twitch on another.

Answer: it’s not working, but Amazon is still selling it. In one test, it falsely matched 20% of California state legislators’ photos. 20%! Amazon has sold it to marketing companies, no surprise there. But it is also marketing the technology to police forces and immigration agencies, where a false match is a far more serious problem.

“In the past, civil rights groups, AI experts, and even some of Amazon’s own investors have asked the company to stop deploying its facial recognition technology given industry-wide issues with accuracy.”

Ethically, how can anyone buy half-baked technology? Why would you want a half-baked pie?