In many scenarios, we tend not to express our true feelings to others. The woman in the picture conceals her complicated feelings by showing a “smiling” face when she realizes her rival friend got promoted instead of her. But do you think she is actually happy?
There are many existing artificial intelligence platforms that aim to read human facial expressions, but M, an AI human emotion recognition platform developed by IPMD, Inc., currently offers the most accurate analysis of human emotions, surpassing Microsoft Azure.
How do M and Microsoft Azure perform in analyzing the person’s emotions? Let’s take a look at their results:
Microsoft Azure’s algorithm analyzes human emotions through facial muscle movements. However, it performs poorly when the person’s facial muscles move only slightly. In this case, similar to episode 2, Microsoft Azure was “tricked” by the slight lift at the corners of the person’s lips. As a result, Microsoft Azure mistakenly recognized the face as a smiling face and read 85.476% happiness in it. Ironically, this is the complete opposite of what actually happened to the person: her rival friend got promoted instead of her.
M, the AI platform into which the Project M team has invested more than 54,000 hours of effort as of February 1st, 2019, accurately reads micro facial expressions. Its algorithm precisely reads 34.2% anger, 20.5% disgust, 18.5% surprise, and 7.9% sadness from the face, which correctly matches the context.
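The two readings above can be compared by simply taking the highest-scoring emotion label for each platform. Here is a minimal illustrative sketch in Python, using only the percentages quoted in this article (the `dominant_emotion` helper is our own name, not part of either product’s API; Azure’s other category scores were not reported, so only the published happiness figure is listed):

```python
# Emotion scores quoted in the article, in percent.
# Azure's remaining category scores were not published, so only the
# reported happiness value is included here.
azure_scores = {"happiness": 85.476}
m_scores = {"anger": 34.2, "disgust": 20.5, "surprise": 18.5, "sadness": 7.9}

def dominant_emotion(scores):
    """Return the emotion label with the highest score."""
    return max(scores, key=scores.get)

print(dominant_emotion(azure_scores))  # happiness
print(dominant_emotion(m_scores))      # anger
```

The point of the comparison is visible in the last two lines: Azure’s top label is “happiness,” while M’s top label is “anger,” which matches the situation described above.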
Interested? Here’s the official website of IPMD, Inc.
Click to learn more about M.
Click to learn more about Microsoft Azure.
The sample input data we used is pure test data that is completely new to M and, we assume, to Microsoft, and it has never been used to train M, so this comparison is a fair trial between M and Microsoft. We appreciate Microsoft, the leader of the industry and the emotional AI sector, for allowing the general public to use their testing site for identifying human emotions based on facial expressions.