M vs Microsoft: Episode 8

ProjectM

Concealing True Emotion While Posing for the Camera

It is very challenging for the human eye to detect a person’s true emotion without knowing what has happened to them. “M”, however, is able to read human emotion comprehensively and accurately.

“M”, the artificial intelligence platform developed by the Project M team, has been trained on 200,000 accurately classified training samples. In episode 8, let’s compare M’s and Microsoft Azure’s analysis of this person’s complex emotions.

What happened to her?

She lost a beauty contest that she badly wanted to win. This picture was taken as she posed next to the winner after the contest. She tried to conceal her true emotions, sorrow, contempt, and distress over her loss, while posing for the camera.

Here are the results:

M delivered a more accurate and comprehensive evaluation than Microsoft Azure. M successfully identified the negative emotions of the person who lost the contest. Microsoft Azure, by contrast, relying on computer vision, was misled by the person’s facial muscle movements and generated an inaccurate result. It was not able to read the hidden emotions concealed in the person’s face.
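For readers who want to try the Azure side of this comparison themselves, the sketch below shows how an emotion-score response from Azure’s Face detect API (called with `returnFaceAttributes=emotion`) can be parsed to find the dominant emotion. The response shape follows Azure’s documented format, but the face ID and the scores here are made-up illustrative values, not results from this episode’s photo; they simply show how a vision-only model can report “happiness” for a posed smile.

```python
import json

# Hypothetical Azure Face API response (shape follows the documented
# detect call with returnFaceAttributes=emotion); scores are made up
# for illustration only.
sample_response = json.loads("""
[{
  "faceId": "00000000-0000-0000-0000-000000000000",
  "faceAttributes": {
    "emotion": {
      "anger": 0.0, "contempt": 0.01, "disgust": 0.0, "fear": 0.0,
      "happiness": 0.93, "neutral": 0.05, "sadness": 0.01, "surprise": 0.0
    }
  }
}]
""")

def dominant_emotion(face):
    """Return the emotion label with the highest confidence score."""
    emotions = face["faceAttributes"]["emotion"]
    return max(emotions, key=emotions.get)

for face in sample_response:
    print(dominant_emotion(face))
```

With these illustrative scores, the dominant label is “happiness”: a vision-only model keys on the surface muscle movements of a camera smile, which is exactly the failure mode described above.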

Your LIKE means a lot to us!

Follow Project M on:

Company website: http://www.ipmdinc.com

Medium: ProjectM

Facebook: Project M

Instagram: mprojectai

Twitter: @mprojectai

*As of July 2019, the Project M team has devoted 64,000 hours to the AI platform, M. The sample input data we use is pure testing data that is completely new to M and (we assume) to Microsoft, and it has never been used to train M, so this comparison is a fair trial between M and Microsoft. We thank Microsoft, the leader of the industry and of the emotional AI sector, for allowing the general public to use their testing site for identifying human emotions from facial expressions.


Read the original article at https://medium.com/@projectm.info.ai/m-vs-microsoft-episode-8-c0a9da8a6cfd