Affective Computing: How It Works, With Some Use Cases

What do you feel when you see the price tag of a product in a supermarket? You may not realize it, but machines can read these incredibly nuanced subtleties of human expression, and retailers can use this to their advantage. A device loaded with the right software can predict whether a person is smiling out of frustration or out of joy. Human-computer interaction has moved beyond the realm of sci-fi fantasy and become reality. The rise of emotionally aware machines has blurred the human-machine divide, and it is now redefining the way people experience technology.

What Is Affective Computing? 

Emotion AI, also known as Affective Computing, is concerned with how AI can decode a person's emotional state by analyzing signals such as head motion, facial expressions, jaw movement, and speech patterns. It detects, recognizes, and emulates human emotions, typically using trained neural networks. Without a doubt, humans are still better at analyzing and interpreting complex emotional signals. However, the gap is narrowing faster than you might imagine, thanks to advances in big data and increasingly powerful algorithms.
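To make this concrete, here is a minimal sketch of what such a facial-expression pipeline might look like in practice: detect faces in an image, crop them, and pass each crop to a pretrained emotion classifier. The model file ("emotion_model.h5"), its 48x48 grayscale input format, the seven-label output, and the sample image are all assumptions for illustration, not a reference to any specific product.

```python
# Minimal sketch of a facial-expression emotion classifier.
# Assumptions: a hypothetical pretrained Keras model "emotion_model.h5"
# trained on 48x48 grayscale face crops with the seven labels below.
import cv2
import numpy as np
import tensorflow as tf

EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

# OpenCV's bundled Haar cascade handles face detection.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)
model = tf.keras.models.load_model("emotion_model.h5")  # placeholder path

def classify_emotions(frame_bgr):
    """Detect faces in a BGR frame and return (box, label, confidence) per face."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    results = []
    for (x, y, w, h) in face_detector.detectMultiScale(gray, 1.3, 5):
        face = cv2.resize(gray[y:y + h, x:x + w], (48, 48)) / 255.0
        probs = model.predict(face.reshape(1, 48, 48, 1), verbose=0)[0]
        results.append(((x, y, w, h), EMOTIONS[int(np.argmax(probs))], float(probs.max())))
    return results

# Example usage: a single frame, e.g. from an in-store camera (hypothetical file).
frame = cv2.imread("shopper.jpg")
for box, label, conf in classify_emotions(frame):
    print(box, label, round(conf, 2))
```

Real deployments typically add more than this sketch, such as tracking faces across video frames and smoothing predictions over time, since a single frame can easily misread a fleeting expression.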

Using AI to Track Emotional Experiences

Companies around the world are striving to deliver exceptional customer experiences and to elicit the positive, exciting emotions that prompt customers to return time after time. Measuring our emotional response to things is not always easy, but a team from the Rotterdam School of Management at Erasmus University (RSM) believes it has developed a system that does just that.

In a newly published paper, the team describes how its electroencephalography (EEG)-based technology tracks how we feel from moment to moment.
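The paper itself describes the RSM system; as a loosely related illustration only, the sketch below shows one common generic approach to estimating moment-to-moment emotional valence from EEG, frontal alpha asymmetry (the difference in log alpha-band power between right and left frontal channels). This is not the team's published method; the channel names (F3/F4), sampling rate, and window length are all assumptions.

```python
# Illustrative sketch: frontal alpha asymmetry as a rough valence signal.
# NOT the RSM team's method; sampling rate, channels, and windows are assumed.
import numpy as np
from scipy.signal import welch

FS = 256          # sampling rate in Hz (assumed)
WINDOW_SEC = 2    # length of each analysis window in seconds (assumed)

def alpha_power(segment, fs=FS, band=(8.0, 13.0)):
    """Mean power spectral density in the alpha band for one channel segment."""
    freqs, psd = welch(segment, fs=fs, nperseg=min(len(segment), fs * 2))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

def valence_trace(left_frontal, right_frontal, fs=FS, window_sec=WINDOW_SEC):
    """Slide a window over two frontal channels (e.g. F3 and F4) and return one
    asymmetry score per window; higher scores are often read as more positive
    valence / approach motivation in the affective-neuroscience literature."""
    step = fs * window_sec
    scores = []
    for start in range(0, len(left_frontal) - step + 1, step):
        left = alpha_power(left_frontal[start:start + step], fs)
        right = alpha_power(right_frontal[start:start + step], fs)
        scores.append(np.log(right) - np.log(left))
    return np.array(scores)

# Toy usage with synthetic noise standing in for real recordings.
rng = np.random.default_rng(0)
f3, f4 = rng.standard_normal(FS * 10), rng.standard_normal(FS * 10)
print(valence_trace(f3, f4))
```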