Human Emotion Recognition

Human facial expressions are commonly classified into 7 basic emotions: happy, sad, surprise, fear, anger, disgust, and neutral. Our facial emotions are expressed through the activation of specific sets of facial muscles. These sometimes subtle, yet complex, signals often carry an abundance of information about our state of mind.

Through facial emotion recognition, we can measure the effect that content and services have on audiences and users through an easy, low-cost procedure. For example, retailers may use these metrics to evaluate customer interest, healthcare providers can offer better service by taking patients' emotional state into account during treatment, and entertainment producers can monitor audience engagement at events to consistently create desired content. "2016 is the year when machines learn to grasp human emotions," said Andrew Moore, the dean of computer science at Carnegie Mellon.

Humans are well-trained in reading the emotions of others; in fact, at just 14 months old, babies can already tell the difference between happy and sad. But can computers do a better job than we do at assessing emotional states? To answer that question, I designed a deep-learning neural network that gives machines the ability to make inferences about our emotional states. In other words, I give them eyes to see what we can see.
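
To make this concrete, here is a minimal sketch of what a convolutional neural network for this task could look like. It assumes 48x48 grayscale face crops (the format used by the popular FER-2013 dataset) and the seven emotion classes listed above; the architecture and hyperparameters are illustrative, not the exact network described here.

```python
# A minimal sketch of a CNN emotion classifier in PyTorch.
# Assumptions: 48x48 grayscale face crops (FER-2013-style input)
# and the seven basic emotion classes; not the exact network built
# for this project.
import torch
import torch.nn as nn

EMOTIONS = ["happy", "sad", "surprise", "fear", "anger", "disgust", "neutral"]

class EmotionCNN(nn.Module):
    def __init__(self, num_classes: int = len(EMOTIONS)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1),   # 1x48x48 -> 32x48x48
            nn.ReLU(),
            nn.MaxPool2d(2),                              # -> 32x24x24
            nn.Conv2d(32, 64, kernel_size=3, padding=1),  # -> 64x24x24
            nn.ReLU(),
            nn.MaxPool2d(2),                              # -> 64x12x12
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 12 * 12, 128),
            nn.ReLU(),
            nn.Dropout(0.5),
            nn.Linear(128, num_classes),                  # logits over 7 emotions
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

if __name__ == "__main__":
    model = EmotionCNN()
    face = torch.randn(1, 1, 48, 48)           # stand-in for one face crop
    probs = torch.softmax(model(face), dim=1)  # per-emotion probabilities
    print(EMOTIONS[probs.argmax(dim=1).item()])
```

In practice, such a network would be trained with cross-entropy loss on labeled face images, with the softmax output giving a probability for each of the seven emotions.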
