Face recognition software measures various parameters in a mug shot, such as the distance between a person's eyes or the height from the lip to the top of the nose, and then compares those metrics with photos of people in a database that have been tagged with a given name. Now, research published in the International Journal of Computational Vision and Robotics looks to take that approach one step further by recognizing the emotion portrayed by a face.
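The matching step described above can be pictured as nearest-neighbour comparison over a vector of facial measurements. The sketch below is purely illustrative: the metric names, the stored values and the simple Euclidean match are assumptions for the sake of example, not the method used in any real system.

```python
import math

# Hypothetical facial metrics (in pixels): eye spacing, lip-to-nose height,
# mouth width. Names and values are invented for illustration.
database = {
    "alice": (62.0, 48.0, 55.0),
    "bob": (70.0, 52.0, 60.0),
}

def match_face(metrics, database):
    """Return the tagged name whose stored metrics are closest (Euclidean)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(database, key=lambda name: dist(metrics, database[name]))

probe = (63.0, 47.5, 56.0)  # metrics measured from a new mug shot
print(match_face(probe, database))  # prints "alice"
```

Real systems use many more measurements and far more robust matching, but the principle, reducing a face to numbers and comparing those numbers against tagged examples, is the same.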
Dev Drume Agrawal, Shiv Ram Dubey and Anand Singh Jalal of GLA University, in Mathura, Uttar Pradesh, India, suggest that the recognition of emotions by future artificial intelligences, in the form of computers or robots, will provide a missing link between the human and machine environments, without which appropriate interactions between the two domains may never be entirely successful. The team has taken a three-phase approach to a software emotion detector. The first phase involves developing an algorithm that can precisely identify and define the features of the human face. The second then analyses the particular positions and shapes of those features. The third phase associates the features with a person's emotional state to decide whether they are happy, sad, angry, surprised, fearful or disgusted. Preliminary tests gave a 94 percent success rate, the team reports.
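The three-phase structure can be sketched as a simple pipeline. To be clear, the feature names, thresholds and decision rules below are invented for illustration; they are not the authors' multi-level classifier, only a toy showing how detection, shape analysis and classification chain together.

```python
def detect_features(face):
    """Phase 1: identify facial features. Here `face` is assumed to already
    be a dict of hypothetical measurements extracted from an image."""
    return face

def analyse_shapes(features):
    """Phase 2: turn raw measurements into coarse shape descriptors."""
    return {
        "smiling": features["mouth_curve"] > 0.3,
        "frowning": features["mouth_curve"] < -0.3,
        "brows_up": features["brow_raise"] > 0.5,
        "eyes_wide": features["eye_openness"] > 0.7,
    }

def classify_emotion(shapes):
    """Phase 3: map shape descriptors onto an emotion label."""
    if shapes["smiling"]:
        return "happy"
    if shapes["brows_up"] and shapes["eyes_wide"]:
        return "surprised"
    if shapes["frowning"]:
        return "sad"
    return "neutral"  # fallback when no rule fires

face = {"mouth_curve": 0.6, "brow_raise": 0.2, "eye_openness": 0.5}
print(classify_emotion(analyse_shapes(detect_features(face))))  # prints "happy"
```

The real system covers all six emotion classes with learned classifiers rather than hand-written thresholds, but the staged design, features first, shapes second, labels last, mirrors the approach described in the paper.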
While Mehrabian's 1960s notion that half of human communication is non-verbal has been debunked several times, the fact remains that facial expressions and body language do convey a lot of information about a person's thoughts and emotional state. Such information, if it could be interpreted by a computer, would allow us to enhance human-computer interactions. Imagine, whimsically, that one's laptop or smartphone could change the background image or shuffle your music based on whether you had a happy or sad expression. In a more serious setting, the recognition of anger, pent-up aggression, or fear at airport screening might allow suspicious individuals to be channeled sooner rather than later to the security office, while those with nothing to hide would be funneled through to the usual physical checks with less delay.
"Our experimental results suggest that the introduced method is able to support more accurate classification of emotions from images of faces," the team says. They add that additional refinements to the classification algorithms will improve their emotion detector still further.
Agrawal, D.D., Dubey, S.R. and Jalal, A.S. (2014) ‘Emotion recognition from facial expressions based on multi-level classification’, Int. J. Computational Vision and Robotics, Vol. 4, No. 4, pp.365–389.
Emotion detector is a post from: David Bradley's Science Spot
via Science Spot http://ift.tt/1t74RPa