Abstract:
Many researchers have developed intelligent systems that can effectively detect human behaviour, but in order to react appropriately, as a human would, a computer needs some perception of the emotional state of the person it interacts with. This research therefore evaluates the potential of emotion recognition technology to improve the quality of human-computer interaction in real time. Emotions play an important role in human-to-human communication and interaction, allowing people to express themselves beyond the verbal domain. The ability to understand human emotions is desirable for the computer in several applications. We present …
Facial expressions carry crucial information about the mental, emotional, and even physical state of a conversation partner. In human interaction, the articulation and perception of facial expressions form a communication channel, additional to voice, that carries this information. Recognizing the facial expression in an input image requires two functions: locating a face in the image and recognizing its expression. A face detector finds faces and ignores everything else, such as buildings, trees, and bodies. Face detection [17] can be regarded as a more general case of face localization: in face localization, the task is to find the locations and sizes of a known number of faces (usually one), whereas in face detection the detected face is processed and matched against the underlying face images in a database. When we view two photos of a human face, we can judge which photo shows the facial expression more strongly. Multisensory data are typically processed separately and only combined at the end, even though people display audio and visual communicative signals in a complementary and redundant manner. In order to accomplish a human-like multimodal analysis of multiple input signals acquired …
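As a concrete illustration of the two-stage pipeline described above (locate a face, then recognize its expression), the following minimal Python sketch uses OpenCV's standard Haar-cascade face detector for the first stage. The second stage, classify_expression, is a hypothetical placeholder for any trained classifier over the six basic emotions; it is not a method proposed in this work.

```python
# Minimal two-stage sketch: (1) face detection, (2) expression recognition.
# Assumes OpenCV is installed (pip install opencv-python).
# classify_expression is a hypothetical stand-in for a trained model.
import cv2

EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

def detect_faces(gray):
    """Stage 1: locate faces, ignoring buildings, trees, bodies, etc."""
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    # Returns a list of (x, y, w, h) bounding boxes.
    return cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

def classify_expression(face_patch):
    """Stage 2 (placeholder): a real system would feed the normalized
    face patch to a trained classifier and return one of EMOTIONS."""
    raise NotImplementedError("plug in a trained expression classifier")

def recognize(image_path):
    """Run both stages on one image; returns (box, label) pairs."""
    image = cv2.imread(image_path)
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    results = []
    for (x, y, w, h) in detect_faces(gray):
        # Crop and normalize the face region before classification.
        patch = cv2.resize(gray[y:y + h, x:x + w], (48, 48))
        results.append(((x, y, w, h), classify_expression(patch)))
    return results
```

Keeping the two stages separate in this way means the detector and the expression classifier can each be replaced independently, which mirrors how the systems surveyed below evolved.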
In 1872, Charles Darwin demonstrated the universality of facial expressions and their continuity in man and animals [2]. In the early 1970s, Paul Ekman performed extensive studies of human facial expressions, and in 1971 the American psychologists Ekman and Friesen defined six basic emotions: anger, disgust, fear, happiness, sadness, and surprise [3]. In 1978, Ekman developed the Facial Action Coding System (FACS) for describing facial expressions [7]; approaches based on FACS separate an expression into upper- and lower-face actions [8]. Also in 1978, Suwa et al. presented a preliminary investigation of automatic facial expression analysis from an image sequence [4]. In 1991, Mase and Pentland used optical flow in eight directions to detect the facial muscle movements defined in FACS. Before 2005, most facial expression recognition systems were based on 2-D static images only; in 2005, scholars at the University of Science and Technology put forward a facial expression recognition method based on a 3-D facial model [5]. In 2008, Cao and Tong proposed a new method based on embedded hidden Markov models and local binary patterns.