Emotions evolved in humans primarily to aid survival. We constantly scan our environment for dangers and for chances to satisfy our fundamental needs, and our minds and bodies act in concert through our emotions. What we feel has become so integral to our lives that our perceptions, our beliefs, and even our initiative to take action depend on it. Perhaps the most important role of human emotion is in establishing communication with other human beings, especially our loved ones.
Today we live in a world of advanced digital technology, one in which we may spend more time communicating through our gadgets than in person. Although communication has become remarkably convenient, the drawback is that we have become rather disconnected from one another, and our digital lives are increasingly devoid of emotion. We spend hours with laptops and cell phones that have no idea how we feel. Emoticons, while not a bad idea, can hardly reflect the full spectrum of emotion conveyed in face-to-face communication. It has become increasingly frustrating to communicate without being able to express how we really feel: our emotions get lost in cyberspace, facial expressions are replaced by emoticons, and laughing is equated with a “LOL”.
Because of this problem, researchers are looking for ways to simulate empathy and emotion more effectively in cyberspace. In 1997, Rosalind Picard introduced the concept of affective computing: the study and development of systems and devices that can recognize, interpret, process, and simulate human emotions. The field combines computer science with psychology and cognitive science.
Modern research highlights several major aspects of perception of emotion from facial expression
The first aspect is recognizing patterns of facial action during the dynamic expression of an emotion. One group of researchers coded the facial actions of actors who were asked to simulate strong emotions. They were unable to identify any complete prototypical pattern for the basic emotions, and concluded that an emotional expression is composed of several smaller units or patterns. An emotion, therefore, can be perceived and analysed through multiple partial patterns rather than a single fixed template.
The second aspect engages with natural emotion data from human subjects. Another group of researchers compared acted sequences recorded by actors with sequences recorded in real time during genuinely felt strong emotions. Individual frames were examined and rated according to the feeling each conveyed, and frame-to-frame changes were rated as well. Ratings of change were higher in the real-time data: people expressing genuine emotions shifted focus more rapidly, and their reaction times were faster. The findings point to the difference between “virtual” and “real-time” data, and these variations may further complicate studies of emotion perception.
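As a rough illustration of the frame-to-frame analysis described above, the Python sketch below takes per-frame intensity ratings of an expression and measures how much the expression shifts between consecutive frames. All numbers here are invented for the example, not data from the study.

```python
# Illustrative sketch of frame-to-frame change analysis: given per-frame
# intensity ratings of an expression, measure how much it shifts between
# consecutive frames. The sequences below are invented, not study data.

def frame_to_frame_changes(ratings):
    """Absolute change in rated intensity between consecutive frames."""
    return [abs(b - a) for a, b in zip(ratings, ratings[1:])]

acted   = [0.20, 0.30, 0.35, 0.40, 0.45]  # hypothetical acted sequence
natural = [0.20, 0.50, 0.30, 0.70, 0.40]  # hypothetical real-time sequence

for name, seq in [("acted", acted), ("natural", natural)]:
    changes = frame_to_frame_changes(seq)
    print(name, "mean shift:", round(sum(changes) / len(changes), 3))
```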
A third aspect involves emotional colouring. Classical research found mixed evidence on the timing and strength of emotional expression. A further group of researchers examined how subtle emotions are identified, and concluded that the intensity of an emotional display largely determines how reliably it is recognized.
Finally, researchers have studied the Facial Action Coding System (FACS), which has a clear advantage over other methods of study: it breaks emotional expressions down into action units, each corresponding to a particular facial muscle group. For example, the lip corner puller that forms a smile is action unit 12, while the jaw drop seen in surprise is action unit 26. Each action unit can be recognized by a computer as a discrete data point, and combinations of units fire together to portray different emotions. This gives computer systems precise locations where information is available to decipher an emotion.
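To make this concrete, here is a toy Python sketch of how detected action units might be matched against prototype combinations. The detector, the prototype sets, and the scoring rule are all illustrative assumptions; the combinations loosely follow commonly cited pairings (e.g., cheek raiser plus lip corner puller for happiness) but are greatly simplified.

```python
# Toy sketch: matching detected FACS action units (AUs) against prototype
# combinations. The prototype sets and scoring rule are simplified
# illustrations, not the coding scheme itself.

# Hypothetical prototype patterns: emotion -> set of action unit numbers.
EMOTION_PROTOTYPES = {
    "happiness": {6, 12},        # cheek raiser + lip corner puller
    "surprise": {1, 2, 5, 26},   # brow raisers + upper lid raiser + jaw drop
    "sadness": {1, 4, 15},       # inner brow raiser + brow lowerer + lip corner depressor
}

def score_emotions(detected_aus):
    """Score each emotion by the fraction of its prototype AUs detected."""
    return {
        emotion: len(prototype & detected_aus) / len(prototype)
        for emotion, prototype in EMOTION_PROTOTYPES.items()
    }

# Suppose a (hypothetical) detector reports AUs 6, 12 and 25 firing together.
detected = {6, 12, 25}
for emotion, score in sorted(score_emotions(detected).items(),
                             key=lambda kv: kv[1], reverse=True):
    print(f"{emotion}: {score:.2f}")
```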
Apart from facial expression, researchers have also sought to quantify human emotion through gestures, visual aesthetics, emotional speech, visual cues, and physiological monitoring devices such as facial electromyography and galvanic skin response. Multi-modal combinations of these signals have also been analysed, in an attempt to fuse them into a single computer application.
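One simple way such fusion can work is decision-level (“late”) fusion, where each modality produces its own emotion scores and the system combines them. The sketch below is a minimal illustration under that assumption; the modalities, scores, and weights are invented for the example, not taken from a published model.

```python
# Minimal sketch of decision-level ("late") multimodal fusion: each modality
# yields its own probability distribution over emotions, and the distributions
# are combined with a weighted average. All values are illustrative.

EMOTIONS = ["happiness", "surprise", "sadness"]

def fuse(modality_scores, weights):
    """Weighted average of per-modality emotion distributions."""
    total = sum(weights.values())
    return {
        emotion: sum(weights[m] * scores[emotion]
                     for m, scores in modality_scores.items()) / total
        for emotion in EMOTIONS
    }

modality_scores = {
    "face":   {"happiness": 0.7, "surprise": 0.2, "sadness": 0.1},
    "speech": {"happiness": 0.5, "surprise": 0.3, "sadness": 0.2},
    "gsr":    {"happiness": 0.4, "surprise": 0.4, "sadness": 0.2},  # galvanic skin response
}
weights = {"face": 0.5, "speech": 0.3, "gsr": 0.2}
print(fuse(modality_scores, weights))
```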
Future applications for such emotion recognition technologies are limitless
Educational technology has progressed rapidly, and e-learning has become increasingly popular. Here, computer perception of human emotion would be very advantageous: a computerized tutor could adjust its presentation style according to whether a learner appears engaged or bored.
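As a minimal sketch of this idea, the snippet below adapts a lesson based on a detected engagement score. The score (assumed to come from a hypothetical emotion-recognition module), the thresholds, and the responses are all illustrative assumptions.

```python
# Minimal sketch of an affect-aware tutor. The engagement score, thresholds,
# and responses are illustrative assumptions, not an existing system.

def adapt_lesson(engagement, difficulty):
    """Return an adjusted difficulty level and a presentation hint."""
    if engagement < 0.3:   # learner appears bored or frustrated
        return max(1, difficulty - 1), "switch to an interactive example"
    if engagement > 0.8:   # learner appears highly engaged
        return difficulty + 1, "introduce a more challenging problem"
    return difficulty, "continue at the current pace"

print(adapt_lesson(engagement=0.2, difficulty=3))
```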
In the medical field, telemedicine is becoming a game-changer. A quantifiable reading of a patient’s emotional state would help a healthcare specialist respond more accurately and compassionately. This is beneficial not only for the relationship between provider and patient, but potentially for interactions between colleagues as well.
Some robots have been given emotional expressions in an attempt to improve human-computer interaction, and the capability to process affective data can improve these interactions further. Affective processing could make companion devices such as virtual pets, online games, smartphones, and even our laptops feel more realistic, bringing interaction to a more personal level.
Another potential application for affective computers is in monitoring systems. Imagine a blood pressure monitor that also perceives your emotional state: if it detects an intense emotion during a measurement, it can warn you that the reading may be falsely elevated, and advise you to rest and measure again later.
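The decision logic for such a device could be very simple, as in the sketch below. It assumes a hypothetical emotion detector that reports an arousal score between 0 and 1; the thresholds are illustrative, not clinical guidance.

```python
# Minimal sketch of an affect-aware blood pressure monitor, assuming a
# hypothetical emotion detector that reports an arousal score in [0, 1].
# The thresholds are illustrative, not clinical guidance.

SYSTOLIC_LIMIT = 140   # a commonly used cutoff for an elevated reading (mmHg)
AROUSAL_LIMIT = 0.7    # assumed cutoff for "intense emotion"

def interpret_reading(systolic, arousal):
    if systolic >= SYSTOLIC_LIMIT and arousal >= AROUSAL_LIMIT:
        return "Reading may be falsely elevated by stress; rest and measure again later."
    if systolic >= SYSTOLIC_LIMIT:
        return "Elevated reading; consider consulting a clinician."
    return "Reading within the normal range."

print(interpret_reading(systolic=152, arousal=0.85))
```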
In the advertising world, affective data would be highly valuable. Systems that detect a viewer’s satisfaction through emotional perception could forecast the success of, or demand for, a product, translating directly into commercially important data.
The benefits of improving human-computer interaction through emotional perception reach far beyond these examples. Imagine living in a digital world in which robots and computers can perceive and react to the way we feel. Perhaps this will improve not only our communication, but also the quality of our lives.
References
Arbib, M., & Fellous, J. (2004). Emotions: from brain to robot. Trends in Cognitive Sciences, 8(12), 554-561. DOI: 10.1016/j.tics.2004.10.004
Aviezer, H., Bentin, S., Hassin, R., Meschino, W., Kennedy, J., Grewal, S., Esmail, S., Cohen, S., & Moscovitch, M. (2009). Not on the face alone: perception of contextualized face expressions in Huntington’s disease. Brain, 132(6), 1633-1644. DOI: 10.1093/brain/awp067
Cowie, R. (2009). Perceiving emotion: towards a realistic understanding of the task. Philosophical Transactions of the Royal Society B: Biological Sciences, 364(1535), 3515-3525. DOI: 10.1098/rstb.2009.0139
Dyck, M., Winbeck, M., Leiberg, S., Chen, Y., Gur, R., & Mathiak, K. (2008). Recognition profile of emotions in natural and virtual faces. PLoS ONE, 3(11). DOI: 10.1371/journal.pone.0003628
Joyal, C. (2014). Virtual faces expressing emotions: an initial concomitant and construct validity study. Frontiers in Human Neuroscience, 8. DOI: 10.3389/fnhum.2014.00787
Kim, Y., Kang, S., Lee, S., Jung, J., Kam, H., Lee, J., Kim, Y., Lee, J., & Kim, C. (2015). Efficiently detecting outlying behavior in video-game players. PeerJ, 3. DOI: 10.7717/peerj.1502