Affective computing
Affective computing is a branch of artificial intelligence that deals with the design of systems and devices that can recognize, interpret, and process emotions. It is an interdisciplinary field spanning computer science, psychology, and cognitive science.
Affective Computing is also the title of a textbook on the subject by Professor Rosalind Picard, published in 1997 by MIT Press.[1] The origins of the field trace back to Picard's seminal 1995 paper on affective computing.[2]
Areas of affective computing
Detecting and recognizing emotional information
Detecting emotional information usually involves passive sensors that capture data about the user's physical state or behavior. The data gathered is often analogous to the cues humans use to perceive emotions in others. For example, a video camera might capture facial expressions, body posture, and gestures, while a microphone might capture speech. Other sensors detect emotional cues by directly measuring physiological data, such as skin temperature and galvanic skin response.
Recognizing emotional information requires the extraction of meaningful patterns from the gathered data. This is done by analyzing the data with techniques such as speech recognition, natural language processing, or facial expression detection.
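As a concrete illustration of this detect-then-recognize pipeline, the sketch below reduces a window of raw sensor readings (here, hypothetical skin-conductance values) to summary features and maps them to a coarse label. The feature set and the threshold rule are illustrative assumptions, not a published method.

```python
# Minimal sketch of the detect -> recognize pipeline: reduce a raw sensor
# stream to summary features, then map features to a coarse label.
# The readings and the threshold rule are illustrative assumptions.

def extract_features(samples):
    """Summarize a window of sensor readings (e.g., skin conductance)."""
    n = len(samples)
    mean = sum(samples) / n
    variance = sum((s - mean) ** 2 for s in samples) / n
    slope = (samples[-1] - samples[0]) / (n - 1)  # crude trend per sample
    return {"mean": mean, "variance": variance, "slope": slope}

def recognize_arousal(features, slope_threshold=0.05):
    """Hypothetical rule: a rising trend suggests heightened arousal."""
    return "aroused" if features["slope"] > slope_threshold else "calm"

window = [2.0, 2.1, 2.3, 2.6, 3.0, 3.5]  # illustrative readings
print(recognize_arousal(extract_features(window)))  # -> aroused
```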
Emotion in machines
Another area within affective computing is the design of computational devices that either possess innate emotional capabilities or are capable of convincingly simulating emotions (see strong AI). The possession of innate emotion in non-human intellects is primarily a philosophical topic, since sapience is considered a prerequisite for the ability to process emotions and there are currently no known examples of sapience besides humans. A more practical approach, given current technological capabilities, is the simulation of emotions in conversational agents, with the goal of enriching and facilitating interaction between human and machine. While human emotions are often associated with surges in hormones and other neurochemicals, emotions in machines might be associated with abstract states tied to progress (or lack of progress) in autonomous learning systems. In this view, affective states correspond to time-derivatives (perturbations) in the learning curve of an arbitrary learning system.
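That learning-curve idea lends itself to a small sketch. The code below, a minimal illustration rather than an established model, treats the sign of the change between successive training losses as a crude affect signal; the thresholds and labels are assumptions made for the example.

```python
# Illustrative sketch: read a machine "affect" signal off the time-derivative
# of a learning curve. Improving loss reads as positive affect, regression
# as negative. Thresholds and labels are invented for the example.

def affect_from_learning_curve(losses, eps=0.01):
    """Label each step by the change in training loss (prev - curr)."""
    signal = []
    for prev, curr in zip(losses, losses[1:]):
        delta = prev - curr          # positive when the learner is improving
        if delta > eps:
            signal.append("pleased")
        elif delta < -eps:
            signal.append("frustrated")
        else:
            signal.append("neutral")
    return signal

print(affect_from_learning_curve([1.00, 0.80, 0.795, 0.85, 0.60]))
# -> ['pleased', 'neutral', 'frustrated', 'pleased']
```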
Emotional understanding
Emotional understanding refers to the ability of a device not only to detect emotional or affective information, but also to store, process, build, and maintain an emotional model of the user. The goal is to understand contextual information about the user and their environment, and to formulate an appropriate response. This is difficult because human emotions arise from complex external and internal contexts.
One possible feature of a system with emotional understanding is adaptive behavior, for example, avoiding interaction with a user it perceives to be angry. Such a capability could also be used to help ensure data integrity and security.
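One way to picture such an emotional model is as stored state that is updated from observations and consulted before acting. The sketch below is a minimal illustration under that assumption; the class name, the single "anger" dimension, the exponential update, and the threshold are all invented for the example.

```python
# Minimal sketch of a stored, continuously updated emotional model of the
# user, with the adaptive behavior described above (backing off when the
# user appears angry). All state names and constants are illustrative.

class UserAffectModel:
    def __init__(self, decay=0.9):
        self.anger = 0.0      # running estimate in [0, 1]
        self.decay = decay    # older evidence fades over time

    def observe(self, anger_evidence):
        """Blend a new observation (0..1) into the stored estimate."""
        self.anger = self.decay * self.anger + (1 - self.decay) * anger_evidence

    def should_defer_interaction(self, threshold=0.05):
        """Adaptive behavior: avoid engaging a user who appears angry."""
        return self.anger > threshold

model = UserAffectModel()
model.observe(1.0)                       # e.g., a raised voice was detected
print(model.should_defer_interaction())  # -> True
```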
Technologies of affective computing
Emotional speech
Emotional speech processing recognizes the user's emotional state by analyzing speech patterns. Vocal parameters and prosodic features, such as pitch variation and speech rate, are analyzed through pattern-recognition techniques.
Related work includes Dellaert et al.[3] and Lee et al.[4]
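To make the feature-extraction step concrete, the sketch below summarizes hypothetical frame-level pitch estimates and a syllable count into the kinds of prosodic features (mean pitch, pitch range, speech rate) such a recognizer might consume. The input values and the decision rule are illustrative assumptions, not any of the cited systems.

```python
# Sketch of prosody-based feature extraction for emotion recognition.
# Pitch values, feature set, and decision rule are illustrative.

def prosody_features(pitch_hz, n_syllables, duration_s):
    voiced = [p for p in pitch_hz if p > 0]   # 0 marks unvoiced frames
    mean_pitch = sum(voiced) / len(voiced)
    pitch_range = max(voiced) - min(voiced)
    rate = n_syllables / duration_s           # speech rate in syllables/s
    return mean_pitch, pitch_range, rate

def looks_agitated(mean_pitch, pitch_range, rate):
    """Hypothetical rule: high, variable pitch plus fast speech."""
    return mean_pitch > 200 and pitch_range > 80 and rate > 5

feats = prosody_features([210, 0, 250, 300, 0, 220],
                         n_syllables=14, duration_s=2.5)
print(looks_agitated(*feats))  # -> True
```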
Emotional inflection and modulation in synthesized speech, whether through phrasing or acoustic features, is useful in human-computer interaction; such a capability makes speech sound more natural and expressive. For example, a dialogue system might modulate its speech to sound more childlike if its emotional model indicates that the current user is a child.
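A minimal sketch of that modulation step, assuming a synthesizer that exposes pitch and rate controls (the parameter names here are invented placeholders, not a real TTS API):

```python
# Sketch of emotion-aware synthesis: choose prosody settings from the
# current user model before handing text to a synthesizer. The parameter
# names are invented placeholders, not a real TTS API.

def prosody_for_user(user_profile):
    if user_profile == "child":
        # Higher pitch and slightly slower delivery for a young user.
        return {"pitch_shift_semitones": 4, "rate_factor": 0.9}
    return {"pitch_shift_semitones": 0, "rate_factor": 1.0}

print(prosody_for_user("child"))
# -> {'pitch_shift_semitones': 4, 'rate_factor': 0.9}
```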
Facial expression
The detection and processing of facial expressions is achieved through various methods, such as optical flow, hidden Markov models, neural network processing, or active appearance models.
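As an illustration of the hidden-Markov-model approach, the sketch below scores a sequence of quantized facial measurements against two small per-expression models using the forward algorithm and picks the likelier label. The states, observations, and probabilities are illustrative assumptions, not trained values.

```python
# Sketch of HMM-based expression recognition: one small HMM per expression;
# a frame sequence gets the label of the model with the higher likelihood.
# All probabilities are illustrative, not trained.

def forward_likelihood(obs, start, trans, emit):
    """P(observation sequence | model) via the forward algorithm."""
    n = len(start)
    alpha = [start[s] * emit[s][obs[0]] for s in range(n)]
    for o in obs[1:]:
        alpha = [sum(alpha[p] * trans[p][s] for p in range(n)) * emit[s][o]
                 for s in range(n)]
    return sum(alpha)

# Observations per frame: 0 = mouth corners neutral, 1 = corners raised.
# Each model has two hidden states (onset, apex of the expression).
smile_model   = dict(start=[0.6, 0.4], trans=[[0.5, 0.5], [0.1, 0.9]],
                     emit=[[0.5, 0.5], [0.1, 0.9]])
neutral_model = dict(start=[0.9, 0.1], trans=[[0.9, 0.1], [0.5, 0.5]],
                     emit=[[0.9, 0.1], [0.6, 0.4]])

frames = [0, 1, 1, 1]  # mouth corners rise and stay raised
label = ("smile" if forward_likelihood(frames, **smile_model)
                  > forward_likelihood(frames, **neutral_model) else "neutral")
print(label)  # -> smile
```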
Body gesture
Body gesture refers to the position and movement of the body. Many methods have been proposed for detecting body gestures.[5]
Hand gestures have been a common focus of body gesture detection; appearance-based methods[6] and 3-D modeling methods are traditionally used.
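As a toy illustration (not one of the cited methods), the sketch below reads a "wave" gesture off a tracked wrist's horizontal trajectory by counting direction reversals; the trajectory and threshold are invented for the example.

```python
# Toy sketch of gesture recognition from tracked joint positions: a "wave"
# is detected by counting direction reversals in the wrist's x-coordinate.
# Real systems use appearance-based or 3-D model-based tracking.

def is_wave(wrist_x, min_reversals=2):
    """Detect repeated left-right motion in a sequence of x-coordinates."""
    deltas = [b - a for a, b in zip(wrist_x, wrist_x[1:])]
    reversals = sum(1 for d1, d2 in zip(deltas, deltas[1:]) if d1 * d2 < 0)
    return reversals >= min_reversals

print(is_wave([0, 5, 10, 5, 0, 5, 10]))  # -> True
```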
Potential applications
In e-learning applications, affective computing can be used to adjust the presentation of a computerized tutor when a learner is bored, interested, frustrated, or pleased.[7]
Psychological health services, such as counseling, can also benefit from affective computing applications, for example when determining a client's emotional state.
Affective computing could also help convey a person's emotional state to others through color or sound, which might prevent unnecessary conflicts or help improve that person's mood.
Robotic systems capable of processing affective information might exhibit higher flexibility when working in uncertain or complex environments. Companion devices such as digital pets may also make use of affective computing abilities to enhance realism and provide a higher degree of autonomy.
Affective computing has also been suggested for use in monitoring applications. For example, a car that can monitor the emotions of its occupants might engage additional safety measures, such as alerting other vehicles, if it detects that the driver is angry.
Affective computing also has high potential in human-computer interaction. Suggested ideas include affective mirrors that let users see how they come across to others, emotion-monitoring agents that issue a warning before an angry email is sent, and music players that model the relationship between music and emotion in order to select tracks based on mood.
Affective computing is also being applied to the development of prosthetic devices intended to assist people with autism.[8]
See also: Affective design
Application examples
- Wearable computers often make use of affective technologies, such as the detection of biosignals
- Human–computer interaction
- AutoTutor
- Affective Tangibles
- Affective Learning Companions
- RoCo, a sociable robotic computer
- Kismet
External links
- Affective Computing Research Group at the MIT Media Laboratory
- Publications of the Affective Computing Research Group at the MIT Media Laboratory
- Affective Media - the Emotion Engineering Laboratory
- Emotive Computing Group at the FedEx Institute of Technology
- An interactive experiment: detection of emotion
- Multimodal Human Computer Interaction Project
- Emotional & Expressive Synthesized Speech
- Facial Expression Resources on the Web
- The HUMAINE Portal on emotion-oriented computing
References
1. Picard, R. W. (1997). Affective Computing. Cambridge, MA: MIT Press.
2. Picard, R. W., "Affective Computing", MIT Media Laboratory Technical Report No. 321, 1995.
3. Dellaert, F., Polzin, T., and Waibel, A., "Recognizing Emotion in Speech", in Proc. of ICSLP 1996, Philadelphia, PA, pp. 1970-1973, 1996.
4. Lee, C. M., Narayanan, S., and Pieraccini, R., "Recognition of Negative Emotions in the Human Speech Signals", Workshop on Automatic Speech Recognition and Understanding, December 2001.
5. Aggarwal, J. K., and Cai, Q., "Human Motion Analysis: A Review", Computer Vision and Image Understanding, vol. 73, no. 3, 1999.
6. Pavlovic, V. I., Sharma, R., and Huang, T. S., "Visual Interpretation of Hand Gestures for Human-Computer Interaction: A Review", IEEE Transactions on Pattern Analysis and Machine Intelligence, 1997.
7. AutoTutor
8. Projects in Affective Computing