One of the most important ways to provide a technology-based system with the ability to
perceive its interlocutor (the user) is through the use of HCI instruments.
Some of these instruments are able to gather data on the user's physiological functions.
Others gather information from facial gestures, body postures, or the pressure
the user applies to an object.
All these data are directly related to responses that occur in conjunction
with changes in an individual's thought patterns, emotional state, and physical behavior.
Compared with data gathered from self-report measures of emotion, biofeedback measurements, together with those gathered from facial gestures and brain waves, offer less biased and novel sources of information that can lead to a better understanding of the user's interactions with a system, e.g., understanding a student's interactions while learning through a computer-based system (a computer-based tutor).
Using this information, it is possible to create an affective model of the user and use this model to provide the computer with a certain level of empathy.
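As a minimal illustration of this idea (not the actual architecture of the framework described on this site), an affective model can be approximated by weighting the evidence each sensor channel contributes to an estimated affective dimension such as arousal. All channel names, weights, and readings below are hypothetical:

```python
# Hypothetical sketch: fusing normalized multimodal sensor readings
# (each assumed to be pre-scaled into [0, 1]) into one affective estimate.

def fuse_affective_state(readings, weights):
    """Weighted average of per-channel evidence for one affective dimension."""
    total_weight = sum(weights[ch] for ch in readings)
    return sum(readings[ch] * weights[ch] for ch in readings) / total_weight

# Illustrative channels: skin conductance, facial-expression intensity,
# and pressure exerted on an object (e.g., a mouse or chair sensor).
weights = {"skin_conductance": 0.5, "face": 0.3, "pressure": 0.2}
readings = {"skin_conductance": 0.8, "face": 0.6, "pressure": 0.4}

arousal = fuse_affective_state(readings, weights)
# (0.8*0.5 + 0.6*0.3 + 0.4*0.2) / 1.0 = 0.66
```

A real system would of course calibrate each channel per user and fuse signals with a learned model rather than fixed weights; the sketch only shows how heterogeneous channels can be combined into a single affective estimate.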
On this page you can find descriptions of the different technologies that make this possible, as well as some software we have developed to manage these technologies and their data.
If you publish work based on results generated by the software provided on this site, please cite: Gonzalez-Sanchez, J., Chavez-Echeagaray, M.E., Atkinson, R., and Burleson, W. (2011). An Agent-Based Software Architecture for a Multimodal Emotion Recognition Framework. In Proceedings of the 9th Working IEEE/IFIP Conference on Software Architecture (WICSA'11), in press.