Our research at the iVizLab centres on the development of autonomous, anthropomorphic virtual characters: 3D representations of humans that behave autonomously and show plausibly human behaviour. On the one hand, we develop these characters for application in computer games, caregiving, training, and other domains where humans interact with each other. On the other hand, we use the characters to develop and test models of human cognition, affect, personality, and behaviour regulation.
Virtual Character Projects
Social spatial behaviour
Plausible spatial behaviour is a key capability of autonomous anthropomorphic virtual characters. Currently there is a gap in our understanding of human spatial behaviour in larger-scale social settings and over longer periods of time. In this project we develop a social navigation model that aims to generate human-like spatial behaviour in a social setting with group dynamics. We take an engineering approach: we define a dynamic representation of interest and use it as the psychometric function that regulates the behaviour.
YouTube video: https://www.youtube.com/watch?v=ASmDuz7zRqw
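The core idea of an interest representation regulating spatial behaviour can be illustrated with a toy simulation. This is a minimal sketch under assumed dynamics, not the published model: interest in each group decays over time, is boosted by proximity, and the agent steers toward its currently most interesting target.

```python
# Illustrative sketch only: names, decay/gain dynamics, and the steering rule
# are assumptions for demonstration, not the model described in the project.
import math

class Agent:
    def __init__(self, x, y):
        self.x, self.y = x, y
        self.interest = {}  # target id -> current interest level

    def update_interest(self, targets, decay=0.95, gain=1.0):
        """Interest decays each step and grows with inverse distance."""
        for tid, (tx, ty) in targets.items():
            d = math.hypot(tx - self.x, ty - self.y)
            old = self.interest.get(tid, 0.0)
            self.interest[tid] = decay * old + gain / (1.0 + d)

    def step_towards_most_interesting(self, targets, speed=0.1):
        """Move a fixed step toward the target with the highest interest."""
        if not self.interest:
            return
        tid = max(self.interest, key=self.interest.get)
        tx, ty = targets[tid]
        d = math.hypot(tx - self.x, ty - self.y)
        if d > 1e-6:
            self.x += speed * (tx - self.x) / d
            self.y += speed * (ty - self.y) / d

# Two hypothetical conversation groups; the agent drifts to the nearer one.
targets = {"group_a": (5.0, 0.0), "group_b": (0.0, 8.0)}
agent = Agent(0.0, 0.0)
for _ in range(50):
    agent.update_interest(targets)
    agent.step_towards_most_interesting(targets)
```

Because interest is a leaky accumulator rather than a direct distance readout, the same mechanism can produce hysteresis: an agent stays with a group for a while even as other targets become momentarily closer.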
Personality and affective behaviour
In this project we investigate personality attribution using a paradigm in which a human participant is tasked with assessing the personality of an autonomous virtual character (AVC) responding to affective stimuli. We evoke different impressions of personality by varying the characteristics of the mapping between the affective quality of the stimulus and the overt behavioural response of the AVC. The characteristics of the mapping draw on prior empirical evidence and the BIS/BAS model of personality.
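The stimulus-to-response mapping can be sketched as follows. This is an illustrative assumption, not the study's actual parameterization: BAS (approach) and BIS (avoidance) sensitivities scale how stimulus valence translates into an overt response, so different parameter settings yield different apparent "personalities".

```python
# Hypothetical sketch: a signed response intensity derived from stimulus
# valence, scaled by BAS/BIS sensitivities. Parameter names are assumptions.
def response(valence, bas=1.0, bis=1.0):
    """Map stimulus valence in [-1, 1] to a signed response intensity.

    Positive valence drives approach scaled by BAS sensitivity;
    negative valence drives withdrawal scaled by BIS sensitivity.
    """
    if valence >= 0:
        return bas * valence   # approach behaviour
    return bis * valence       # withdrawal behaviour

# Two hypothetical personality profiles reacting to the same stimuli:
stimuli = [0.8, -0.5, 0.2]
extravert = [response(v, bas=1.5, bis=0.5) for v in stimuli]
anxious = [response(v, bas=0.5, bis=1.5) for v in stimuli]
```

In this toy version, the "extravert" profile amplifies approach and dampens withdrawal, while the "anxious" profile does the reverse; observers asked to judge personality would see the same stimuli answered with systematically different response patterns.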
Real-time, real-world interaction
The overall goal of this project is a real-time, realistic 3D character system that reacts affectively to internal and external social situations. External social situations can arise via real people in front of the character's screen or projection, sensed by systems such as Kinect. The virtual characters have their own social/emotional profiles and react to the sensed data in different ways.
- Acquire and process sensory information about the human
  - Identity, location, emotion, movement, gesture
  - Modular, plug-in-based architecture allows using a wide range of sensing devices (overhead camera, Kinect, Emotiv, etc.)
- Functional analogue of the cerebrum in the mammalian brain
  - Synthetic cognition and behaviour regulation
  - Emotion and mood are a functional part of this component and depend on sensory input, memory, and personality
  - Realization: algorithmic (e.g. in Python), neurobiological (e.g. using the large-scale neuronal system simulator "iqr"), or a hybrid of both
- Corresponds to the body, including brainstem and spinal cord, of a biological human
  - Graphical rendering of the representation
  - Detailed, low-level behaviour planning and movement execution
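The three-layer structure above (sensing, cognition, embodiment) can be sketched as a minimal plug-in architecture. All class and method names here are illustrative assumptions, not the actual system's API:

```python
# Minimal sketch of the sensing -> cognition -> body pipeline described above.
from abc import ABC, abstractmethod

class Sensor(ABC):
    """Plug-in interface: any sensing device implements read()."""
    @abstractmethod
    def read(self):
        """Return a dict of percepts, e.g. {'emotion': 0.4, 'distance': 1.2}."""

class MockKinect(Sensor):
    """Stand-in for a real device driver; real plug-ins would wrap hardware."""
    def read(self):
        return {"emotion": 0.4, "distance": 1.2}

class Cognition:
    """Cerebrum analogue: maps percepts to an action, modulated by mood."""
    def __init__(self, mood=0.0):
        self.mood = mood

    def decide(self, percepts):
        # Mood drifts toward the sensed emotional tone of the human.
        self.mood += 0.5 * (percepts["emotion"] - self.mood)
        return "approach" if self.mood > 0 else "withdraw"

class Body:
    """Embodiment layer: executes the planned behaviour (here, just reports it)."""
    def execute(self, action):
        return f"executing: {action}"

sensors = [MockKinect()]  # plug-in list; further devices can simply be appended
brain, body = Cognition(), Body()
percepts = {}
for s in sensors:
    percepts.update(s.read())
print(body.execute(brain.decide(percepts)))  # → executing: approach
```

Keeping the sensor interface as the only contract between layers is what allows heterogeneous devices (overhead camera, Kinect, Emotiv) to be swapped in without touching the cognitive component.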
Related publications
Inderbitzin, M., Väljamäe, A., Calvo, J. M. B., Verschure, P. F. M. J., & Bernardet, U. (2011). Expression of Emotional States during Locomotion based on Canonical Parameters. In EmoSPACE 2011 (pp. 809–814). IEEE. doi:10.1109/FG.2011.5771353
Karimaghalou, N., Bernardet, U., & DiPaola, S. (2014). A model for social spatial behavior in virtual characters. Computer Animation and Virtual Worlds, 25, 507–519. doi:10.1002/cav.1600
Bernardet, U., & DiPaola, S. (2014). Affective response patterns as indicators of personality in virtual characters. Poster presented at the 36th Annual Conference of the Cognitive Science Society. Québec City, Canada.