Applied Perception in Graphics and Visualization (APGV)

This blog is my take on the field of Applied Perception in Graphics and Visualization. Originally named APGV, the ACM Symposium on Applied Perception is a hotbed for research on eye tracking, virtual reality, and advances in perceptual computing, including rendering, animation, haptics, and more. The blog features brief summaries of, and my thoughts on, new and exciting work in the field that is relevant to my research and interests.

Work in progress. Expect format to change.

Blog Post #1: Electrogastrogram

Many are familiar with devices that measure biophysical signals from our bodies. Perhaps you have used an electrocardiogram (ECG) to measure heart rate, or an electroencephalogram (EEG) to measure brain activity. All of these devices use electrodes placed in specific regions to measure electrical activity, which can be used to predict emotional arousal [1]. Recently, I discovered a device that measures the electrical signals of the stomach, called an electrogastrogram (EGG). I had no idea that we could detect features of stomach activity through such electrodes. Not only is this an interesting source of data, but researchers have also found that frequency-domain features of the EGG signal are strongly correlated with extended use of a VR device [2]. While slightly invasive, this type of sensor may be useful for researchers exploring simulator sickness or biophysical human responses.
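To make the frequency-domain idea concrete, here is a minimal sketch of how one might estimate the dominant frequency of an EGG-like signal with an FFT. The sampling rate, signal, and noise level are all assumptions for illustration; the normal gastric rhythm of roughly 3 cycles per minute (0.05 Hz) is the one real anchor.

```python
import numpy as np

# Simulate 10 minutes of an EGG-like signal: a 0.05 Hz rhythm plus noise.
# fs and the noise level are illustrative assumptions, not real EGG settings.
rng = np.random.default_rng(0)
fs = 10.0                       # sampling rate in Hz (assumed)
t = np.arange(0, 600, 1 / fs)   # 10 minutes of samples
signal = np.sin(2 * np.pi * 0.05 * t) + 0.2 * rng.standard_normal(t.size)

# Magnitude spectrum and the frequency of each bin
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(t.size, d=1 / fs)

# Skip the DC component, then pick the peak frequency
dominant_hz = freqs[1:][np.argmax(spectrum[1:])]
cycles_per_minute = dominant_hz * 60
print(f"Dominant frequency: {cycles_per_minute:.1f} cycles per minute")
```

A cybersickness study in the spirit of [2] would compare how power in bands like this shifts before and during VR exposure, rather than just reporting the peak.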


1. Suhaimi, N. S., Yuan, C. T. B., Teo, J., & Mountstephens, J. (2018, March). Modeling the affective space of 360 virtual reality videos based on arousal and valence for wearable EEG-based VR emotion classification. In Signal Processing & Its Applications (CSPA), 2018 IEEE 14th International Colloquium on (pp. 167-172). IEEE.

2. Dennison, M. S., Wisti, A. Z., & D’Zmura, M. (2016). Use of physiological signals to predict cybersickness. Displays, 44, 42-52.

Blog Post #2: Eye Tracking in Games

Eye tracking company Tobii has invested in gaming as an application for gaze interaction. They have integrated eye tracking into Ubisoft's The Division 2, an action RPG and shooter, and currently include a copy of the game with the purchase of the Tobii 4C eye tracker ($169). The game includes gaze-based features such as cover to gaze, throw to gaze, aim at gaze, and camera controls. Eye tracking enables a more natural interaction, though gamers may need to get used to this style of control. For example, gamers have learned to use their controllers to quickly turn towards targets when they throw grenades or intend to shoot, and it may be hard to break this reflex. With gaze, the user simply looks at the target and presses the button that throws a grenade in that direction. Done properly, this reduces the time needed to perform the action without interrupting the character's motion or position in cover.

The next questions around eye tracking might center on how much of an advantage playing with an eye tracker gives. For professional tournaments, will eye tracking be disabled for all, or enabled for all? Currently most gamers do not have eye tracking devices, and the technology cannot be adopted professionally until the devices become more common. However, with reasonable costs, eye trackers may start to play a role in competitive gaming. They also enable the long-term goal of performing analytics and analysis of player performance during competitive play.
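For a rough sense of how "throw to gaze" could work under the hood, here is a hypothetical sketch that maps a 2D gaze point on screen to a world-space throw direction using a simple pinhole camera model. The function name, parameters, and field of view are illustrative assumptions, not Tobii's or Ubisoft's actual API.

```python
import math

def gaze_to_throw_direction(gaze_x, gaze_y, screen_w, screen_h, fov_deg=90.0):
    """Convert a screen-space gaze point (pixels) to a unit camera-space ray.

    Hypothetical illustration: a real game would transform this ray by the
    camera pose and run physics on the resulting trajectory.
    """
    # Normalize to [-1, 1] with (0, 0) at the screen center
    ndc_x = 2.0 * gaze_x / screen_w - 1.0
    ndc_y = 1.0 - 2.0 * gaze_y / screen_h   # flip so +y points up
    half_fov = math.radians(fov_deg) / 2.0  # assumed vertical FOV
    aspect = screen_w / screen_h
    # Build the ray in camera space; +z points into the scene
    dx = ndc_x * math.tan(half_fov) * aspect
    dy = ndc_y * math.tan(half_fov)
    dz = 1.0
    length = math.sqrt(dx * dx + dy * dy + dz * dz)
    return (dx / length, dy / length, dz / length)

# Looking at the exact center of a 1920x1080 screen throws straight ahead
print(gaze_to_throw_direction(960, 540, 1920, 1080))  # → (0.0, 0.0, 1.0)
```

The appeal of this interaction is visible in the math: the player's gaze already encodes the target direction, so no thumbstick turn is needed before the throw.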