Neurobiology of Cognition Laboratory | Methods

2D Virtual Reality Reach Setup with Eye Tracker





2D-Virtual-Reality Reach Setup. Using this setup we capture pointing movements and feed them back visually on a CRT screen in real time. Subjects view this screen via a mirror which, at the same time, blocks direct vision of their hands. In this setup, movement feedback is, for geometric reasons, perceived in spatiotemporal correspondence with the subjects' pointing movements. We use this setup in a variety of tasks and to address various questions:
  • Agency: For example, we rotate the visual feedback of subjects' pointing movements to determine the perceptual thresholds at which subjects no longer feel responsible for the sensory consequences of their movements (upper right figure).
  • Learning: By introducing feedback errors, we study sensorimotor learning at the perceptual and motor levels.
  • Decision Making: We also use this setup for choice experiments in which we mimic naturalistic decision settings. In such a setting, subjects have, for instance, to manually pick one out of several choice items (e.g. consumer products). This allows us to investigate the relation between movement and choice parameters and to infer characteristics of human decision making.
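The rotated-feedback manipulation used in the agency task can be sketched as a plain 2D rotation of the on-screen cursor about the movement origin. This is a minimal illustration, not the lab's actual software; the function name, angle convention, and origin parameter are our own assumptions:

```python
import math

def rotate_feedback(x, y, angle_deg, origin=(0.0, 0.0)):
    """Return the fed-back cursor position: the hand position (x, y)
    rotated counter-clockwise by angle_deg degrees about the movement
    origin. Hypothetical sketch of the agency manipulation: the hand
    moves to (x, y), but the screen shows the rotated position.
    """
    ox, oy = origin
    a = math.radians(angle_deg)
    dx, dy = x - ox, y - oy
    return (ox + dx * math.cos(a) - dy * math.sin(a),
            oy + dx * math.sin(a) + dy * math.cos(a))
```

With a 90-degree rotation, a rightward reach to (10, 0) would be displayed as an upward movement to roughly (0, 10); in threshold experiments the rotation angle would be varied in much smaller steps.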


Functional Magnetic Resonance Imaging (fMRI) - VR Reach Setup with Eye Tracker



We use fMRI to capture brain activity while subjects perform a variety of tasks, e.g. to uncover the neural mechanisms underlying agency, learning, movement planning, and decision making (e.g. choice overload). We therefore developed setups that not only allow for visual stimulation inside the scanner but, importantly, also enable us to track subjects' eye and pointing movements and to provide subjects with a 2D-VR reach environment (compare figure) similar to the one in our psychophysics laboratory (see above).


Eye and Pupil Tracking


We register subjects' eye movements in almost all our experiments. We have, for instance, been interested in why subjects perceive a stable world despite their own eye movements. In other words: how can we tell whether motion on the retina is caused by our own eye movements rather than by external motion? Apart from such questions related to agency attribution, we have also investigated eye movements in decision-making contexts: in agreement with the assumption that eye movements can bias choice (Shimojo et al., 2013), we have shown that, in the case of equally preferred choice options, subjects typically choose the option that they look at more often (see illustration on the right). Finally, eye tracking also allows for the simultaneous assessment of pupil size (compare left image), which can serve, e.g., as a simple measure of cognitive load.
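The gaze-bias analysis described above amounts to accumulating looking time per choice option and comparing it with the eventual choice. A minimal sketch, assuming gaze samples arrive at a fixed rate and options occupy rectangular regions of interest (the sample format, ROI layout, and sampling interval are assumptions, not the lab's actual pipeline):

```python
def dwell_times(gaze_samples, rois, sample_dt=0.002):
    """Accumulate looking time (seconds) per option.

    gaze_samples: iterable of (x, y) gaze positions at a fixed rate.
    rois: dict mapping option name -> (x_min, x_max, y_min, y_max).
    sample_dt: duration of one sample, e.g. 0.002 s at 500 Hz (assumed).
    """
    totals = {name: 0.0 for name in rois}
    for x, y in gaze_samples:
        for name, (x0, x1, y0, y1) in rois.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                totals[name] += sample_dt
                break  # each sample counts toward at most one option
    return totals
```

For equally preferred options, the gaze-bias finding predicts that the chosen option tends to be `max(totals, key=totals.get)`, i.e. the option with the longest accumulated dwell time.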