Workshop Program

The ActivEye workshop will be held online on the Meetanyway platform, under the ETRA 2021 registration guidelines. The workshop is a half-day event on May 27th, 2021.

A video recording of the workshop is available on YouTube!

Program At-A-Glance (All times in PDT)

At-a-glance schedule (PDF): ActivEye.pdf

Full Program

Keynote: Kai Dierkes, Ph.D.

Lead R&D Engineer, Pupil Labs Inc.

Pupil Invisible - Eye tracking beyond the lab

Head-mounted eye trackers hold the potential to enable insights into human behavior, physiology, and health beyond controlled lab environments. Data acquisition in truly unconstrained everyday settings, however, is often hindered by long setup times, the need for frequent recalibration, and behavioral distortion resulting from the often peculiar appearance of current head-mounted eye trackers. Pupil Invisible, the latest-generation eye tracker from Pupil Labs, was engineered to tackle these limitations. In my talk, I will give a high-level overview of Pupil Invisible's technology, hardware, and operation. Gauging its gaze-estimation performance along a number of dimensions, I will demonstrate that, without the need for calibration, Pupil Invisible provides gaze estimates that are robust to perturbations, including outdoor lighting conditions and slippage of the headset. In particular, I will describe how, by drawing on ideas from directional statistics, we have quantified gaze-estimation accuracy resolved directionally over the whole field of view. Study design and execution with Pupil Invisible are facilitated by Pupil Cloud, a dedicated cloud storage and analysis platform. I will highlight the potential of Pupil Cloud by introducing one of its latest analysis features, the fully automatic Reference Image Mapper.
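The directionally resolved accuracy idea can be illustrated with a small sketch: compute the angle between estimated and reference gaze vectors, then average it per angular sector of the visual field. The Python/NumPy code below is a minimal, assumed illustration (array names, the sector binning, and the synthetic data are all our own), not Pupil Labs' actual method.

```python
# Minimal sketch (not Pupil Labs' implementation): angular gaze error
# resolved over the field of view by binning errors according to where
# the reference target lies. Names and binning scheme are assumptions.
import numpy as np

def angular_error_deg(est, ref):
    """Angle in degrees between unit gaze vectors est and ref (N x 3)."""
    cos = np.clip(np.sum(est * ref, axis=1), -1.0, 1.0)
    return np.degrees(np.arccos(cos))

def accuracy_over_fov(est, ref, n_bins=8):
    """Mean angular error per angular sector of the visual field."""
    err = angular_error_deg(est, ref)
    sector = np.arctan2(ref[:, 1], ref[:, 0])        # direction in the visual field
    bins = np.linspace(-np.pi, np.pi, n_bins + 1)
    idx = np.digitize(sector, bins) - 1
    return np.array([err[idx == b].mean() if np.any(idx == b) else np.nan
                     for b in range(n_bins)])

# Example with synthetic unit gaze vectors:
rng = np.random.default_rng(0)
ref = rng.normal(size=(1000, 3)); ref /= np.linalg.norm(ref, axis=1, keepdims=True)
est = ref + 0.02 * rng.normal(size=ref.shape); est /= np.linalg.norm(est, axis=1, keepdims=True)
print(accuracy_over_fov(est, ref))
```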

Talk Session 1: System Integration

  • Ergonomic Design Development of the Visual Experience Database Headset

Bharath Shankar, Christian Sinnott, Kamran Binaee, Mark D Lescroart, Paul MacNeilage

Departments of Psychology & Neuroscience, University of Nevada, Reno, Reno, Nevada, USA

  • Integrating High Fidelity Eye, Head and World Tracking in a Wearable Device

Vasha DuTell, Agostino Gibaldi, Giulia Focarelli, Bruno Olshausen, Martin S Banks

School of Optometry and Vision Science, UC Berkeley, Berkeley, CA, USA

  • Pupil Tracking Under Direct Sunlight

Kamran Binaee, Christian Sinnott, Kaylie Jacleen Capurro, Mark D Lescroart, Paul MacNeilage

Departments of Psychology & Neuroscience, University of Nevada, Reno, Reno, Nevada, USA

Talk Session 2: Calibration & Error Mitigation

  • Noise in the Machine: sources of physical and computation error in eye tracking with Pupil Core wearable eye tracker

Anca Velisar, Natela Shanidze

The Smith-Kettlewell Eye Research Institute, San Francisco, CA, USA

  • Solving Parallax Error for 3D Eye Tracking

Agostino Gibaldi, Vasha DuTell, Martin S Banks

School of Optometry and Vision Science, UC Berkeley, Berkeley, CA, USA

  • Sub-centimeter 3D gaze vector accuracy on real-world tasks: an investigation of eye and motion capture calibration routines

Scott Stone

Department of Psychology, University of Alberta, Edmonton, Alberta, Canada

Panel Discussion: Experience-Based Ideas for Challenges in Head-Free Eye Tracking

  • Panelists:

    • Moritz Kassner, CEO & Co-founder, Pupil Labs Inc.

    • Kai Dierkes, R&D Lead, Pupil Labs Inc.

    • Marc Tonsen, Product Lead, Pupil Labs Inc.

    • Marcus Nyström, Research Engineer, Lund University Humanities Lab

    • Brian Sullivan, Senior Research Associate, University of Bristol

    • Vasha DuTell, Ph.D. Candidate, UC Berkeley

  • Topics and Questions:

    • We will gather questions from workshop attendees ahead of time and present them to the panelists. Use the Google Form to share your own topics of interest and/or questions; these can be technical challenges or general questions aligned with the workshop topics.

Keynote: Shalini De Mello, Ph.D.

Principal Research Scientist, NVIDIA

Learning Remote Gaze Tracking with Limited Labels or Data

CNNs have greatly improved the accuracy of unconstrained remote gaze-tracking systems that use ordinary webcam inputs. However, they require large amounts of diverse data with known gaze labels to learn effectively. Gaze-labeled data can be hard to acquire outside of laboratories, in 360-degree, physically unconstrained settings, and during real-time deployment of gaze systems. In this talk, I will describe some of our recent work addressing the challenge of effectively training gaze estimation networks with limited labels or data, including three algorithms we have developed for few-shot, semi-supervised, and weakly-supervised learning. I will present a live demonstration of our few-shot gaze estimation algorithm. Lastly, I will conclude the talk with thoughts on open challenges and future avenues of research in the area.
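As a simplified illustration of what few-shot personalization can mean for gaze estimation, the sketch below fits a small affine correction on top of a frozen estimator's 2D output using only a handful of labeled calibration samples. This is a generic baseline under assumed inputs, not one of the algorithms presented in the keynote.

```python
# Illustrative sketch only: "few-shot" personalization of a gaze estimator by
# fitting an affine correction to a frozen model's 2D gaze output from a few
# labeled calibration points. A generic baseline, not the keynote's method.
import numpy as np

def fit_affine_correction(pred, target):
    """Least-squares affine map (A, b) so that pred @ A + b approximates target.

    pred, target: (k, 2) arrays of predicted and true gaze points from k
    calibration samples (k can be as small as a handful of points).
    """
    X = np.hstack([pred, np.ones((pred.shape[0], 1))])   # homogeneous coordinates
    W, *_ = np.linalg.lstsq(X, target, rcond=None)       # (3, 2) solution
    return W[:2], W[2]

def apply_correction(pred, A, b):
    return pred @ A + b

# Example: 9 calibration points with a simulated subject-specific bias.
rng = np.random.default_rng(1)
true_gaze = rng.uniform(-15, 15, size=(9, 2))            # degrees
model_pred = 0.9 * true_gaze + np.array([2.0, -1.5])     # biased frozen model
A, b = fit_affine_correction(model_pred, true_gaze)
print(apply_correction(model_pred, A, b) - true_gaze)    # residuals near zero
```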

Talk Session 3: Data Analysis

  • VEDBViz: The Visual Experience Database visualization and interaction tool

Sanjana Ramanujam, Christian Sinnott, Bharath Shankar, Savannah Jo Halow, Brian Szekely, Paul MacNeilage, Kamran Binaee

Departments of Psychology & Neuroscience, University of Nevada, Reno, Reno, Nevada, USA

  • Post-processing integration and semi-automated analysis of mobile eye-tracking and motion-capture data obtained in immersive virtual reality environments to measure visuomotor integration

Haylie L. Miller*, Ian Raphael Zurutuza**, Nicholas E Fears*, Suleyman Polat**, Rodney D Nielsen**

*School of Kinesiology, University of Michigan, Ann Arbor, MI, USA; **Biomedical Engineering, University of North Texas, Denton, TX, USA

  • Algorithmic gaze classification for mobile eye-tracking

Daniel Müller, David Mann

Department of Human Movement Sciences, Amsterdam Movement Sciences and Institute Brain and Behavior Amsterdam (iBBA), Vrije Universiteit, Amsterdam, Netherlands

  • Characterizing the performance of Deep Neural Networks for eye-tracking

Arnab Biswas, Kamran Binaee, Kaylie Jacleen Capurro, Mark D Lescroart

Departments of Psychology & Neuroscience, University of Nevada, Reno, Reno, Nevada, USA

Talk Session 4: Applications

  • Eye, Robot: Calibration Challenges and Potential Solutions for Wearable Eye Tracking in Individuals with Eccentric Fixation

Kassia Love*, Anca Velisar**, Natela Shanidze**

*Harvard College, Harvard University, Cambridge, MA, USA; **The Smith-Kettlewell Eye Research Institute, San Francisco, CA, USA

  • Fixational stability as a measure for the recovery of visual function in amblyopia

Avi Aizenman, Dennis Levi

School of Optometry and Vision Science, UC Berkeley, Berkeley, CA, USA

  • Tracking Active Observers in 3D Visuo-Cognitive Tasks

Markus D. Solbach, John K. Tsotsos

York University, Toronto, Ontario, Canada