Visual Perception & Attention Lab

How do humans perceive the world? 

Machines use algorithms to interpret and present information, but humans use heuristics. Our eyes don't act like cameras, and our brains aren't computers. Whereas modern image-processing techniques can scan and perfectly re-render virtually every pixel in an image in parallel, we can't process more than a fraction of incoming sensory information in each glance. So why do we nonetheless experience the world as stable and complete, in full HD?

Work in our lab focuses on uncovering the clever visual and attentional strategies the human visual system uses to rapidly make sense of the surrounding environment, and on how such heuristics and biases can be leveraged in cutting-edge methods to optimize and predict human behavior. We use psychophysical, computational, eye-tracking, and neurophysiological measures to conduct basic scientific experiments that enable us to draw novel and insightful conclusions about human perception.

NEWS AND UPCOMING EVENTS

New Preprint: Y-axis inversion in gaming 

Find us:

MIT CSAIL (Stata Center)

32 Vassar St, 32-D410

Cambridge, MA 02139

Visual Perception & Attention Lab (VPAL)

Map & directions to our lab


Contact us: vpal.mit@gmail.com