Current Projects

This page gives more details about some of my current research projects. Scroll down or click the links for more info.

Eye movements in the perception of scenes and objects

Memory for pictures

Saccade biases in natural vision

Social attention

Attention in visual media

Attention in the real world

Eye movements in the perception of scenes and objects

Much of my research to date concerns the control of eye movements and attention during scene perception. Where do we look when inspecting a picture? Answering this question requires investigating the visual features of a scene that might attract attention automatically, as well as the ways in which people's understanding of their environment guides viewing. In several experiments I have shown that bottom-up visual saliency co-occurs with fixation (i.e. people look at bright things) when viewers have no particular task, but that when participants are given a realistic task they can efficiently guide their attention to important objects irrespective of saliency. I am also working with colleagues in computer science, using eye movements to compare human and computer feature detection.
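
The saliency-versus-fixation comparison described above is often analysed by checking whether the saliency values at fixated locations exceed those at control locations. Here is a minimal sketch in Python; the saliency map and fixation coordinates are invented for illustration, and no particular saliency model is assumed:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical inputs: a saliency map (one value per pixel, e.g. from a
# bottom-up model) and fixation locations recorded with an eye tracker.
saliency = rng.random((480, 640))                 # values in [0, 1]
fixations = [(120, 300), (240, 310), (100, 500)]  # (row, col) pixel coords

def mean_saliency(points, saliency_map):
    """Mean saliency value at a list of (row, col) locations."""
    return float(np.mean([saliency_map[r, c] for r, c in points]))

# Baseline: the same number of uniformly random control locations.
controls = list(zip(rng.integers(0, 480, size=len(fixations)),
                    rng.integers(0, 640, size=len(fixations))))

observed = mean_saliency(fixations, saliency)
baseline = mean_saliency(controls, saliency)
# If observed reliably exceeds baseline across trials, fixated locations
# were more salient than expected by chance.
print(observed, baseline)
```

In practice the baseline is usually drawn from fixations on other images rather than uniformly, to control for central viewing biases.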

Example publications:

Foulsham, T. & Underwood, G. (2007). How does the purpose of inspection influence the potency of visual salience in scene perception? Perception, 36, 1123-1138.

Foulsham, T., Alan, R. & Kingstone, A. (2011). Scrambled eyes? Disrupting scene structure impedes focal processing and increases bottom-up guidance. Attention, Perception and Psychophysics, 73 (7), 2008-2025.

Foulsham, T. (2015). Scene Perception. In J. M. Fawcett, E. F. Risko, & A. Kingstone (Eds.), The Handbook of Attention. MIT Press.

Memory for pictures

Our ability to recognise images, and to remember scenes that we have seen previously, is surprisingly good, especially considering that we move our eyes around all the time. In several projects I have tested people's memory for pictures, with the aim of relating recognition memory performance to visual attention. During my PhD I confirmed an older finding that people seem to "retrace their steps" by looking at the same parts of a picture when they see it again. Subsequent work has focused on methods for quantifying the similarity between the eye movement scanpaths made during multiple viewings of a picture, and I am currently investigating the links between attention and memory in this context.

**Check out work on scanpaths and memory here.**
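
One common way of quantifying scanpath similarity is to code each fixation by the region of the image it falls in and then compare the resulting strings with edit (Levenshtein) distance. The sketch below illustrates the idea in Python; the grid size and coordinates are made up for the example, not taken from any specific study:

```python
def to_string(fixations, grid=(5, 5), width=640, height=480):
    """Convert (x, y) fixations into a string of grid-cell labels."""
    cells = []
    for x, y in fixations:
        col = min(int(x / width * grid[1]), grid[1] - 1)
        row = min(int(y / height * grid[0]), grid[0] - 1)
        cells.append(chr(ord('A') + row * grid[1] + col))
    return ''.join(cells)

def edit_distance(a, b):
    """Levenshtein distance between two strings (dynamic programming)."""
    d = [[i + j if i * j == 0 else 0 for j in range(len(b) + 1)]
         for i in range(len(a) + 1)]
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            d[i][j] = min(d[i - 1][j] + 1, d[i][j - 1] + 1,
                          d[i - 1][j - 1] + (a[i - 1] != b[j - 1]))
    return d[len(a)][len(b)]

# Two viewings of the same picture (illustrative coordinates).
first = to_string([(50, 60), (320, 240), (600, 400)])
second = to_string([(55, 70), (330, 250), (100, 420)])
# A lower distance means more similar scanpaths (more "retraced steps").
print(edit_distance(first, second))
```

String-based measures like this are simple but discard fixation durations and fine spatial detail, which is one reason alternative scanpath metrics have been developed.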

Example publications:

Foulsham, T. & Underwood, G. (2008). What can saliency models predict about eye movements? Spatial and sequential aspects of fixations during encoding and recognition. Journal of Vision, 8 (2), 6, 1-17.

Underwood, G., Humphrey, K. & Foulsham, T. (2009). Saliency and scan patterns in the inspection of real-world scenes. Visual Cognition, 17 (6), 812-834.

Saccade biases in natural vision

Humans make very fast eye movements ("saccades") about 3 times a second to direct the eyes to points of interest. Far from being random, these saccades show consistent patterns in position, direction and amplitude (length). For example, people make more horizontal saccades than vertical ones, even when the visual information is equally rich in each direction. In several papers I have explored these biases by manipulating the image being viewed or the task, and by modelling the results (see modelling page here).
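
Direction and amplitude biases of this kind can be measured directly from a fixation sequence: each saccade is the vector between consecutive fixations. A minimal sketch in Python, using invented fixation coordinates and an illustrative 30-degree window for classifying a saccade as "horizontal":

```python
import math

# Illustrative fixation sequence (x, y) in pixels; not real data.
fixations = [(100, 200), (300, 210), (320, 400), (150, 390), (160, 100)]

def saccade_directions(fixations):
    """Angle (degrees, 0-360) and amplitude (pixels) of each saccade."""
    out = []
    for (x0, y0), (x1, y1) in zip(fixations, fixations[1:]):
        angle = math.degrees(math.atan2(y1 - y0, x1 - x0)) % 360
        amplitude = math.hypot(x1 - x0, y1 - y0)
        out.append((angle, amplitude))
    return out

def horizontal_fraction(directions, tolerance=30):
    """Fraction of saccades within `tolerance` degrees of horizontal."""
    horiz = [a for a, _ in directions
             if min(a, abs(a - 180), abs(a - 360)) <= tolerance]
    return len(horiz) / len(directions)

dirs = saccade_directions(fixations)
# A fraction well above the chance level for the chosen window would
# indicate a horizontal bias.
print(horizontal_fraction(dirs))
```

With real data, amplitudes would normally be converted from pixels to degrees of visual angle, and the full direction distribution (not just a horizontal/vertical split) is usually plotted.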

Example publications:

Foulsham, T. & Kingstone, A. (2010). Asymmetries in the direction of saccades during perception of scenes and fractals: Effects of image type and image features. Vision Research, 50(8), 779-795.

Foulsham, T., Kingstone, A. & Underwood, G. (2008). Turning the world around: patterns in saccade direction vary with picture orientation. Vision Research, 48, 1777-1790.

Social attention

In the past few decades, psychologists have begun to study how we pay attention to other people and their social signals (particularly eye direction). For example, a pair of eyes pointing in one direction automatically guides attention to that location. However, most of this research is restricted to artificial laboratory paradigms that may not reflect real social meaning. I investigate this by looking at patterns of attention during videos of social interaction and when people are actually interacting in the real world. In fact, many of the defining features of a "social" situation (e.g. interaction within a group) are absent from most cognitive research, and I am uncovering the effects of this rich context in collaboration with social and developmental psychologists.

Example publications:

Foulsham, T., Cheng, J.T., Tracy, J.L., Henrich, J. & Kingstone, A. (2010). Gaze allocation in a dynamic situation: Effects of social status and speaking. Cognition, 117, 319-331.

Laidlaw, K.E.W., Foulsham, T., Kuhn, G. & Kingstone, A. (2011). Potential social interactions are important to social attention. Proceedings of the National Academy of Sciences, 108 (14), 5548-5553.

Foulsham, T., & Lock, M. (2015). How the Eyes Tell Lies: Social Gaze During a Preference Task. Cognitive Science, 39(7), 1704-1726.

Attention in visual media

Eyetracking provides a set of powerful techniques for finding out about information processing. In much of my work I am interested in applying these techniques to different (visual) media. Although there is considerable theoretical work investigating how people read text or look at simple images, there is far less examining how people explore multimedia or interactive media. I have studied how attention is deployed in film, where people look in comics, and how people browse websites. In all of these cases we can investigate what attracts attention and how the viewer integrates different types of information.

Example publications:

Foulsham, T., Wybrow, D. P., & Cohn, N. (2016). Reading Without Words: Eye Movements in the Comprehension of Comic Strips. Applied Cognitive Psychology, 30(4), 566-579.

Kao, G., Chiang, X., & Foulsham, T. Reading behavior and the effect of embedded selfies in role-playing picture e-books: An eye-tracking investigation.

Attention in the real world

Although cognitive psychologists like to talk about the real world, we often use constrained situations and stimuli in our experiments. I have become interested in investigating vision and attention while people are free to move around and perform complex tasks. In particular, using mobile cameras and eye trackers I am recording where people look when walking around, buying coffee and navigating. I am also interested in applying this work alongside engineers and practitioners who care about visual interaction within an environment. This has led to projects looking at attention in many different applied contexts (sports, surveys, wayfinding...).

Example publications:

Foulsham, T., Walker, E. & Kingstone, A. (2011). The where, what and when of gaze allocation in the lab and the natural environment. Vision Research, 51 (17), 1920-1931.

Foulsham, T. (2015). Eye movements and their functions in everyday tasks. Eye, 29, 196-199.