Beyond Looking Time: Decoding Infant Habituation
By: Oliver Panther
Have you ever wondered what’s happening inside a baby’s brain when they see something for the first time… or the fiftieth time? Picture this: a baby watches a colourful toy pop up on a screen. At first, they’re mesmerised. Now, show them the same toy again and again, and something momentous takes place – they begin to lose interest. This is similar to the boredom we, as adults, experience when stuck in traffic and hearing that one song on the radio for the thirtieth time. It's called habituation, and it is one of the earliest signs that a baby’s brain is learning, adapting, and working correctly.
Habituation is a fundamental form of learning. When we see something repeatedly, our brains gradually stop responding to it as strongly. Think of this as freeing up mental space for new, potentially important information. In infants, measuring this process has significant clinical relevance, aiding us in the understanding of neurological conditions such as autism spectrum disorder (ASD), ADHD, deafness, and many more. In fact, reduced habituation in infants as young as 6 months old has been linked to poorer social skills years before a formal diagnosis could be made.
But here’s the catch… we don’t have a reliable way to measure habituation in babies. Research currently tends to rely on looking time (how long an infant looks at something), on the assumption that shorter looking times indicate habituation. However, looking time is messy data: a baby may look away because they’re tired, distracted by their surroundings, or simply in a spectacularly bad mood! And whilst plenty of research has used other measures of habituation, this variety makes it hard to compare findings across studies.
My project set out to answer a deceptively simple question: can we reliably measure habituation by examining changes in brain wave activity? Specifically, I focused on something called gamma power (the level of electrical activity in the brain that oscillates at frequencies between 30 Hz and 100 Hz). Your brain is constantly buzzing with electrical activity. Different types of brain activity happen at different speeds, like different radio stations broadcasting at various frequencies. Gamma power measures one particular type of speedy brain activity! It has previously been suggested that gamma power decreases when infants repeatedly view the same image, which could make it a tell-tale sign of habituation.
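To make the radio-station analogy concrete, here is a minimal sketch (in Python with NumPy; the sampling rate, frequencies, and signal are all made up for illustration, not taken from the study) of how the power in the 30–100 Hz gamma band of a recorded signal can be estimated:

```python
import numpy as np

fs = 500                      # sampling rate in Hz (made up for illustration)
t = np.arange(0, 2, 1 / fs)   # two seconds of samples
rng = np.random.default_rng(0)

# Synthetic "EEG": a 40 Hz (gamma-band) oscillation plus a 10 Hz
# (alpha-band) oscillation buried in random noise
sig = (0.8 * np.sin(2 * np.pi * 40 * t)
       + 0.8 * np.sin(2 * np.pi * 10 * t)
       + rng.normal(0, 0.5, t.size))

# Power spectrum via the FFT, then sum the power that falls in 30-100 Hz
spectrum = np.abs(np.fft.rfft(sig)) ** 2
freqs = np.fft.rfftfreq(sig.size, d=1 / fs)
in_gamma = (freqs >= 30) & (freqs <= 100)
gamma_power = spectrum[in_gamma].sum() / spectrum.sum()  # fraction of total power
```

The FFT here plays the role of the radio tuner: it separates the signal into its component frequencies so that the gamma "station" can be measured on its own.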
Using electroencephalography (EEG), I monitored gamma power in infants aged 10-18 months as they repeatedly viewed colourful geometric shapes on a screen. The idea is simple: if habituation occurs, then gamma power should decline over time as the brain “learns” the stimulus and responds to it less and less.
However, I didn’t just want to show that gamma power decreases… I wanted to quantify exactly how quickly it decreases. To do this, I applied a mathematical model called exponential decay, the same equation used to describe how radioactive materials decay. By fitting gamma power data to this model, I could calculate a precise “decay rate” for each infant. This would essentially tell us how speedily their brain habituated.
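As an illustration of the idea (a sketch, not the study's actual analysis pipeline), hypothetical gamma-power values can be generated for ten trials and a decay rate estimated by fitting the exponential decay model P(t) = A·e^(−kt), which becomes a straight line once you take logarithms:

```python
import numpy as np

# Hypothetical gamma-power readings across ten habituation trials,
# generated from a known decay rate (k = 0.3) plus a little noise
trials = np.arange(10)
rng = np.random.default_rng(1)
gamma = 5.0 * np.exp(-0.3 * trials) * rng.lognormal(0, 0.05, trials.size)

# Fit P(t) = A * exp(-k * t). Taking logs gives log P = log A - k * t,
# a straight line whose slope is -k, so an ordinary linear fit recovers k.
slope, intercept = np.polyfit(trials, np.log(gamma), 1)
decay_rate = -slope   # should land close to the true value of 0.3
```

A larger decay rate would mean faster habituation: the brain's response to the repeated stimulus falls away more steeply from trial to trial.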
The results? Humbling. When I compared gamma power at the beginning of the experiment to that at the end, there was no significant difference. To add insult to injury, the exponential decay model didn’t fit the data well either, falling far short of the threshold for a good fit. My main hypotheses weren’t supported. Now, it should be noted that this could be for a couple of simple reasons. Infant research is gloriously chaotic. Babies don’t cooperate on demand. They rip off electrode caps. They cry. They can even fall asleep… believe me. They’re unpredictable in precisely the ways that make them fascinating to study. Unfortunately, all of these factors led to fewer habituation trials than ideal, and it may be that there just wasn’t enough data from each infant for the model to fit reliably. Essentially, less input = less output.
Yet, every good story has a twist!
Something curious emerged from the data. Some infants in my study received an auditory attention-grabber (a sound) just before the start of each trial to help redirect their focus to the screen. When I analysed these infants separately, the exponential decay model performed significantly better when compared to data from infants who received no sound. This was striking. It suggests that combining auditory and visual information (multisensory integration) may affect trajectories of brain activity. This aligns with past literature, which indicates that when babies receive coordinated information from multiple senses, their neural activity becomes more predictable and organised.
Here’s what’s particularly interesting… the overall amount of gamma power didn’t differ between the two groups, only the pattern changed over time. This challenges traditional theories that focus solely on how much brain activity decreases or increases during habituation. Instead, it suggests that the brain may regulate when and how activity changes, independent of the overall amount of activity. The reason this is important is that it has real implications for understanding infant development and potential early intervention. If multisensory experiences aid in learning, then structured audio-visual activities during early childhood could be beneficial, especially for those with neurological conditions.
My project demonstrates a crucial aspect of the scientific process: even when a model doesn’t work as expected, applying it can reveal hidden patterns in the data. Without fitting gamma power to an exponential decay model, the difference between audio and no-audio conditions might never have surfaced. Sometimes the most valuable discoveries come from our “failed” experiments.
This project has reshaped how I think about scientific questions. Sometimes, the most valuable finding emerges not from confirming what we expected, but from carefully examining why our expectations weren’t met. The infant brain is astonishingly complex, and measuring its activity requires methods as sophisticated as the phenomena we’re trying to understand.
Disconnection and Panic Symptoms in Teenagers
By: Lottie Shipp
Imagine that, out of the blue, your heart starts pounding. You’re struggling to breathe, and you feel detached from your body and surroundings. You feel anxious and you worry that you’re losing control.
This is a description of a panic attack - a sudden and overwhelming surge of fear with both physical and psychological symptoms. They’re common in young people – in fact, over 60% experience at least one panic episode each year. And for the 3.4% of 17-19 year olds who could be diagnosed with panic disorder, these attacks happen often and can make it difficult for them to do everyday activities like going to school or taking public transport.
But despite being common and distressing, we still don’t know much about panic in young people. In particular, we know very little about how panic symptoms are affected by or related to an experience called dissociation. This is the feeling of being disconnected from your body or the outside world, and it’s often experienced by young people with many panic symptoms or panic disorder. Some people describe dissociation as like being in a dream, or feeling as though they’re not really there. Many of us experience mild feelings of dissociation from time to time, but when it happens more often or strongly, it can lead to psychological difficulties.
There are several types of dissociation. In our study, we focused on two types. The first was depersonalisation - the feeling of being disconnected from your body. The second was ‘felt sense of anomaly’ in which you feel detached from your body or environment, or as though the world around you is unfamiliar or unreal.
In our project, we wanted to know about how dissociation might be related to panic symptoms in young people. So, we ran an online survey in which teenagers reported their thoughts, feelings, and behaviours.
We found that people with higher levels of depersonalisation and felt sense of anomaly dissociation said that they had more severe panic symptoms. This link was partly explained by how teenagers managed their emotions. We all use many different methods of regulating how we feel, but we looked at just two different techniques, called cognitive reappraisal and expressive suppression.
Cognitive reappraisal is the process of re-interpreting a situation in a way that makes it less upsetting. Our results showed that young people who experienced more feelings of dissociation were less likely to manage their emotions by changing their interpretations of difficult situations, and they were more likely to hide or suppress their feelings (expressive suppression). They were also more likely to have higher levels of panic symptoms.
These results show that teenagers who often feel dissociated are more likely to experience panic. This is partly because they find it hard to regulate their emotions. Rather than using healthy coping strategies like re-interpreting situations, they often bottle up or hide what they’re feeling inside.
These findings help us to understand some of the thoughts, feelings, and experiences related to panic symptoms in teenagers. We would need to run further studies before suggesting treatments – but our findings indicate that feelings of panic and dissociation might be improved if young people are taught better ways of managing their feelings.
When Experiences of Touch Outweigh Vision
By: Caitlin Naylor
Does the person sitting next to you see the same landscape, hear the same noises, or feel the same ground beneath their feet? Or do we see the world in a way that fits what we know, confirms our bias, and conforms to our pre-existing ideas? These are the questions I had on my mind as I began my research.
Our perception of the world is vital to every aspect of life: it affects how we choose to move, how we interact with objects, and how we connect to other people. That’s why it’s key to understand how we use our previous experiences to interpret the information we receive from our senses and use this to understand our world. Perceptual illusions have been a valuable tool for investigating this effect: they allow us to trick the brain into misinterpreting information in the environment by contradicting our expectations. A famous and reliable example is the material-weight illusion. When people lift an object that appears to be made from typically lightweight material (e.g. polystyrene), it feels much heavier than an object of the same weight that appears to be made from a typically heavyweight material (e.g. granite). This illusion occurs because we expect certain materials to be heavier than others. So, because both objects actually weigh the same, the granite object feels lighter than you expect, and you judge it as lighter. This illusion shows how our brain uses our prior experiences with materials to form what we perceive, even overriding the sensory information it receives.
So, we know that perception is formed by using information from our senses and using our expectations from prior experiences. What’s less clear is whether the exact sense through which information is received, for instance, whether the information came from vision or touch, can impact how our expectations influence perception. This is the question I aimed to answer.
To do this, I recreated the material weight illusion in virtual reality. Volunteers lifted objects of the same size and weight but with different outer materials. They then reported how heavy the objects felt. Whilst lifting these physical objects, the volunteers viewed copies of the objects in virtual reality, which moved in time with the physical objects. So, people could touch real objects, but I could manipulate what the object looked like in virtual reality. I created a mismatch between the material of the physical object being touched and the virtual material being seen. This meant that the predicted weight of the object based on the felt material would contradict the prediction based on the visual material of the object. Consequently, I could distinguish whether the final judgement of object heaviness was based on expectations from visual or touch cues.
Figure 1. Experiment setup
Results showed that the material weight illusion was consistently experienced when lifting physical objects and seeing them in virtual reality. Interestingly, the strongest illusion was seen when people relied on material cues from touch. The physical polystyrene object was judged to be much heavier than the physical granite object, compared to when these materials were only presented through vision (in virtual reality). This reliance on touch happened even when the visual materials directly conflicted with the physical materials. For example, when the physical polystyrene object looked like it was made of granite, participants still behaved as if it were made of polystyrene. This effect shows that information from touch completely overrode visual cues - so in the context of judging heaviness, the brain decided that touch was more reliable.
But what does this mean for our big question? Does the exact sense through which information is received impact how our expectations influence perception? Well, in short, yes! When expectations about object weight were based on touch cues, this had a greater influence on perceptions of heaviness than when visual material cues triggered those same expectations. So, although our expectations can shape our experiences, our senses can alter this effect, and consequently our senses still play a vital role in creating our perception of the world.
Oliver Panther, Keele University (2026) - 'Quantifying visual habituation through gamma power decay: An EEG study modelling exponential decay to enhance neural biomarker reliability'; working title - 'Beyond Looking Time: Decoding Infant Habituation'
Akilles Rechardt, University College London (2025) - 'Large Language Models and human experts in assessing the plausibility of neuroscientific results'
Didem Yurdakul, University of Aberdeen (2024) - 'The Effects of Repeated Lineups on Accuracy and Choosing'
Lottie Shipp, University of Oxford (2023) - 'Developing an Understanding of Depersonalisation in Adolescents'
Caitlin Naylor, University of Bath (2022) - 'Modality mediates top-down perception: Presentation of material through vision or touch influences the extent to which expectations shape perception of heaviness.'
Jessica Teed, University of Leeds (2021) - 'Exploring temporal dynamics of facial expressions: Early categorisation confusions do not indicate shared evolutionary function.'
Lenard Dome, University of Plymouth (2020) - 'Clearing confounds from the inverse base-rate effect: Irrationality and concurrent load.'
Gwydion Williams, University College London (2019 co-winner) - 'Detachment from external influence.'
Jacob Lagerros, Oxford University and University of Manchester (2019 co-winner) - 'Taking Decisions by (De)composing World Models.'
Irena Arslanova, City University of London (2018) - 'Searching for bodies: Electrophysiological evidence for independent somatosensory processing during attentional selection of body postures.'
Robert Jagiello, University of Warwick (2017) - 'Social risk amplification in computer-mediated diffusion chains: Effectiveness of information reactivation applied to risk taxonomy.'
Amy Isham, University of Warwick (2016) - 'Applying the distractor devaluation effect to online impulse buying.'
Zoe Lewis, University of Hull (2015) - 'Illusory ownership over an artificial arm decreases itch perception in the real arm.'
Punit Shah, University of Surrey (2014) - 'A pessimistic view of optimistic belief updating.'
Amy Gibb, Newcastle University (2013) - 'Defining the lower limit of human pitch perception.'
Samantha Mansell, University of Oxford (2012) - 'Why we can’t see the zoo for the animals: The A to Z of inattentional blindness.'
Adele Goman, University of York (2011) - 'Perception of vocal emotion in speech: simulations of bilateral and bimodal cochlear implantation.'
Samantha Wilkinson, Lancaster University (2010) - 'The effects of biasing story contexts on 7 and 11 years olds’ false recognition compared to recognition on standard DRM lists.'
Christel Gudberg, Royal Holloway University of London (2009) - 'Are phosphenes reliable measures of conduction time in the visual system?'
Paul Briley, University of York (2008) - 'The ability to use movement for segregating and locating sources of sound.'
Hui Minn Chan, University of York (2006) - 'A study of spatial learning and memory in the red-footed tortoise (Geochelone carbonaria).'
Anna Wollaston, Royal Holloway University of London (2005) - 'The effects of masked priming on alphabetic retrieval and letter naming.'
Rebecca Jones, Birkbeck University of London (2004) - 'An attentional blink for fearful faces: Emotional processing does not require attention'
Sophie George, University of Sussex (2003) - 'The acute effect of alcohol on decision making in social drinkers.'
Anna Collins, University of York (2002) - 'The role of the syllable in spoken word recognition: The illusory migration paradigm.'
Marisa Taylor Clarke, University College London (2001) - 'Vision enhances cortical processing of tactile stimuli: An evoked potential study.'
Nicholas Holmes, University of Manchester (2000 co-winner) - 'Hemispheric asymmetries in perceptual-motor processing and the space ship plot: Simple reaction time to lateralised sinusoidal gratings.'
Shannon Connaire, Birkbeck College University of London (2000 co-winner) - 'Children with grammatical SLI: The use of syntactic and semantic cues in word-learning.'
Donna Lloyd, University of Manchester (1999) - 'Crossmodal links in covert endogenous attention between audition and touch.'
Carmel Price, Royal Holloway, London (1998) - 'The co-ordination of verbal and spatial immediate memory in reading and mathematical ability.'
Peggy Postma, City University (1997) - 'Developmental prosopagnosia: should it be taken at face value?'
Nicholas Yeung, University of Oxford (1996) - 'The effect of practice on task switching and Stroop interference.'
Annabel Thorn, University of Bristol (1995) - 'The wordlikeness effect in non-word repetition: An index of bilingual type?'
Sarah Swash, University of York (1994) - 'Contrast induced speed misperception.'
Christine Askew, University of Liverpool (1993) - 'Implicit memory for print advertising material.'