Publications

Addleman, D. A., & Störmer, V. S. (2023). Distractor ignoring is as effective as target enhancement when incidentally learned but not when explicitly cued. Attention, Perception, & Psychophysics. https://doi.org/10.3758/s13414-022-02588-y. PDF. Open Materials.

Abstract. Explicit knowledge about upcoming target or distractor features can increase performance in tasks like visual search. However, explicit distractor cues generally result in smaller performance benefits than target cues, suggesting that suppressing irrelevant information is less effective than enhancing relevant information. Is this asymmetry a general principle of feature-based attention? Across four experiments (N=75 each) we compared the efficiency of target selection and distractor ignoring through either incidental experience or explicit instructions. Participants searched for an orientation-defined target amidst seven distractors – three in the target color and four in another color. In Experiment 1, either targets (Exp. 1a) or distractors (Exp. 1b) were presented more often in a specific color than in other possible search colors. Response times showed comparable benefits of learned attention towards (Exp. 1a) and away from (Exp. 1b) the frequent color, suggesting that learned target selection and distractor ignoring can be equally effective. In Experiment 2, participants completed a nearly identical task, only with explicit cues to the target (Exp. 2a) or distractor color (Exp. 2b), inducing voluntary attention. Both target and distractor cues were beneficial for search performance, but distractor cues much less so than target cues, consistent with previous results. Cross-experiment analyses verified that the relative inefficiency of distractor ignoring versus target selection is a unique characteristic of voluntary attention that is not shared by incidentally learned attention, pointing to dissociable mechanisms by which voluntary and learned attention support distractor ignoring.

Wöstmann, M., Störmer, V.S., Obleser, J., Addleman, D. A., Andersen, S., Gaspelin, N., Geng, J., Luck, S., Noonan, M., Slagter, H., & Theeuwes, J. (2022). 10 simple rules to study distractor suppression. Progress in Neurobiology. https://doi.org/10.1016/j.pneurobio.2022.102269. PDF.

Abstract. Distractor suppression refers to the ability to filter out distracting and task-irrelevant information. Distractor suppression is essential for survival and considered a key aspect of selective attention. Despite the recent and rapidly evolving literature on distractor suppression, we still know little about how the brain suppresses distracting information. What limits progress is that we lack mutually agreed upon principles of how to study the neural basis of distractor suppression and its manifestation in behavior. Here, we offer ten simple rules that we believe are fundamental when investigating distractor suppression. We provide guidelines on how to design conclusive experiments on distractor suppression (Rules 1–3), discuss different types of distractor suppression that need to be distinguished (Rules 4–6), and provide an overview of models of distractor suppression and considerations of how to evaluate distractor suppression statistically (Rules 7–10). Together, these rules provide a concise and comprehensive synopsis of promising advances in the field of distractor suppression. Following these rules will propel research on distractor suppression in important ways, not only by highlighting prominent issues to both new and more advanced researchers in the field, but also by facilitating communication between sub-disciplines.

Xiong, Y., Addleman, D. A., Nguyen, N. A., Nelson, P., & Legge, G. E. (2022). Visual and auditory spatial localization in younger and older adults. Frontiers in Aging Neuroscience. https://doi.org/10.3389/fnagi.2022.838194. PDF.

Abstract. Visual and auditory localization abilities are crucial in real-life tasks such as navigation and social interaction. Aging is frequently accompanied by vision and hearing loss, affecting spatial localization. The purpose of the current study is to elucidate the effect of typical aging on spatial localization and to establish a baseline for older individuals with pathological sensory impairment. Using a verbal report paradigm, we investigated how typical aging affects visual and auditory localization performance, the reliance on vision during sound localization, and sensory integration strategies when localizing audiovisual targets. Fifteen younger adults (mean age = 26 years) and thirteen older adults (mean age = 68 years) participated in this study, all with age-adjusted normal vision and hearing based on clinical standards. There were significant localization differences between younger and older adults, with the older group missing peripheral visual stimuli at significantly higher rates, localizing central stimuli as more peripheral, and being less precise in localizing sounds from central locations when compared to younger subjects. Both groups localized auditory targets better when the test space was visible compared to auditory localization when blindfolded. The two groups also exhibited similar patterns of audiovisual integration, showing optimal integration in central locations that was consistent with a Maximum-Likelihood Estimation model, but non-optimal integration in peripheral locations. These findings suggest that, despite the age-related changes in auditory and visual localization, the interactions between vision and hearing are largely preserved in older individuals without pathological sensory impairments.
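For background (this sketch is not part of the abstract): the Maximum-Likelihood Estimation model referenced above is the standard optimal cue-combination account, under which the audiovisual location estimate is a reliability-weighted average of the unimodal estimates, with lower variance than either cue alone:

```latex
\hat{s}_{AV} = w_V \hat{s}_V + w_A \hat{s}_A,
\qquad
w_V = \frac{1/\sigma_V^2}{1/\sigma_V^2 + 1/\sigma_A^2},
\qquad
w_A = 1 - w_V,
\qquad
\sigma_{AV}^2 = \frac{\sigma_V^2\,\sigma_A^2}{\sigma_V^2 + \sigma_A^2} \le \min(\sigma_V^2, \sigma_A^2)
```

Tests of "optimal" integration typically compare the observed audiovisual localization variance against the predicted \(\sigma_{AV}^2\) derived from the unimodal variances \(\sigma_V^2\) and \(\sigma_A^2\).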

Addleman, D. A., & Störmer, V. S. (2022). No evidence for proactive suppression of explicitly cued distractor features. Psychonomic Bulletin & Review. https://doi.org/10.3758/s13423-022-02071-7. PDF. Open Materials.

Abstract. Visual search benefits from advance knowledge of nontarget features. However, it is unknown whether these negatively cued features are suppressed in advance (proactively) or during search (reactively). To test this, we presented color cues varying from trial to trial that predicted target or nontarget colors. Experiment 1 (N=96) showed that both target and nontarget cues speeded search. To test whether attention proactively modified cued feature representations, in Experiment 2 (N=200) we interleaved color probe trials with search and had participants detect the color of a briefly presented ring that could either match the cued color or not. Interestingly, people detected both positively and negatively cued colors better than other colors, indicating that to-be-attended and to-be-ignored features were both proactively enhanced. These results demonstrate that nontarget features are not suppressed proactively, and instead support reactive accounts in which anticipated nontarget features are ignored via strategic enhancement.

Addleman, D. A., & Lee, V. G. (2022). Simulated central vision loss does not impair implicit location probability learning when participants search through simple displays. Attention, Perception, & Psychophysics. https://doi.org/10.3758/s13414-021-02416-9. PDF. Open Materials.

Abstract. Central vision loss disrupts voluntary shifts of spatial attention during visual search. Recently, we reported that a simulated scotoma impaired implicit spatial attention towards regions likely to contain search targets. In that task, search items were overlaid on natural scenes. Because natural scenes can induce explicit awareness of learned biases leading to voluntary shifts of attention, here we used a search display with a blank background less likely to induce awareness of target location probabilities. Participants searched both with and without a simulated central scotoma: a training phase contained targets more often in one screen quadrant and a testing phase contained targets equally often in all quadrants. In Experiment 1, training used no scotoma, while testing alternated between blocks of scotoma and no-scotoma search. In Experiment 2, training included the scotoma, and testing again alternated between scotoma and no-scotoma search. Response times and saccadic behaviors in both experiments showed attentional biases towards the high-probability target quadrant during scotoma and no-scotoma search. Whereas simulated central vision loss impairs implicitly learned spatial attention in the context of natural scenes, our results show that this impairment may not arise from disruption of the basic mechanisms of attentional learning indexed by visual search tasks without scenes.

Addleman, D. A., Legge, G. E., & Jiang, Y. V. (2021). Simulated central vision loss impairs implicit location probability learning. Cortex. https://doi.org/10.1016/j.cortex.2021.02.009. PDF. Open Materials.

Abstract. Some eye diseases, especially macular degeneration, can cause central vision loss (CVL), impairing goal-driven guidance of attention. Does CVL also affect implicit, experience-driven attention? We investigated how simulated central scotomas affected young adults' ability to prioritize locations frequently containing visual search targets (location probability learning). Participants searched among distractor letter ‘L’s for a target ‘T’ that appeared more often in one screen quadrant than others. To dissociate potential impairments to statistical learning of target locations and attentional guidance, two experiments each included search with and without simulated scotomas. Experiment 1 successfully induced probability learning in a no-scotoma phase. When participants later searched both with and without simulated scotomas, they showed persistent, statistically equivalent spatial biases in both no-scotoma and scotoma search. Experiment 2 trained participants with a central scotoma. While Experiment 1's participants acquired probability learning regardless of their self-reported awareness of the target's location probability, in Experiment 2 only aware participants learned to bias attention to the high-probability region. Similarly, learning with a scotoma affected search with no scotoma in aware but not unaware participants. Together, these results show that simulated central vision loss interferes with the implicit acquisition of location probability learning, supporting a role of central vision in implicit spatial attentional biases.

Addleman, D. A., & Jiang, Y. V. (2019). Experience-driven auditory attention. Trends in Cognitive Sciences. https://doi.org/10.1016/j.tics.2019.08.002. PDF.

Abstract. In addition to conscious goals and stimulus salience, an observer’s prior experience also influences selective attention. Early studies demonstrated experience-driven effects on attention mainly in the visual modality, but increasing evidence shows that experience drives auditory selection as well. We review evidence for a multiple-levels framework of auditory attention, in which experience-driven attention relies on mechanisms that acquire control settings and mechanisms that guide attention towards selected stimuli. Mechanisms of acquisition include cue–target associative learning, reward learning, and sensitivity to prior selection history. Once acquired, implementation of these biases can occur either consciously or unconsciously. Future research should more fully characterize the sources of experience-driven auditory attention and investigate the neural mechanisms used to acquire and implement experience-driven auditory attention.

Addleman, D. A., Schmidt, A., Remington, R. W., & Jiang, Y. V. (2019). Implicit location probability learning does not lead to baseline shifts of visuospatial attention. Psychonomic Bulletin & Review. https://doi.org/10.3758/s13423-019-01588-8. PDF.

Abstract. We tested whether implicit learning causes shifts of spatial attention in advance of or in response to stimulus onset. Participants completed randomly interspersed trials of letter search, which involved reporting the orientation of a T among Ls, and scene search, which involved identifying which of four scenes was from a target category (e.g., forest). In Experiment 1, an initial phase more often contained target letters in one screen quadrant, while the target scenes appeared equally often in all quadrants. Participants persistently prioritized letter targets in the more probable region, but the implicitly learned preference did not affect the unbiased scene task. In Experiment 2, the spatial probabilities of the scene and letter tasks reversed. Participants unaware of the probability manipulation acquired only a spatial bias to scene targets in the more probable region, with no effect on letter search. Instead of recruiting baseline shifts of spatial attention prior to stimulus onset, implicit learning of target probability yields task-dependent shifts of spatial attention following stimulus onset. Such shifts may involve attentional behaviors unique to certain task contexts.

Addleman, D. A., & Jiang, Y. V. (2019). The influence of attentional selection history on auditory spatial attention. Journal of Experimental Psychology: Human Perception and Performance. http://dx.doi.org/10.1037/xhp0000620. PDF.

Abstract. Evidence suggests that prior attentional selection guides visuospatial attention without conscious intent. Yet few studies have examined whether selection history influences auditory spatial attention. Using a novel auditory search task, we investigated two selection history effects: short-term intertrial location priming and long-term location probability learning. Participants reported whether a spoken number, occurring simultaneously with three spoken letter distractors presented from different locations, was odd or even. We first showed that endogenous attention guided by informative arrows facilitated search in our paradigm. Next, intertrial location priming was assessed by comparing reaction time when target location repeated across recent trials to when target location changed. Unlike visual search, auditory search showed little evidence of intertrial location priming. In a separate experiment, we investigated location probability learning by making targets disproportionately likely to appear in one location. Results showed location probability learning: participants were faster when targets occurred in the high-probability location than in the low-probability locations. To our knowledge, this is the first study of intertrial location priming or long-term location probability learning in auditory search. The findings have implications for the role of spatial relevance in auditory attention and suggest that long-term attentional learning and short-term priming rely on separate mechanisms.

Addleman, D. A., Tao, J., Remington, R. W., & Jiang, Y. V. (2018). Explicit goal-driven attention, unlike implicitly learned attention, spreads to secondary tasks. Journal of Experimental Psychology: Human Perception and Performance. http://dx.doi.org/10.1037/xhp0000457. PDF.

Abstract. To what degree does spatial attention for one task spread to all stimuli in the attended region, regardless of task relevance? Most models imply that spatial attention acts through a unitary priority map in a task-general manner. We show that implicit learning, unlike endogenous spatial cuing, can bias spatial attention within one task without biasing attention to a spatially overlapping secondary task. Participants completed a visual search task superimposed on a background containing scenes, which they were told to encode for a later memory task. Experiments 1 and 2 used explicit instructions to bias spatial attention to one region for visual search; Experiment 3 used location probability cuing to implicitly bias spatial attention. In location probability cuing, a target appeared in one region more often than in others, though participants were not informed of this. In all experiments, search performance was better in the cued region than in uncued regions. However, scene memory was better in the cued region only following endogenous guidance, not after implicit biasing of attention. These data support a dual-system view of top-down attention that dissociates goal-driven and implicitly learned attention. Goal-driven attention is task general, amplifying processing of a cued region across tasks, whereas implicit statistical learning is task-specific.