Projects / Publications

How are percepts transformed into memories? I'm interested in the processes by which information in the environment is attended to, perceived, and stored in memory. To gain insight into how this happens, I use a combination of behavioral, neuroimaging, and neuropsychological methods. Lately, my focus has been on high-resolution fMRI studies of the human hippocampus.

I'm currently conducting studies aimed at understanding how attentional modulation of the hippocampus affects the formation of long-term memory. I also have several collaborative projects: With Aaron Bornstein and Sam Feng, I'm investigating how information from perception and memory is dynamically combined to inform decision making. With Janice Chen (and much help from Chris Honey), I'm looking at how predictive signals emerge in the hippocampus, as measured with fMRI and intracranial EEG. Finally, with Natalia Córdova, I'm studying how the hippocampus contributes to the online processing of relational information.

I've listed some research questions below, grouped under different areas. Each question can be partly answered by clicking on it. Convenient, right?


Area 1: Graded and thresholded forms of visual perception

Area 2: Graded and thresholded forms of long-term memory

Area 3: Attentional modulation of hippocampal representations

Area 4: Sleep and episodic memory

Area 5: Methods

In fMRI studies, it's common to look at both the overall level of activity and multivariate measures of activity patterns. Often there is meaningful information in both measures, and we want to know to what extent the information carried in patterns of activity is distinct from information in the mean level of activity.

In the first paper linked to in Area 3, Nick Turk-Browne and I develop a tool called multivariate-univariate dependence (MUD) analysis, which lets you examine the relationship between voxels' influence on univariate activity and their influence on pattern similarity. Check out the supplement for simulations.
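To give a flavor of the kind of question such an analysis asks, here is a toy sketch (not the published MUD method; all variable names and the simulated data are my own illustration). It exploits the fact that a Pearson correlation between two patterns can be written as a mean of per-voxel products of z-scores, so each voxel's contribution to pattern similarity is explicit and can then be related to that voxel's univariate activity:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: responses of 100 voxels under two conditions
# (stand-ins for the two activity patterns being compared).
n_voxels = 100
pattern_a = rng.normal(size=n_voxels)
pattern_b = 0.5 * pattern_a + rng.normal(size=n_voxels)

def zscore(x):
    """Z-score using the population standard deviation."""
    return (x - x.mean()) / x.std()

# Pattern similarity as a Pearson correlation, written so that
# each voxel's contribution is explicit: r is the mean, across
# voxels, of the products of the two z-scored responses.
za, zb = zscore(pattern_a), zscore(pattern_b)
per_voxel_contrib = za * zb              # one term per voxel
pattern_similarity = per_voxel_contrib.mean()

# Univariate activity: each voxel's mean response across the
# two conditions.
univariate = (pattern_a + pattern_b) / 2

# Relate the two measures: do voxels with higher mean activity
# also contribute more to the pattern similarity?
dependence = np.corrcoef(univariate, per_voxel_contrib)[0, 1]
print(pattern_similarity, dependence)
```

If the two measures were fully redundant, the per-voxel contributions would track univariate activity closely; independence between them is one way patterns can carry information beyond the mean signal. The real analysis (and its validation) is in the paper's supplement.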