Emily Morgan

About me

I am an Assistant Professor in the Department of Linguistics at UC Davis.

Pronouns: she/her/hers

Email: eimorgan@ucdavis.edu

Visit the Morgan Lab website!

Download CV

My research

To know a language is to use one’s past linguistic experience to form expectations about future linguistic experience. This process is mediated both by speakers’ stored representations of their previous experience and by the online procedures used to process new stimuli in light of those representations. My research thus asks what form these representations take, and how the language processing system integrates them with incoming stimuli to form online expectations during language comprehension. For example, when one encounters a highly frequent phrase such as “bread and butter”, is this phrase represented and processed holistically as a single unit, or compositionally as a conjunction of nouns? Is the form of this representation influenced by the frequency of the expression (compared to a less frequent expression like “facts and techniques”) or by its frozenness in a given order (compared to a more flexible expression like “boys and girls”/“girls and boys”)? To answer these questions, I combine experimental psycho- and neurolinguistic methods, such as eye-tracking and ERPs, with probabilistic computational modeling.

I also ask comparable questions in other domains, specifically programming languages and music. Programming shares a lot of terminology with natural language (e.g. programming languages, coding literacy), but little is yet known about how much the cognitive processes involved in learning and using programming languages overlap with those used in natural language. For example, do we see the same sorts of predictability effects in reading and writing code as we do in natural language? Likewise in music, to what extent does the processing of melodies rely upon language-like hierarchical structure versus surface statistics (e.g. note-to-note transition probabilities)? These questions are of interest in their own right, and additionally provide interesting comparison cases with natural language.
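As an illustrative sketch (not taken from any actual lab materials), the surface statistics mentioned above can be made concrete: note-to-note transition probabilities are simply bigram frequencies estimated from a melody, here computed by maximum likelihood over a toy note sequence:

```python
from collections import Counter, defaultdict

def transition_probabilities(notes):
    """Estimate note-to-note (bigram) transition probabilities
    from a sequence of notes via maximum-likelihood counts."""
    counts = defaultdict(Counter)
    for prev, nxt in zip(notes, notes[1:]):
        counts[prev][nxt] += 1
    return {
        prev: {nxt: c / sum(nxts.values()) for nxt, c in nxts.items()}
        for prev, nxts in counts.items()
    }

# A toy melody: after "D", the tune continues with "E" twice and "G" once,
# so P(E | D) = 2/3 and P(G | D) = 1/3 under this simple model.
melody = ["C", "D", "E", "C", "D", "G", "C", "D", "E"]
probs = transition_probabilities(melody)
```

A model like this captures only adjacent-note statistics; the contrast in the paragraph above is with accounts on which listeners additionally track hierarchical, language-like structure.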

News

I will be teaching Computational Models of Generalization and Item Specificity with Dr. Masoud Jasbi at the 2023 LSA Summer Institute at UMass Amherst!

Visit the Morgan Lab website for up-to-date news about our publications, conference presentations, grants and awards, etc.!