AZCALL 2020 

Virtual Conference

Item Preview in Listening EAP   

Zachary Meyer; Northern Arizona University 

Rebecca Yeager; University of Iowa

Biographies

Zach is currently a lecturer in the Program in Intensive English at Northern Arizona University. He has also worked in ESL/EAP at the University of Iowa and Indiana University. He has a BA in Spanish and an MA in Second Language Studies from Indiana University. 

Rebecca Yeager is an Associate Professor of Instruction at the University of Iowa.  




Abstract

This study investigates the effects of adding item preview to an English for Academic Purposes (EAP) multiple-choice listening assessment. With item preview, test takers can see the questions (but not the response options) while listening to a recorded lecture. Item preview has purported benefits, including reducing student anxiety (Sherman, 1997; Chang, 2005; Iimura, 2010), reducing the effect of working memory on a student's score (Kim, 2015), and providing context when testing unfamiliar content (Buck, 2001). However, it also has potential drawbacks, including changing the definition and measurement of academic listening ability (Koyama, Sun, & Ockey, 2016; Yanagawa & Green, 2008) and changing students' perception of and preparation for academic listening tasks. We administered two recorded lectures, each with nine multiple-choice questions, to eight intact classes of matriculated students enrolled in ESL Listening courses at a large Midwestern university. Group 1 (four classes) experienced Lecture A without item preview and Lecture B with it, while Group 2 (the other four classes) experienced the reverse. Each of the nine items in a lecture targeted one of three types of comprehension: global (main ideas and inferences), local (important details), or trivial (unimportant details). We analyzed the effects of item preview and item type on item facility through a stepwise model-building process. The best-fitting model included item-preview condition and item type as significant main effects and class as a random effect. Contrasts revealed that item preview increased the odds of a student answering an item correctly by 44% (95% CI [6%, 96%]). Contrasts also revealed that local items had 142% (95% CI [66%, 254%]) higher odds of a correct response than global items and 109% (95% CI [21%, 262%]) higher odds than trivial items; there was no detectable difference between global and trivial items. We also found that item preview increased the average total score on a test by 11% (95% CI [less than 1%, 23%]) relative to the no-preview condition.
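
Analysis sketch

The analysis described above is a mixed-effects logistic regression with item-preview condition and item type as fixed effects and class as a random intercept. The abstract does not name the software or model specification used, so the following Python sketch (statsmodels' BinomialBayesMixedGLM fitted to simulated toy data) is illustrative only; the column names, simulated effect sizes, and class and student counts are assumptions chosen to mirror the reported design, not the authors' actual data or code.

import numpy as np
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

rng = np.random.default_rng(0)

# Toy data in long format: one row per student-item response.
# 8 classes x 15 students x 2 lectures x 9 items, counterbalanced preview order.
rows = []
for class_id in range(8):
    class_effect = rng.normal(0, 0.3)    # random intercept for the class
    preview_first = class_id < 4         # Group 1 vs. Group 2
    for student in range(15):
        for lecture in range(2):
            preview = int(lecture == (0 if preview_first else 1))
            for item in range(9):
                item_type = ("global", "local", "trivial")[item % 3]
                log_odds = (0.2 + class_effect
                            + 0.36 * preview                            # exp(0.36) ~ 1.44
                            + (0.88 if item_type == "local" else 0.0))  # exp(0.88) ~ 2.4
                p = 1 / (1 + np.exp(-log_odds))
                rows.append(dict(correct=rng.binomial(1, p), preview=preview,
                                 item_type=item_type, class_id=class_id))
df = pd.DataFrame(rows)

# Mixed-effects logistic regression: fixed effects for preview condition and
# item type, variance component (random intercept) for class.
model = BinomialBayesMixedGLM.from_formula(
    "correct ~ preview + C(item_type)",
    {"class": "0 + C(class_id)"},
    df,
)
result = model.fit_vb()
print(result.summary())

# Fixed-effect estimates are on the log-odds scale; exponentiating gives odds
# ratios, e.g. a preview coefficient near 0.36 corresponds to roughly 44%
# higher odds of a correct response.
print(np.exp(result.fe_mean))

Note that odds ratios and score gains are on different scales: under the hypothetical assumption that a student has a 60% chance of answering an item correctly without preview (odds of 1.5), 44% higher odds (2.16) correspond to about a 68% chance of a correct answer, which is broadly consistent with the more modest 11% increase reported for average total scores.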
