Research & Data Science Leader | Expert in Product Design, User Advocacy & Methodological Innovation
Mixed Methods Research Leader
Strategic Leadership: Spearheaded the Quantitative Visibility Program at Meta, providing UX Research executives with critical data to advocate effectively for user needs.
Organizational Impact: Drove significant improvements in the quality and visibility of quantitative tracking metrics across large organizations.
Strategic Oversight: Defined and executed the strategy for the working group, assigning clear responsibilities and refining messaging to drive continuous progress.
Team Coaching: Mentored a cross-organizational team to develop and promote a UX Leadership scorecard, which leaders successfully championed to their direct reports.
Analytic Proficiency: Utilized advanced analytic techniques, including conjoint analysis, MaxDiff, and other inferential statistics, to derive actionable insights.
Leadership in Qualitative Research: Served as the leading voice for qualitative research at Nielsen, offering internal consulting across research teams to optimize methodologies.
Expert Moderator: Led qualitative research efforts, including focus groups, in-depth interviews, and user testing, both in-person and remotely.
Tool Proficiency: Experienced with qualitative research tools such as UserZoom and UserTesting, enhancing research flexibility and reach.
Industry and Academic Contributions: Presented at academic conferences and industry events, including delivering tutorials on survey methodologies, showcasing expertise and insights beyond the organization.
Presents real-world qualitative research examples focused on how we might recruit an audience measurement panel. Includes in-depth interviews, eye tracking, remote testing, cognitive interviews, and more!
Topics covered: when to use surveys and why; basics of survey design; writing effective survey questions; evaluating survey quality; the total survey error framework.
Did you know that measurement doesn’t always require active participation? Yes, traditional survey methods involve asking people questions and logging their responses. But that’s not the only way we measure, and asking questions doesn’t always uncover everything there is to learn.
We examine whether the type of neutral response option offered affects the distribution of responses obtained from questions. Specifically, we experimentally test whether and to what extent two of the most frequently used neutral response options—a “Neither agree nor disagree” neutral response option positioned in the middle of a response scale vs. a “No Opinion” option placed at the end of a response scale—yield consistent results and in what ways they differ.
During Q2–Q3 2020, Nielsen rapidly designed a mixed-mode recruitment methodology to supplement in-person methodology due to the COVID-19 pandemic. Participant feedback on the materials design included: “It’s beautiful. You can tell there’s a lot of thought and energy that went into it, so I would definitely open it.”
Recording of presentation.
For a long time, the media industry has relied on delivering impressions as a way to define value. But now that we see so many different impressions in rapid succession, we have a new problem. Are all impressions really equal? How do we know that those impressions resonate? How do we know they create the desired outcome? Given this phenomenon of competing impressions, is it time for attention to become a more formal part of our measurement framework?
ASI 2018 session description.
You can reach us by Smartphone – You can reach us by Tablet – Just reach us if you can: Exploring Factors Associated with Respondent Mode Choice for Surveys Using Mobile Devices. This presentation contains advanced statistical and survey work using conjoint analysis and MaxDiff survey approaches to understand willingness to participate in survey panel measurement.