Drivers often zone out when using partial automation (think Tesla Autopilot or GM Super Cruise). This "mind-off" state leads to delayed reactions or missed takeovers in critical moments. While some cars alert drivers when they need to act, these systems don’t always work well — especially if the driver is distracted.
I set out to explore the effectiveness of multimodal prompts—short, on-screen visual messages paired with a brief audio cue—in supporting driver engagement during partially automated driving. Specifically, I asked:
Can these prompts help drivers stay cognitively engaged when they’re not actively driving?
Do they still work when drivers are occupied with other tasks, like using their phone or solving a mental puzzle?
Which type of prompt is more effective—casual, everyday small talk or direct, driving-related questions?
To test these questions, I designed two full-length experiments in a driving simulator, each reflecting different real-world situations.
✅ Method at a Glance
Tool: STISIM driving simulator with steering, pedals, and eye tracking.
Participants: 48 licensed drivers (24 per experiment).
Design: Each participant completed 6 drives under different conditions.
Prompt types:
No prompts
Driving-related questions (e.g., “Is there a curve ahead?”)
Everyday prompts (e.g., “Have you had breakfast today?”)
Metrics:
Response time to take over driving
Crash rate
Steering behavior (hands-on-wheel)
Distraction level (non-driving task involvement)
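The metrics above can be summarized per condition from takeover-event logs. A minimal sketch, assuming hypothetical records with illustrative field names (`rt_s`, `crashed`, `hands_on`); this is not STISIM's actual log format:

```python
from statistics import mean

# Hypothetical takeover-event records; field names are illustrative,
# not the simulator's actual output format.
events = [
    {"rt_s": 2.1, "crashed": False, "hands_on": True},
    {"rt_s": 3.4, "crashed": True,  "hands_on": False},
    {"rt_s": 1.8, "crashed": False, "hands_on": True},
]

# Mean takeover response time in seconds.
mean_rt = mean(e["rt_s"] for e in events)
# Proportion of takeover events ending in a crash.
crash_rate = sum(e["crashed"] for e in events) / len(events)
# Proportion of events where hands were on the wheel.
hands_on_rate = sum(e["hands_on"] for e in events) / len(events)

print(round(mean_rt, 2), round(crash_rate, 2), round(hands_on_rate, 2))
```

In practice these summaries would be computed per participant and per prompt condition before statistical comparison.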
The experimental setup included a driving simulator with 3 monitors, 4 video cameras, and an eye tracker.
I ran two versions of the experiment:
Experiment 1 – Light Distraction
Participants could choose to engage in side activities (like checking their phone or chatting), just as they might in a real car. Prompts appeared about once a minute.
Experiment 2 – Heavy Distraction
Participants were given tricky anagram puzzles with a cash reward. They had to balance solving puzzles with monitoring the road—mimicking situations where drivers are really absorbed in something else.
✅ When prompts work best:
Driving-related prompts helped drivers react faster and reduced crash rates—but only when distractions were mild (Experiment 1).
These prompts nudged drivers to keep an eye on the road and reduced the time they spent doing other tasks.
🚫 When prompts don’t cut it:
In the high-distraction scenario (Experiment 2), even driving-related prompts weren’t strong enough to re-engage drivers.
When the side task was engaging and rewarding, prompts lost their power.
📉 The "aha" insight:
Engagement tools need to match the cognitive load of the situation. Light reminders help when people are bored—but fail when people are fully absorbed elsewhere.
This research offers actionable guidance for:
Automotive UX teams building better in-vehicle prompts
Safety engineers balancing driver attention against notification timing in automation design
Voice UI designers for in-car assistants, by highlighting the value of relevant, driving-focused content over generic small talk
It also raised critical questions for future design:
Could prompts be gamified to be as engaging as the tasks they compete with?
How do we personalize prompts based on a driver’s mental state or behavior?
Participants shared that driving-related prompts made them feel more focused and safe:
"The driving prompts made me feel more alert and in control."
Interestingly, some participants preferred even the casual small-talk prompts because they helped keep them awake or entertained, suggesting that future prompt systems might benefit from smart personalization.
Context matters. Prompts help, but only when drivers aren’t too distracted.
Driving-related content is key. It pulls attention back to the road more effectively than casual prompts.
Design for cognitive load. Prompt systems should adapt to what drivers are doing—whether they’re cruising or deep into a task.
One-size-fits-all doesn’t work. Engagement tools need to be flexible, timely, and situationally aware.
This project challenged assumptions about how to design for driver engagement and has informed new thinking around adaptive driver alerts and voice interactions in vehicles.
🗞 As Seen In
This study was featured in NC State News:
🔗 Prompts Could Help Make Partially Automated Cars Safer (NC State University News)
If you’re interested in the research behind this case study, here are my related publications:
Bai, X., & Feng, J. (2025). Awaking the Disengaged: Can Driving-Related Prompts Engage Drivers in Partial Automation? Human Factors. [Read the publication]
Bai, X., & Feng, J. (2024). Capturing the Mind: Non-Driving-Related Tasks as a Window into Cognitive Engagement in Automated Driving. Proceedings of the Human Factors and Ergonomics Society Annual Meeting. [Read the publication]
Bai, X., & Feng, J. (2023). Unlocking Safer Driving: How Answering Questions Help Takeovers in Partially Automated Driving. Proceedings of the Human Factors and Ergonomics Society Annual Meeting. [Read the publication]