Augmented Reality
Augmented reality (AR) has the potential to fundamentally change the way we interact with information. Direct perception of computer-generated information overlaid atop physical reality can afford hands-free access to contextual information on the fly. However, as users must interact with both virtual and real objects simultaneously, yesterday’s approaches to interface design are insufficient to support this new way of interacting. Furthermore, the impact of this novel technology on user experience and performance is not yet fully understood. To realize the full potential of this novel technology, we explore new methods for the requirements analysis, design, prototyping, and evaluation of augmented reality applications with an emphasis on human-environment interaction mediated by interfaces.
Methodology: Quantifying human depth perception in augmented reality
Volumetric optical see-through displays illuminate voxels, the 3D equivalents of pixels in traditional fixed-focal-plane displays, to create virtual objects within a 3D volume. In this study, we quantified participant performance in matching virtual objects to real-world counterparts while using both volumetric and fixed-focal-plane AR displays. Results showed that the volumetric displays were associated with faster and more accurate depth judgments, and that participants performed depth judgments more quickly as the experiment progressed.
Related projects/articles (click the links to see details)
Effects of Volumetric Augmented Reality Displays on Human Depth Judgments. International Journal of Mobile Human Computer Interaction (IJMHCI), 11(2), 1-18. Lisle, L., Merenda, C., Tanous, K., Kim, H., Gabbard, J. L., & Bowman, D. A. (2019).
Effect of Volumetric Displays on Depth Perception in Augmented Reality. In Proceedings of the 10th International Conference on Automotive User Interfaces and Interactive Vehicular Applications (pp. 155-163). ACM. Acceptance Rate=34%. Lisle, L., Tanous, K., Kim, H., Gabbard, J. L., & Bowman, D. A. (2018).
Methodology: Quantifying situation awareness in augmented reality
This project aimed to develop a new framework for quantifying both the positive and negative consequences of AR displays on driver cognitive processes by measuring eye glance behavior, situation awareness, confidence, and workload. To quantify how informative AR displays are, we examined the depth of driver knowledge about the environmental elements cued by the display (e.g., pedestrians). To quantify how distracting AR displays are, we investigated the breadth of driver knowledge about the environmental elements not cued by the display (e.g., other vehicles, landmarks, and traffic signs).
Related projects/articles (click the links to see details)
Assessing Distraction Potential of Augmented Reality Head-Up Displays for Vehicle Drivers. Human Factors, 64(5), 852-865. Kim, H., & Gabbard, J. L. (2022).
User study: Augmented Reality Order Picking Aid for Foreign Workers in Warehouses
This study aims to evaluate an AR HMD system against traditional methods, focusing on its potential to aid non-native English-speaking warehouse workers and boost efficiency and accuracy in picking tasks. Our goal is to ascertain whether an AR aid system, utilizing universal and conformal design principles, can yield superior results in user performance, usability, and situation awareness compared to written instructions.
Related projects/articles (click the links to see details)
Augmented Reality Order Picking Aid for Foreign Workers in Warehouses. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting (Online First). Albawaneh, A., Venkatesha Murthy, S., Singla, G., Wu, J., & Kim, H. (2023).
User study: Test-track usability evaluation of AR driver assistance systems
Usability evaluation in ecologically valid settings (e.g., realistic work environments and task scenarios) beyond laboratory experimentation is key to the success of real-world applications of augmented reality. This project examined driver behavior, performance, and experience with advanced driver assistance systems using an AR head-up display on a test track, where 24 participants were asked to manage the actual demands of driving in controlled but realistic driving scenarios.
Sponsors/Collaborators: Honda Research Institute USA Inc.
Related projects/articles (click the links to see details)
Augmented Reality Interface Design Approaches for Goal-directed and Stimulus-driven Driving Tasks. IEEE Transactions on Visualization and Computer Graphics, 24(11), 2875-2885. Presented at IEEE ISMAR 2018, Best Paper Award Honorable Mention. Merenda, C., Kim, H., Tanous, K., Gabbard, J. L., Feichtl, B., Misu, T., & Suga, C. (2018).
Design: Ecological interface design for AR applications
User interface design for AR applications poses an inherent challenge; users must interact with not only information on the display but also environmental changes in the real world. Considering the unique characteristics of human interactions in AR, this work incorporates the ecological interface design approach, which makes affordances of the environment (i.e., action possibilities and consequences) salient to the user so as to best support the user’s situation awareness, decision-making, and task performance. This goal was achieved by mapping the environmental constraints (e.g., type, relative position, and velocity of an obstacle) to the perceptual forms of interface elements (e.g., shape, size, color, and movement of graphical elements).
Related projects/articles (click the links to see details)
Virtual Shadow: Making Cross Traffic Dynamics Visible through Augmented Reality Head Up Display. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting (Vol. 60, No. 1, pp. 2093-2097). SAGE Publications. Dieter W. Jahns Student Practitioner Award. Kim, H., Isleib, J. D., & Gabbard, J. L. (2016).
Augmented Reality Pedestrian Collision Warning: An Ecological Approach to Driver Interface Design and Evaluation. Doctoral dissertation, Virginia Tech. Paul E. Torgersen Graduate Student Research Excellence Award. Kim, H. (2017).
Design: Information presentation methods in augmented reality
What could be the best way of presenting contextual information in augmented reality? In this study, we explored four different techniques (i.e., screen-fixed, screen-animated, world-fixed, and world-animated) to present visual elements of AR user interfaces and their effects on user performance and experience. Results demonstrate that animated design approaches can produce some user performance gains (e.g., in goal-directed navigation tasks) but often come at the cost of response time (e.g., in stimulus-driven collision avoidance tasks).
Sponsors/Collaborators: Honda Research Institute USA Inc.
Related projects/articles (click the links to see details)
Augmented Reality Interface Design Approaches for Goal-directed and Stimulus-driven Driving Tasks. IEEE Transactions on Visualization and Computer Graphics, 24(11), 2875-2885. Presented at IEEE ISMAR 2018, Best Paper Award Honorable Mention. Merenda, C., Kim, H., Tanous, K., Gabbard, J. L., Feichtl, B., Misu, T., & Suga, C. (2018).
Requirements Analysis: Contextual inquiry
Contextual inquiry is a technique to obtain information about the context of use, where users are observed and questioned while they work in their own environments. This study extended the traditional contextual inquiry technique to the user-centered design of augmented reality interfaces.
Related projects/articles (click the links to see details)
Virtual Road Signs: Augmented Reality Driving Aid for Novice Drivers. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting (Vol. 60, No. 1, pp. 1750-1754). SAGE Publications. Rane, P., Kim, H., Marcano, J. L., & Gabbard, J. L. (2016).
Methodology: Heuristic evaluation of AR applications
Rapid prototyping and evaluation are key to the success of the user-centered design of AR applications. This study proposes a heuristic evaluation method that walks subject matter experts through user interface design alternatives. The proposed set of heuristics helps the evaluator predict user performance in perception, situation awareness, and decision-making while interacting with AR applications. The proposed method also helps the evaluator identify the user interface elements that contribute to user performance by extending the retrospective think-aloud technique.
Related projects/articles (click the links to see details)
Virtual Shadow: Making Cross Traffic Dynamics Visible through Augmented Reality Head Up Display. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting (Vol. 60, No. 1, pp. 2093-2097). SAGE Publications. Dieter W. Jahns Student Practitioner Award. Kim, H., Isleib, J. D., & Gabbard, J. L. (2016).
Virtual Road Signs: Augmented Reality Driving Aid for Novice Drivers. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting (Vol. 60, No. 1, pp. 1750-1754). SAGE Publications. Rane, P., Kim, H., Marcano, J. L., & Gabbard, J. L. (2016).
Prototyping: Rapid prototyping of AR head-up displays in driving simulators
In this study, we proposed a new driver vehicle interface for crash warning systems using an AR head-up display, which overlays visual cues for impending hazards with variable transparency depending on distance from the hazard. We prototyped the design ideas and conducted a user study in a low-fidelity driving simulator. Our results indicate that drivers find HUDs more effective than conventional displays.
Related projects/articles (click the links to see details)
AR DriveSim: An Immersive Driving Simulator for Augmented Reality Head-up Display Research. Frontiers in Robotics and AI, 6, 98. Gabbard, J. L., Smith, M., Tanous, K., Kim, H., & Jonas, B. (2019).
Exploring Head-up Augmented Reality Interfaces for Crash Warning Systems. In Proceedings of the 5th International Conference on Automotive User Interfaces and Interactive Vehicular Applications (pp. 224-227). ACM. Kim, H., Wu, X., Gabbard, J. L., & Polys, N. F. (2013).
Prototyping: Simulated AR in immersive virtual environments
This project utilized an AR simulation approach by which the research team prototyped and tested variations of augmented reality user interfaces in a high-end, controlled virtual environment. The virtual environment was implemented using an open-source virtual reality (VR) driving simulator based on the Unity 3D game engine, which provides users with an immersive experience leveraging multimodal human-machine interfaces (HMIs), including augmented visual cues as well as directional auditory and haptic cues via the headset, steering wheel, and seat. This platform enables rapid prototyping of various HMI options and testing of their effects on driver behavior and performance.
Sponsors/Collaborators: US Department of Transportation through Safe-D University Transportation Center
Related projects/articles (click the links to see details)
Guiding Driver Responses During Manual Takeovers from Automated Vehicles (No. VTTI-00-026). Safety through Disruption (Safe-D) University Transportation Center (UTC). Greatbatch, R. L., Dunn, N. J., Kim, H., & Krasner, A. (2023).
User study: Multimodal interfaces for human-machine teaming in partially automated driving
Until fully automated vehicles become a reality, drivers must take over vehicle control from driving automation systems, even with little or no advance warning, when the automation encounters its operational boundaries. Empirical studies have shown that such control transitions are challenging for human drivers and that traditional, simple takeover requests may not be sufficient for the driver to deal with complex driving situations. This project aimed to explore multimodal human-machine interface (HMI) approaches to improve driver situation awareness and guide appropriate driver responses in challenging takeover situations.
Sponsors/Collaborators: US Department of Transportation through Safe-D University Transportation Center
Related projects/articles (click the links to see details)
Guiding Driver Responses During Manual Takeovers from Automated Vehicles (No. VTTI-00-026). Safety through Disruption (Safe-D) University Transportation Center (UTC). Greatbatch, R. L., Dunn, N. J., Kim, H., & Krasner, A. (2023).
Human-Machine Interfaces for Handover From Automated Driving Systems: A Literature Review. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting (Vol. 64, No. 1, pp. 1406-1410). Greatbatch, R., Kim, H., Doerzaph, Z, & Llaneras, R. (2020).
Prototyping: Rapid prototyping of AR using immersive videos
To support rapid prototyping and subsequent design iterations, the research team used a video editing tool to overlay computer-generated graphics atop pre-recorded driving video footage. Using actual driving footage aimed to improve our prototype’s ecological validity. Combined with a high-fidelity driving simulator, the augmented video footage provided participants with an immersive and realistic environment. Participants were asked to control the steering wheel, pedals, and turn signals of a real vehicle in response to events in the video, mimicking the actions required when driving a real vehicle on the road. We presented a small crosshair in the driving scene (controlled by the steering wheel) and asked participants to keep the crosshair in the center of the lane while “driving”.
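The crosshair “driving” task can be sketched as a simple mapping from steering wheel angle to horizontal crosshair offset, with lane-keeping error as a performance measure. This is a minimal, hypothetical sketch; the gain, units, and function names are illustrative assumptions, not values from the study.

```python
LANE_CENTER_PX = 0        # crosshair target: horizontal center of the lane
GAIN_PX_PER_DEG = 4.0     # hypothetical steering-to-crosshair gain

def crosshair_x(steering_angle_deg: float) -> float:
    """Map a steering wheel angle to the crosshair's horizontal offset (px)."""
    return GAIN_PX_PER_DEG * steering_angle_deg

def tracking_error(steering_angles) -> float:
    """Mean absolute deviation of the crosshair from lane center (px)."""
    offsets = [abs(crosshair_x(a) - LANE_CENTER_PX) for a in steering_angles]
    return sum(offsets) / len(offsets)

# A participant who holds the wheel nearly straight tracks the lane center well.
print(tracking_error([0.0, 1.0, -1.0, 0.5]))  # → 2.5
```

A lower mean deviation would indicate better lane-keeping while “driving” the augmented footage.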
Related projects/articles (click the links to see details)
Virtual Road Signs: Augmented Reality Driving Aid for Novice Drivers. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting (Vol. 60, No. 1, pp. 1750-1754). SAGE Publications. Rane, P., Kim, H., Marcano, J. L., & Gabbard, J. L. (2016).
Virtual Shadow: Making Cross Traffic Dynamics Visible through Augmented Reality Head Up Display. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting (Vol. 60, No. 1, pp. 2093-2097). SAGE Publications. Dieter W. Jahns Student Practitioner Award. Kim, H., Isleib, J. D., & Gabbard, J. L. (2016).
Assessing Distraction Potential of Augmented Reality Head-Up Displays for Vehicle Drivers. Human Factors, 64(5), 852-865. Kim, H., & Gabbard, J. L. (2022).
User study: Virtual road signs, AR driving aids for novice drivers
Studies have shown that expert drivers are more sensitive to changes in the road scene than novice drivers and use the driving patterns of other cars to infer important information. Augmented reality (AR) may help bridge the gap between experts and novices by graphically overlaying virtual information onto the real world that may not otherwise be easily inferred. In this project, we propose an AR interface that aims to improve the sensation, attention, situation awareness, and decision making of international drivers who are new to the United States (US). We present the results of a preliminary study that identifies the needs of novice international drivers as well as an AR interface design created to support these needs.
Related projects/articles (click the links to see details)
Virtual Road Signs: Augmented Reality Driving Aid for Novice Drivers. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting (Vol. 60, No. 1, pp. 1750-1754). SAGE Publications. Rane, P., Kim, H., Marcano, J. L., & Gabbard, J. L. (2016). Demo video
Prototyping: Optical see-through volumetric head-up displays
The research team implemented user interface design ideas on our in-vehicle prototype of an optical see-through HUD. It is a projection-based volumetric display that uses a swept-volume technique with fast-switching image planes over a focal range from 8 m to infinity (0.125D to 0D), affording flicker-free appearances of virtual objects in 3D space within an approximately 17° circular field of view. We authored testbed software in C++ and Qt 5 to render the AR graphics; the software allowed changes to testbed parameters and AR interfaces in real time. A GPS receiver (u-blox EVK-M8N) activated AR interfaces based on the test vehicle’s geolocation and tracked and recorded vehicle position.
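The geolocation trigger boils down to a geofence check: compute the distance between the vehicle’s GPS fix and a predefined trigger point, and activate the AR interface when the vehicle is within a radius. The actual testbed was written in C++/Qt; the following is a minimal Python sketch of the idea only, and the trigger radius, coordinates, and function names are illustrative assumptions.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two WGS-84 coordinates."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

TRIGGER_RADIUS_M = 30.0  # hypothetical activation radius

def should_activate(vehicle_fix, trigger_point):
    """Return True when the vehicle is within the trigger radius of the point."""
    return haversine_m(*vehicle_fix, *trigger_point) <= TRIGGER_RADIUS_M

# e.g., activate an AR cue when the vehicle nears a hypothetical trigger point
print(should_activate((37.2296, -80.4139), (37.2297, -80.4139)))  # → True
```

In the real testbed, such a check would run on each GPS fix, turning AR interface elements on and off as the vehicle passes predefined locations on the track.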
Sponsors/Collaborators: Honda Research Institute USA Inc.
Related projects/articles (click the links to see details)
Augmented Reality Interface Design Approaches for Goal-directed and Stimulus-driven Driving Tasks. IEEE Transactions on Visualization and Computer Graphics, 24(11), 2875-2885. Presented at IEEE ISMAR 2018, Best Paper Award Honorable Mention. Merenda, C., Kim, H., Tanous, K., Gabbard, J. L., Feichtl, B., Misu, T., & Suga, C. (2018).
Driver Behavior and Performance with Augmented Reality Pedestrian Collision Warning: An Outdoor User Study. IEEE Transactions on Visualization and Computer Graphics, 24(4), 1515-1524., Presented at the IEEE VR 2018. Kim, H., Gabbard, J. L., Anon, A. M., & Misu, T. (2018).
User study: Outdoor usability evaluation of AR pedestrian warning
This study investigated the effects of visual warning presentation methods on human performance in augmented reality (AR) driving. An experimental user study was conducted in a parking lot where participants drove a test vehicle while braking for any cross traffic with assistance from AR visual warnings presented on monoscopic and volumetric head-up displays (HUDs). Results showed that monoscopic displays can be as effective as volumetric displays for human performance in AR braking tasks. The experiment also demonstrated the benefits of conformal graphics, which are tightly integrated into the real world, such as their ability to guide drivers’ attention and their positive consequences on driver behavior and performance. These findings suggest that conformal graphics presented via monoscopic HUDs can enhance driver performance by leveraging the effectiveness of monocular depth cues. The proposed approaches and methods can be used and further developed by future researchers and practitioners to better understand driver performance in AR as well as to inform usability evaluation of future automotive AR applications.
Sponsors/Collaborators: Honda Research Institute USA Inc.
Related projects/articles (click the links to see details)
Driver Behavior and Performance with Augmented Reality Pedestrian Collision Warning: An Outdoor User Study. IEEE Transactions on Visualization and Computer Graphics, 24(4), 1515-1524., Presented at the IEEE VR 2018. Kim, H., Gabbard, J. L., Anon, A. M., & Misu, T. (2018).
Look at Me: Augmented Reality Pedestrian Warning System Using an In-Vehicle Volumetric Head Up Display. In Proceedings of the 21st International Conference on Intelligent User Interfaces (pp. 294-298). ACM. Acceptance Rate=25%. Kim, H., Miranda Anon, A., Misu, T., Li, N., Tawari, A., & Fujimura, K. (2016).
Human-Machine Teaming in Intelligent Transportation Systems
As technology advances, driving becomes teamwork between the human and the machine. However, people may not use driving automation as intended. Some people may not be comfortable working with automated driving systems, while others may overtrust and misuse them. Instead of replacing people with automated technology, we pursue human-machine teaming in future intelligent transportation systems.
Methodology: Estimating driver situation awareness based on eye glance behavior and road hazard characteristics - A machine learning approach
Objective: To develop data-driven predictive models of driver situation awareness based on observable driver behavior and situational contexts.
Background: Until full driving automation becomes a reality, the human driver and the vehicle must work together and be attentive to each other’s current states and future intentions. An automated vehicle may need to understand not only environmental changes but also the driver’s awareness of road hazards to provide appropriate assistance or intervention. However, it is challenging to objectively assess whether the driver is aware of the current situation.
Method: The research team simplified the problem as a binary classification of whether the driver is aware of a specific road hazard, given the driver’s eye-glance behavior as well as the characteristics of the driver and the road hazard. We took a machine-learning approach, training random forest classifiers with data collected from 50 participants in a driving simulator.
Results: Driver gaze behavior was the most promising, but insufficient, predictor of drivers’ awareness of road hazards. Including features of road hazards, such as salience against the background, improved the prediction performance of the models. Individual differences among drivers were weak predictors of driver awareness. The random forest classifier showed relatively good prediction performance.
Conclusion: This study shows the potential of machine-learning approaches to driver awareness estimation based on observable driver behavior and characteristics of road hazards, combining computer-vision (object recognition) and eye-tracking technologies.
Application: The results could be extended to human-machine interfaces that adapt to the human driver’s situation awareness.
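To illustrate the kind of eye-glance feature involved, the following minimal sketch reduces hypothetical gaze samples to a dwell-time feature on a hazard’s area of interest (AOI) and applies a naive threshold baseline. The data format, sampling rate, threshold, and function names are illustrative assumptions; the study itself trained random forest classifiers on such features together with hazard and driver characteristics.

```python
# Hypothetical gaze samples: (timestamp_s, x, y) in screen coordinates.
# An AOI (area of interest) box: (x_min, y_min, x_max, y_max).

def dwell_time_s(samples, aoi, sample_dt=0.1):
    """Total time gaze fell inside the hazard's AOI, assuming a fixed sampling rate."""
    x0, y0, x1, y1 = aoi
    hits = sum(1 for _, x, y in samples if x0 <= x <= x1 and y0 <= y <= y1)
    return hits * sample_dt

def naive_awareness(samples, aoi, threshold_s=0.3):
    """Baseline binary label: 'aware' if dwell time exceeds a threshold.
    The study instead trained random forest classifiers on such features
    plus hazard characteristics (e.g., salience) and driver attributes."""
    return dwell_time_s(samples, aoi) >= threshold_s

# Four of five hypothetical samples land inside the hazard's AOI (0.4 s dwell).
samples = [(0.0, 5, 5), (0.1, 6, 5), (0.2, 50, 50), (0.3, 7, 6), (0.4, 8, 6)]
print(naive_awareness(samples, (0, 0, 10, 10)))  # → True
```

A learned classifier would replace the fixed threshold with decision rules fitted to ground-truth awareness labels, which is where features beyond gaze (hazard salience, driver attributes) can improve prediction.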
Sponsors/Collaborators: Honda Research Institute USA Inc.
Related projects/articles (click the links to see details)
Toward Real-Time Estimation of Driver Situation Awareness: An Eye-tracking Approach based on Moving Objects of Interest. In 2020 IEEE Intelligent Vehicles Symposium (IV) (pp. 1035-1041). IEEE. Kim, H., Martin, S., Tawari, A., Misu, T., & Gabbard, J. L. (2020). Presentation video
Estimating Driver Awareness of Road Hazards based on Driver Gaze Behavior and Hazard Characteristics: A Machine-Learning Approach. Presented at the IISE Annual Conference. Kim, H. (2023).
Methodology: Rapid prototyping and evaluation of crew stations to support human-machine teaming
This project explores new approaches to rapidly prototype and evaluate crew stations in virtual environments with varying levels of fidelity and flexibility: 1) low-fidelity prototypes of just the digital human interfaces, 2) medium-fidelity prototypes in VR used along with physical mockups, and 3) high-fidelity prototypes using AR to overlay digital interfaces on a physical medium.
Sponsors/Collaborators: US Department of Defense through Automotive Research Center
Related projects/articles: WIP
User study: External human-machine interfaces to support the robotaxi pickup process
The increasing popularity of automated vehicles and shared mobility services has led to the development of robotaxis as a new mode of transportation. To ensure a positive and safe user experience, effective human-machine interfaces (HMIs) are needed. This project focuses on the design of external HMIs (eHMIs) for robotaxis, using multiple modalities to enhance the user experience and improve communication between passengers and the automated system. By reviewing the current state of the art and key design considerations for eHMIs, it aims to contribute to the development of effective and user-friendly eHMIs for robotaxis with respect to the rider-vehicle matching process, passenger safety, and security. The study presents a virtual simulation-based prototype utilizing the Unreal Engine and immersive virtual reality technology. The evaluation will focus on safety, usability, and trust, with the overall effectiveness of the prototype measured against established metrics.
Related projects/articles (click the links to see details)
WIP
User study: A test-track evaluation of the Android Auto app
Google’s “Android Auto” mobile application allows drivers to interact with Android phone applications by synchronizing with the vehicle’s primary displays, using a simplified interface that affords simpler, less distracting interactions. Currently, Android Auto uses a lock-out feature to restrict access following prolonged interactions. Evidence suggests, however, that users may be inclined to bypass this lock-out and complete interactions on their phone, thereby eliminating Android Auto’s benefits and potentially leading to dissatisfaction with the application. The “Speed Bump” feature is intended to avoid this problem, eliminating lock-outs by pacing driver interactions, in essence “training” drivers to adopt safer glance patterns and reducing eyes-off-road time while interacting with Android Auto.
Sponsors/Collaborators: Google, Virginia Tech Transportation Institute
Related projects/articles
User study: A survey on road user acceptance and expectation of automated shuttle operation
Various driverless shuttles have been tested in pilot studies worldwide, as they have the potential to fill gaps in public transportation services. However, as most studies to date have surveyed the general public (without direct exposure) or users within the shuttle ride, we presently lack a complete understanding of how users who share the road with such vehicles perceive the new technology. To inform future deployments of such vehicles on public roadways, we operated a low-speed automated shuttle and surveyed both shuttle riders and non-riders before and after three months of exposure to the shuttle operation. The results suggest that even though experience with, and exposure to, the technology garnered trust and acceptance among road users, non-riders hold quite different attitudes toward shuttle operations from shuttle riders. In addition, many people strongly support rules and restrictions governing shuttle operations on public roadways. Future researchers and policymakers could leverage the survey findings for more successful deployments of automated shuttles on public roadways.
Sponsors/Collaborators: Virginia Tech Transportation Institute
Related projects/articles (click the links to see details)
Road User Attitudes Toward Automated Shuttle Operation: Pre and Post-Deployment Surveys. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting (Vol. 66, No. 1, pp. 315-319). Kim, H., & Doerzaph, Z. (2022).
User study: Real-world use of SAE L2 automated driving systems, A naturalistic driving study
Advanced Driver Assistance Systems (ADAS) with SAE Level 2 automated capabilities have entered the vehicle marketplace. These automated driving systems (ADSs) have the potential to fundamentally change the driving experience through automatic lateral and longitudinal vehicle control. However, people may not use ADSs as intended due to misunderstanding the systems’ capabilities and limitations. Moreover, the real-world use and effects of this novel technology on transportation safety are largely unknown. To investigate driver interactions with driving automation, we examined existing naturalistic driving data collected from 50 participants who drove personally owned vehicles with Level 2 ADSs for 12 months. We found that 47 out of 235 safety-critical events (SCEs) involved ADS use. An in-depth analysis of these 47 SCEs revealed that people misused ADSs in 57% of them (e.g., engaging in secondary tasks, using the systems off designated highways, or driving with hands off the wheel). During 13% of the SCEs, the systems neither reacted to the situation nor warned the driver. A post-study survey showed that people found ADSs useful and usable. However, the greater their positive attitude toward ADS features, the more comfortable participants felt engaging in secondary tasks. This is a potential unintended side effect of Level 2 ADSs given that they still rely on the human driver’s supervision. This study also captured scenarios where the ADSs did not meet driver expectations in typical driving situations, such as approaching stopped vehicles and negotiating curves. The findings may inform the development of human-machine interfaces and training programs to reduce the unintended use of ADSs and their safety consequences.
Sponsors/Collaborators: US Department of Transportation through Safe-D University Transportation Center
Related projects/articles (click the links to see details)
Is Driving Automation Used as Intended? Real-World Use of Partially Automated Driving Systems and their Safety Consequences. Transportation Research Record, 2676(1), 30-37. Kim, H., Song, M., & Doerzaph, Z. (2022).
Real-World Use of Partially Automated Driving Systems and Driver Impressions. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting (Vol. 64, No. 1, pp. 1092-1093). Kim, H., Song, M., & Doerzaph, Z. (2020). Presentation video