Multi-modal NUI (Natural User Interface) with a Large-scale AR HUD in Vehicles (HCIK 2018, KETI)


Goal

Development of a multi-modal NUI (Natural User Interface) based on voice commands and touch gestures to improve the in-vehicle user experience




Procedure

  • Preliminary online survey (158 respondents: 109 male, 49 female)

- Air conditioner control and music playback control were chosen as the target in-vehicle infotainment tasks

  • Prototyped a steering wheel with touch gesture pads placed at the center of the wheel and at the thumb positions of both hands

- Gestures: center double tap (start voice recognition), center long tap (selection: play, on, off), thumb-pad swipe on either hand (choice: previous, next, temperature level); a minimal dispatch sketch follows this list

  • Developed a large-area windshield head-up display to provide natural feedback for the multi-modal NUI
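
A minimal sketch of how the gesture set above could be mapped to infotainment commands; the gesture names, command strings, and event model are illustrative assumptions, since the paper does not describe an implementation:

    from enum import Enum, auto

    class Gesture(Enum):
        CENTER_DOUBLE_TAP = auto()   # double tap on the center pad
        CENTER_LONG_TAP = auto()     # single long tap on the center pad
        THUMB_SWIPE_PREV = auto()    # swipe on either thumb pad: previous
        THUMB_SWIPE_NEXT = auto()    # swipe on either thumb pad: next

    # Hypothetical gesture-to-command table mirroring the gesture set above.
    GESTURE_COMMANDS = {
        Gesture.CENTER_DOUBLE_TAP: "start_voice_recognition",
        Gesture.CENTER_LONG_TAP: "select",        # play / on / off
        Gesture.THUMB_SWIPE_PREV: "previous",     # or temperature down
        Gesture.THUMB_SWIPE_NEXT: "next",         # or temperature up
    }

    def dispatch(gesture: Gesture) -> str:
        """Translate a recognized touch gesture into an infotainment command."""
        return GESTURE_COMMANDS[gesture]

    print(dispatch(Gesture.CENTER_DOUBLE_TAP))  # -> start_voice_recognition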


Preliminary survey results for in-vehicle infotainment task selection

Steering wheel-based touch gesture interface

Large-area windshield head-up display


Result

  • Built a realistic driving environment using a motion platform and virtual driving simulation, and evaluated the usability of the developed interfaces

  • A paired t-test on workload measured with NASA-TLX showed that participants' subjective workload was significantly lower for the prototype than for the central console (a worked example of this test follows the list)

-> We verified the usability of vehicle interfaces that support voice commands and touch gesture-based multi-modal interaction.


  • Analysis of the usability data showed that overall usability was high
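
For reference, a paired (dependent-samples) t-test like the one reported above can be computed with scipy.stats.ttest_rel; the NASA-TLX scores below are illustrative placeholders, not the study's data:

    from scipy import stats

    # Illustrative NASA-TLX overall workload scores (0-100, lower is better)
    # for the same participants under both conditions; placeholder values,
    # not the study's actual measurements.
    console_scores   = [62, 58, 71, 66, 55, 69, 63, 60]  # central console
    prototype_scores = [48, 44, 59, 52, 41, 57, 50, 46]  # steering-wheel NUI

    # Paired t-test is appropriate because each participant used both interfaces.
    t_stat, p_value = stats.ttest_rel(console_scores, prototype_scores)
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")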