GUI prototyping for mobile devices using Augmented Reality and Haptics


Industrial Virtual Reality Institute & Motorola Ltd.

This project forms a platform for research at the upcoming UIC Innovation Center.

The project involves virtual prototyping of the Motorola Ming smartphone. Pictures of the phone's GUI could not be included since the product has not been released in the US. Instead, the tool that enables interaction with it is described here.

Above is a picture of the Sensable haptic robot used for our interactions.

As products increasingly become display-based, physical mockups made of foam or clay no longer serve the purpose in design evaluations. In particular, it is impossible to evaluate touch-screen devices this way: the mockups are nothing more than the casing and give no idea of the product's functionality. My thesis is geared towards enabling graphical user interfaces on virtual prototypes, whereby designers can try out their concepts on a fully functional digital model. Since the project involves haptic interactions, the user receives both visual and tactile feedback and can experience the product months or years before it is manufactured.

 

A demo of the application embedded on a virtual model of the iPhone.


Iphone virtual prototype from arun rakesh yoganandan on Vimeo

Ability to interactively change the color of the device with keyboard presses (a short code sketch follows the video below):

"R"  for Red.

"B" for Blue.

"W"  for White.


Color change from arun rakesh yoganandan on Vimeo
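To give an idea of how this keyboard interaction can be wired up in Open Inventor (Coin3D), here is a minimal sketch. The `casingMaterial` node, the `installColorKeys` helper, and the scene root are placeholders I am assuming for illustration, not the project's actual code.

```cpp
#include <Inventor/nodes/SoSeparator.h>
#include <Inventor/nodes/SoMaterial.h>
#include <Inventor/nodes/SoEventCallback.h>
#include <Inventor/events/SoKeyboardEvent.h>

// Material node applied to the phone casing (assumed to exist in the scene).
static SoMaterial *casingMaterial = NULL;

static void keyPressCallback(void *, SoEventCallback *node)
{
    const SoEvent *event = node->getEvent();

    // Map the keys listed above to diffuse colors on the casing.
    if (SO_KEY_PRESS_EVENT(event, R))
        casingMaterial->diffuseColor.setValue(1.0f, 0.0f, 0.0f);   // "R" -> red
    else if (SO_KEY_PRESS_EVENT(event, B))
        casingMaterial->diffuseColor.setValue(0.0f, 0.0f, 1.0f);   // "B" -> blue
    else if (SO_KEY_PRESS_EVENT(event, W))
        casingMaterial->diffuseColor.setValue(1.0f, 1.0f, 1.0f);   // "W" -> white
    else
        return;

    node->setHandled();
}

// Hook the callback into the scene graph during setup.
void installColorKeys(SoSeparator *root)
{
    SoEventCallback *keyCb = new SoEventCallback;
    keyCb->addEventCallback(SoKeyboardEvent::getClassTypeId(), keyPressCallback);
    root->addChild(keyCb);
}
```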

 

 

 

A haptic prototype of the Motorola Q smartphone. The stylus in the picture is manipulated by the user to touch or feel the model.

A similar prototype for the Motorola Ming.



The first phase of my ongoing thesis is a virtual web browser that serves as the GUI for the virtual prototype. A screenshot of it is below. The stylus/pointer takes over the function of the mouse, and the user gets the sensation of touching a plate in space while clicking.



This application essentially embeds the Mozilla rendering engine into virtual space, providing a great platform for a GUI.
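Roughly, the idea is that llmozlib renders each page into a pixel buffer, the pixels are copied into an SoTexture2 on the phone's screen quad, and stylus taps are sent back to the browser as mouse clicks. The sketch below is only how I would summarize it; the helper names (initBrowser, updateScreenTexture, stylusClick), the paths, and the exact llmozlib method signatures are assumptions and may differ from the real code and library version.

```cpp
#include <Inventor/nodes/SoTexture2.h>
#include "llmozlib.h"   // header name varies by llmozlib release

static int browserId;
static SoTexture2 *screenTexture;    // texture node on the phone's display quad

void initBrowser(int width, int height, void *nativeWindowHandle)
{
    LLMozLib::getInstance()->init(".", "components", "profile", nativeWindowHandle);
    browserId = LLMozLib::getInstance()->createBrowserWindow(width, height);
    LLMozLib::getInstance()->navigateTo(browserId, "file:///gui/home.html");
}

// Copy the current page pixels into the Coin3D texture (call once per frame).
void updateScreenTexture()
{
    LLMozLib *moz = LLMozLib::getInstance();
    moz->grabBrowserWindow(browserId);
    const unsigned char *pixels = moz->getBrowserWindowPixels(browserId);
    const int w = moz->getBrowserWidth(browserId);
    const int h = moz->getBrowserHeight(browserId);
    // 4 components per pixel (byte order depends on platform and build).
    screenTexture->image.setValue(SbVec2s(w, h), 4, pixels);
}

// Forward a stylus tap at normalized texture coordinates (u, v) as a click.
void stylusClick(float u, float v)
{
    LLMozLib *moz = LLMozLib::getInstance();
    const int x = int(u * moz->getBrowserWidth(browserId));
    const int y = int(v * moz->getBrowserHeight(browserId));
    moz->mouseDown(browserId, x, y);
    moz->mouseUp(browserId, x, y);
}
```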

I made use of the llmozlib libraries by Callum (from Linden Lab of Second Life fame) and adapted them to Open Inventor (Coin3D). The program was written in C++, with Visual Studio 2005 as the development environment. Haptic interactions were programmed with the OpenHaptics library and the controls with FLTK. The project also makes use of the Sensimmer libraries for immersive touch developed by Cristian. The hardware used was Sensable's Phantom Desktop haptic device. The above browser is then pasted onto the mobile device model, and a set of HTML pages of the GUI enables complete interaction with the digital model.
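For the "touching a plate in space" sensation, the OpenHaptics HL API lets you capture ordinary OpenGL geometry as a touchable shape. A minimal sketch of that idea is below; in the project itself this goes through the Sensimmer layer, and the stiffness values, quad coordinates, and function names other than the OpenHaptics calls are illustrative assumptions.

```cpp
#include <HD/hd.h>
#include <HL/hl.h>
#include <GL/gl.h>

static HLuint browserShapeId;        // haptic shape id for the browser "plate"

void initHaptics()
{
    HHD hd = hdInitDevice(HD_DEFAULT_DEVICE);   // the Phantom Desktop
    HHLRC hl = hlCreateContext(hd);
    hlMakeCurrent(hl);
    browserShapeId = hlGenShapes(1);
}

// Called every graphics frame after the visual scene has been drawn.
void renderHapticFrame()
{
    hlBeginFrame();

    // A stiff, lightly damped surface so clicking feels like a solid plate.
    hlMaterialf(HL_FRONT_AND_BACK, HL_STIFFNESS, 0.8f);
    hlMaterialf(HL_FRONT_AND_BACK, HL_DAMPING,   0.2f);
    hlTouchableFace(HL_FRONT);

    // Capture the quad that carries the browser texture as haptic geometry.
    hlBeginShape(HL_SHAPE_FEEDBACK_BUFFER, browserShapeId);
    glBegin(GL_QUADS);
    glVertex3f(-0.5f, -0.5f, 0.0f);
    glVertex3f( 0.5f, -0.5f, 0.0f);
    glVertex3f( 0.5f,  0.5f, 0.0f);
    glVertex3f(-0.5f,  0.5f, 0.0f);
    glEnd();
    hlEndShape();

    hlEndFrame();
}
```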




Questions this tool can answer:

  • Is this flow/sequence of UI pages comfortable for my customers? How else could I change it?
  • Which of these two UIs works better for this phone design?
  • How does my UI suit my phone's body design?
  • How should I change the alignment or dimensions for better interaction?
  • Which is best for my phone: a touchscreen or a QWERTY keyboard?