Managing Virtual Content in the Real World

Augmented reality and related tracking applications have given us the ability to place virtual items and text in the real world. There has been a sharp increase in the number of applications that place virtual text in the environment, such as geocaching apps and the Wikipedia World Browser. With the increasing amount of virtual data in the real world, we must manage text and content carefully so that it does not interfere with real-world tasks. This area of research, which falls under the broader category of view management, seeks to improve the safety and usability of such applications.
 
2016
On-demand View Expansion
 
Yano, Y., Orlosky, J., Kiyokawa, K., Takemura, H. (2016) Dynamic View Expansion for Improving Visual Search in Video See-through AR. To appear in the Proceedings of the joint International Conference on Telexistence and Eurographics Symposium on Virtual Environments. 

Techniques designed to dynamically increase field of view during search tasks.
  
Emotopet: An Environment-Sensing Virtual Companion

Accepted at the first OzCHI workshop on Mixed and Augmented Reality Innovations.

A virtual pet designed to exhibit emotional responses and interact with the user's gestures and voice in 3D. 
Paper 
 Perception in Wide Field of View Displays

(Submitted, under review.)

Studies on optical and video see-through AR displays and their differences.

  
Automated View Selection for Interior Design

Mori, M., Orlosky, J., Kiyokawa, K., & Takemura, H. (2016, October). A Transitional AR Furniture Arrangement System with Automatic View Recommendation. In Proceedings of ISMAR 2016.
Paper
 AR Motion Prediction

Itoh, Y., Orlosky, J., Kiyokawa, K., & Klinker, G. (2016, February). Laplacian Vision: Augmenting Motion Prediction via Optical See-Through Head-Mounted Displays. In Proceedings of the 7th Augmented Human International Conference 2016 (p. 16). ACM.

With the Laplacian Vision framework, we overlay future trajectory information onto moving objects to assist humans with motion prediction.

Paper 
Best Paper

Video
 
2015
Non-Invasive Augmented Reality

Orlosky, J., Toyama, T., Kiyokawa, K., & Sonntag, D. (2015, March). Halo Content: Context-aware Viewspace Management for Non-invasive Augmented Reality. In Proceedings of the 20th International Conference on Intelligent User Interfaces (pp. 369-373). ACM.

Algorithm that moves content out of the way of other people in a conversation for non-invasive augmented reality.
IUI Conference Paper 

Video

Source (GitHub)
 
 
 
 
 
2014
 Wide Field of View Annotation

Kishishita, N., Kiyokawa, K., Kruijff, E., Orlosky, J., Mashita, T., & Takemura, H. (2014, September). Analysing the effects of a wide field of view augmented reality display on search performance in divided attention tasks. In Mixed and Augmented Reality (ISMAR), 2014 IEEE International Symposium on (pp. 177-186). IEEE.

Study on labels placed inside and outside of an HMD's viewing plane.
ISMAR Conference Paper
Managing Mobile Text
Orlosky, J., Kiyokawa, K., & Takemura, H. Managing Mobile Text in Head Mounted Displays: Studies on Visual Preference and Text Placement. To appear in Mobile Computing and Communications Review, Vol. 18, No. 3, 2014. (12 pages)

Series of studies on mobile text placement in HMDs.  We also show that HMDs can provide users with better contextual awareness than phones.
MCCR Journal Paper
 
 Auto-remove Distracting Content with Eye-tracking
Toyama, T., Orlosky, J., Sonntag, D., & Kiyokawa, K. A natural interface for multi-focal plane head mounted displays using 3D gaze. In Proceedings of the working conference on Advanced Visual Interfaces, May 27-30, 2014 (pp. 25-32). ACM.

Natural interaction for automatically removing or dimming text in an HMD using eye tracking and multiple focal plane discretization. 
AVI Conference Paper

AVI Presentation Part 1 
AVI Presentation Part 2
 

2013

Dynamic Text Management 
Orlosky, J., Kiyokawa, K., & Takemura, H. Dynamic Text Management for See-through Wearable and Heads-up Display Systems. In Proceedings of the 2013 International Conference on Intelligent User Interfaces (pp. 363-370).

Algorithm designed to move text through the environment in real time to maximize visibility.
IUI Conference Paper
Best Paper

MIRU 2013 Poster

Video1
Video2


 
Towards Intelligent View Management
Orlosky, J. Management and Manipulation of Text in Dynamic Mixed Reality Workspaces. 2013 International Symposium on Mixed and Augmented Reality. [Doctoral Consortium]

Orlosky, J., Kiyokawa, K., Takemura, H. Towards Intelligent View Management: A Study of Manual Text Placement Tendencies in Mobile Environments Using Video See-through Displays, 2013 International Symposium on Mixed and Augmented Reality. [Poster]

Study on how users place text in the environment when given manual control.
Doctoral Consortium Paper

Video1

ISMAR Poster 

 
 HMD Information Display in the Visual Periphery
Kishishita, N., Orlosky, J., Mashita, T., Kiyokawa, K., & Takemura, H. (2013, March). Poster: Investigation on the peripheral visual field for information display with real and virtual wide field-of-view see-through HMDs. In 3D User Interfaces (3DUI), 2013 IEEE Symposium on (pp. 143-144). IEEE.

Study on annotation placement in a user's peripheral field of view using a CAVE system and a wide field-of-view head mounted display.
3DUI Poster