VARECo 2019

Submission Details

Submission Format

Papers can be submitted in German or English (English preferred) and should not exceed 6 pages using the Long Paper format of MuC 2019, which you can download here. At least one author of an accepted paper must be present at the workshop during MuC 2019 to present the paper. For submission, please use the ConfTool and select the specific workshop track. All submissions will be peer-reviewed; accepted papers are published together with the MuC conference proceedings.

Deadlines

  • Submission Deadline: 21 June 2019 (extended from 05 June 2019; submission through ConfTool)
  • Notification: 05 July 2019 (extended from 26 June 2019)
  • Camera-Ready: 12 July 2019
  • Conference Early Registration: 19 July 2019

Submission Website

All paper submissions to the workshop are made through the ConfTool.

General Information

The VARECo Workshop will take place in conjunction with the German GI conference Mensch und Computer (MuC) in Hamburg from September 8 to September 11, 2019.

Program

Time: Sunday, September 8, 2019, 9:00 - 18:00

Room: University of Hamburg, Hauptgebäude, Hörsaal K


Talk format (20 + 10): plan for a 20-minute slide-based presentation plus 10 minutes for questions and discussion

Schedule

09:00 – 09:10 Welcome by the Organizers

09:10 – 10:40 Session I – Interaction Research

Functional Workspace for One-Handed Tap and Swipe Microgestures [1]

Bastian Dewitz, Frank Steinicke, Christian Geiger

VIGITIA: Unterstützung von alltäglichen Tätigkeiten an Tischen durch Projected AR [2]

Raphael Wimmer, Florian Echtler

Macht Teleportieren faul? Strategien zur Steigerung der natürlichen Fortbewegung in VR [3]

Timo Mantei, Eike Langbehn

10:40 – 11:00 Coffee Break

11:00 – 12:30 Session II – Interaction Research

User acceptance of augmented reality glasses in comparison to other interaction methods for controlling a hand exoskeleton [4]

Tobias Ableitner, Surjo Soekadar, Andreas Schilling, Christophe Strobbe, Gottfried Zimmermann

Evaluierung der sozialen Akzeptanz verschiedener Interaktionsarten für Augmented-Reality-Datenbrillen [5]

Nils Adrian Mack, Ludger Schmidt

The AR-Marker in the Urban Space [6]

Simon Nestler, Sebastian Pranz, Klaus Neuburg

12:30 – 14:00 Lunch Break

14:00 – 15:30 Session III – Applications & Collaboration

Supporting Musical Practice Sessions Through HMD-Based Augmented Reality [7]

Karola Marky, Andreas Weiß, Thomas Kosch

Adjusting AR-Workflows of Care Tasks: Experiences from an initial study [8]

Marc Janssen, Michael Prilla

Overview of Collaborative Virtual Environments using Augmented Reality [9]

Nico Feld, Benjamin Weyers

15:30 – 16:00 Coffee Break

16:00 – 17:00 Session IV – Collaboration & Interactive Visualization

Software Engineering for AR-Systems considering User-Centered Design Approaches [10]

Thomas Schweiß, Lisa Thomaschewski, Annette Kluge, Benjamin Weyers

Designing an Interactive Visualization for Coordinating Road Construction Sites in Virtual Reality [11]

Manuela Uhr, Sina Haselmann, Lea Steep, Joschka Eikhoff, Frank Steinicke


17:00 - 18:00 Session V - Discussion

Open for discussions!


Abstracts

Following the workshop, we will provide the slides of the presentations here!

[1] Functional Workspace for One-Handed Tap and Swipe Microgestures

Single-hand microgestures are a promising interaction concept for ubiquitous and mobile interaction. Due to the technical difficulty of accurately tracking the small finger movements that this type of interface relies on, most research in this field currently aims at providing a solid foundation for future application in the real world. One microgesture interaction concept is one-handed tap and swipe interaction, which resembles one-handed interaction with handheld devices such as smartphones. In this paper, we present a small study that explores the possible functional workspace of one-handed interaction, i.e., the area of the palmar surface where tap and swipe interaction is possible. In addition to thumb-to-finger interaction, which has been investigated more frequently, we also considered the other fingers. The results show that thumb interaction with the index, ring, and middle finger is the most appropriate form of input, but other input combinations are worth considering under certain circumstances. However, which locations can be reached varies considerably with individual hand anatomy.

[2] VIGITIA: Unterstützung von alltäglichen Tätigkeiten an Tischen durch Projected AR

In the BMBF project VIGITIA, we want to find out how projected AR content can support and augment physical actions and interactions at tables. To this end, we investigate how tables are used in everyday life and in creative domains. Building on this, we develop interaction techniques and digital tools that support these activities. In particular, we examine how personal digital devices can be integrated and how multiple distant table surfaces can be virtually connected in a generic way. Special attention is also paid to developing technical solutions suitable for everyday use for projecting content and for camera-based object recognition. This position paper presents our motivation, goals, and methods. A scenario illustrates the intended usage possibilities.

[3] Macht Teleportieren faul? Strategien zur Steigerung der natürlichen Fortbewegung in VR

Teleportation is one of the most popular locomotion techniques in virtual reality (VR) because it is easy to use, efficient, and induces hardly any cybersickness. Often, however, VR users also have a small area of the real world available in which they can move through the virtual world by walking naturally. Previous research has shown that real walking increases the sense of presence and improves spatial orientation. There is evidence, however, that users become comfortable over time, stop moving naturally, and rely solely on teleportation, so the potential of VR is no longer fully exploited. In this work, we examine strategies that increase users' willingness to move naturally and reduce their use of teleportation. In a user study, we compared three different strategies with conventional teleportation. The effect of teleportation users becoming increasingly comfortable was confirmed. Moreover, our results show that with the tested strategies, participants teleported significantly less and walked more.

[4] User acceptance of augmented reality glasses in comparison to other interaction methods for controlling a hand exoskeleton

Every year, several hundred thousand people suffer a stroke, often leading to long-term motor disabilities that impair their quality of life. In this context, hemiplegia including paralysis of the hand and fingers plays a key role, leaving stroke survivors unable to perform tasks requiring both hands. In case of lesions at the level of the brain stem or the spinal cord, paralysis can also affect both sides, resulting in very severe constraints on most activities of daily living. A neural-guided hand exoskeleton can restore motor hand function after a stroke or spinal cord injury. However, controlling such a hand exoskeleton raises several challenges related to human-machine interaction. While it should be operated without the user's hands and place as little physical and cognitive strain on them as possible, it should also be as inconspicuous as possible to avoid stigmatizing the users. To tackle these challenges, we conducted a survey among 62 healthy test persons to shed more light on user acceptance of 12 input and 14 output methods, as well as 3 different application contexts. We found that user acceptance of the various input and output methods differs between public contexts on the one hand and home and rehabilitation contexts on the other. In general, inconspicuous, handy, and widely used devices are preferred in public. We also found that spectacle wearers are slightly more open to using AR glasses than non-spectacle wearers.

[5] Evaluierung der sozialen Akzeptanz verschiedener Interaktionsarten für Augmented-Reality-Datenbrillen

In addition to practical acceptance, social acceptance plays a major role in the user acceptance of a system. Even if a system has high practical acceptance, it may not be used because its social acceptance is insufficient. For Augmented Reality Glasses (AR-Glasses), different factors that influence their social acceptance have been identified in order to prevent rejection by users. An important factor for the social acceptance of AR-Glasses is the type of interaction. Using different interaction types in the same social context with the same system may not result in the same social acceptance. In this talk, comparisons of six interaction types for AR-Glasses are presented, considering the location of usage and the spectators present. The results of ten participants show lower social acceptance for all interaction types with increasing social distance to the spectators, especially for all types of speech input. Furthermore, the location of usage has to be taken into account when choosing an interaction type in order to achieve higher social acceptance.

[6] The AR-Marker in the Urban Space

When considering the role of Augmented Reality (AR) in the urban space, most previous work focuses on touristic and everyday-life use cases. The project “Archäologie der Gegenwart”, which we present in this paper, however, illustrates different aspects of change in Hamm during the last 50 years. Our AR approach thus opens up a deeper understanding of urban cultural change processes by means of AR. Our considerations lead to adding an AR layer as a fifth social dimension in the urban space. Technically, we robustly link this fifth layer with the existing topography by marker-based tracking with six degrees of freedom (6 DOF). When building AR applications for the urban space, a deeper understanding of the marker paradigm is crucial: during our workshops we identified and analyzed seven requirements for the utilization of markers in the public urban space. Additionally, we analyzed the general AR marker paradigm from the human-computer interaction (HCI) perspective by considering the affordances and signifiers of the marker objects themselves, analyzing the tracking technology, and summarizing the marker’s role for past, present, and future AR applications. Thus, the role of the AR marker is twofold: on the one hand, the marker is part of the 6 DOF tracking technology; on the other hand, it makes AR layers perceivable in the urban space. We expect that the importance of such markings for guiding citizens through AR experiences in urban spaces will grow, whereas the role of markers for technical tracking purposes will decrease.

[7] Supporting Musical Practice Sessions Through HMD-Based Augmented Reality

Learning a musical instrument requires a lot of practice, which, ideally, should be done every day. During practice sessions, students are on their own for the overwhelming majority of the time, as access to experts who can support students "just-in-time" is limited. Therefore, students commonly do not receive any feedback during their practice sessions. Adequate feedback, especially for beginners, is highly important for three particular reasons: (1) preventing the acquisition of wrong motions, (2) avoiding frustration due to a steep learning curve, and (3) preventing potential health problems that arise from harmfully straining muscles or joints. In this paper, we envision the usage of head-mounted displays as an assistance modality to support musical instrument learning. We propose a modular concept with several assistance modes to help students during their practice sessions. Finally, we discuss hardware requirements and implementations to realize the proposed concepts.

[8] Adjusting AR-Workflows of Care Tasks: Experiences from an initial study

Professional caregivers need to adhere to standards when treating their patients in order to ensure a certain level of quality and hygiene. Whenever standards are refined or changed, caregivers must keep pace with them. However, these standards are interpreted differently by care providers and also offer degrees of freedom that enable caregivers to adapt them to certain situations and to their own experience and practice. Workflows are a useful tool for defining, sharing, and executing standards correctly. In this paper, we investigate the possibility of adjusting workflows with our Care Lenses, an Augmented Reality based tool that can be used by caregivers during the execution of care tasks and that supports them with guidance regarding standards. We show how care practice influences the development of technical support for workflows and what advantages adjustable workflows offer for their integration into practice.

[9] Overview of Collaborative Virtual Environments using Augmented Reality

Using a collaborative virtual environment (CVE) can reduce the barriers of remote communication, which is why CVEs are increasingly used to support collaborative work between spatially dispersed collaborators. Augmented reality (AR) builds a bridge between working in real and virtual environments, which makes AR a candidate technology for implementing CVEs. To structure this research field and to identify possible research gaps, this paper proposes a design space of CVEs using AR. To this end, we mapped solutions described in the research literature onto the CVE definition, which considers user roles as well as the benefits gained through AR. Additionally, we examine how consistently specific keywords are used in the field. Based on the resulting design space, we identified certain gaps in research and present potential next research topics in the field.

[10] Software Engineering for AR-Systems considering User-Centered Design Approaches

Technologies like augmented reality have the potential to support teams in their everyday working environment. In this paper, we present a user-centered design approach for defining requirements based on a taxonomy for augmented reality systems. We first describe the taxonomy in detail. Afterwards, we present requirements engineering based on information about the context, user, and task of an AR system. Based on this information, we gather new requirements by feeding it into the taxonomy and describe how they can be used in a user-centered design process. Finally, we present a use case based on a water treatment simulation and map the previously derived requirements to the system. Additionally, we describe two user studies to evaluate an ambient awareness tool generated from those requirements. Our work shows that the gathered requirements can be used in an early stage of user-centered design as well as after the stage of usability testing, where they serve as comparative variables for further usability analysis. Additionally, as part of the user-centered design process, we developed a first prototype of the ambient awareness tool, which will be evaluated in future work.

[11] Designing an Interactive Visualization for Road Works Coordination using Virtual Reality

Road works highly affect traffic in major cities; therefore, coordination is key to avoiding congestion on urban streets and highways. Software tools and interactive visualizations that give insight into the complex road works data as well as the spatial and temporal dependencies between road works are important for the coordination process. In existing 2D visualizations, spatio-temporal dependencies are shown in multiple views, resulting in high cognitive load. In this article, we describe the design and evaluation of a Virtual Reality visualization for exploring multi-dimensional road works data. The relevance for expert use was reviewed in an interview with local traffic engineers. In addition, a user study was conducted to evaluate the general usability of the prototype. The results show an overall positive response and acceptance and point to directions for further development.

Committees

Organizing Committee

Benjamin Weyers, University of Trier

Daniel Zielasko, University of Trier

Alexander Kulik, Bauhaus-Universität Weimar

Eike Langbehn, University of Hamburg

Markus Funk, Nuance Communications

Program Committee

Christian Mai, LMU Munich

Jan Gugenheimer, Ulm University

Thomas Kosch, LMU Munich

Alexander Ohlei, University of Lübeck

Daniel Roth, University of Würzburg

Patric Schmitz, RWTH Aachen University

Tim Weißker, Bauhaus-Universität Weimar