Speakers

Keynote Speakers

Ed Swan

(Professor, Mississippi State University, United States)

Title: Learning to Measure the Perceived Location of Virtual AR Objects: A Career Retrospective

Abstract: In any use of Extended Reality (XR), an important aspect of virtual environment fidelity is being able to control the locations of virtual objects.  Where are virtual objects located?  How well can virtual objects be placed among real objects?  Can a virtual object be co-located with a real object?  Can a virtual object be located behind or beyond a real object (the x-ray vision condition)?  Asking any of these questions requires answering the question: How can we measure the perceived location of a virtual object?  In this talk, I will give a career-long retrospective of my attempts to find an answer.

Bio: Dr. J. Edward Swan II is a Professor of Computer Science and Engineering at Mississippi State University. He holds a B.S. (1989) degree in computer science from Auburn University and M.S. (1992) and Ph.D. (1997) degrees in computer science from Ohio State University, where he studied computer graphics and human-computer interaction. Before joining Mississippi State University in 2004, he spent seven years as a scientist at the Naval Research Laboratory in Washington, D.C. Dr. Swan’s research has centered on the topics of augmented and virtual reality, perception, data science, empirical methods, human-computer interaction, human factors, and visualization. Currently, he is studying the perception and technology required to give virtual objects definite spatial locations, including depth and layout perception and depth presentation methods. He is also studying efficient data science tools, and collaborating with social scientists to analyze social media data. His research has been funded by the National Science Foundation, the Department of Defense, the National Aeronautics and Space Administration, the Naval Research Laboratory, and the Office of Naval Research. Dr. Swan is a member of ACM, IEEE, and the IEEE Computer Society. He has served many roles in the technical communities of IEEE Virtual Reality (VR), IEEE International Symposium on Mixed and Augmented Reality (ISMAR), and IEEE Visualization. He is currently the chair of the IEEE VR steering committee. Previously, he served as one of the general chairs of VR 2021 and VR 2020, as well as a program chair for ISMAR 2017, ISMAR 2016, VR 2015, and VR 2014. He is also a member of the ISMAR steering committee. In 2017 and 2018, he served as Interim Department Head of Computer Science and Engineering at Mississippi State University.

Joaquim Jorge
(Professor, University of Lisbon, Portugal)


Title: Approaches and Challenges to XR in Health Care and Rehabilitation


Abstract: The growing interest in Augmented Reality (AR), together with the renaissance of Virtual Reality (VR), has opened new approaches and techniques for how professionals interact with medical imagery; plan, train for, and perform surgeries; and help people with special needs in rehabilitation tasks. Indeed, many medical specialties already rely on 2D and 3D image data for diagnosis, surgical planning, surgical navigation, medical education, or patient-clinician communication. However, the vast majority of current medical interfaces and interaction techniques continue unchanged, while the most innovative solutions have not unleashed the full potential of VR and AR. This is probably because extending conventional workstations to accommodate VR and AR interaction paradigms is not free of challenges. Notably, VR- and AR-based workstations, besides having to render complex anatomical data at interactive frame rates, must promote proper anatomical insight, boost visual memory through seamless visual collaboration between professionals, free interaction from being seated at a desk (e.g., using mouse and keyboard) so users can adopt nonstationary postures and freely walk within a workspace, and must also support a fluid exchange of image data and 3D models, as this fosters interesting discussions to solve clinical cases. Moreover, VR- and AR-based techniques must also be designed according to good human-computer interaction principles, since it is well known that medical professionals can be resistant to changes in their workflow. In this talk, I will survey recent approaches to healthcare, including diagnosis, surgical training, planning, and follow-up, as well as AR/MR/VR tools for patient rehabilitation. I will discuss challenges, techniques, and principles in applying Extended Reality in these contexts and outline opportunities for future research.



Bio: Joaquim Jorge received his Ph.D. from Rensselaer Polytechnic Institute, coordinates the GI research group at INESC-ID, and is a Full Professor of Computer Graphics at Técnico, Universidade de Lisboa. He is Editor-in-Chief of the Computers & Graphics Journal, a Eurographics Fellow, an ACM Distinguished Member and Speaker, and an IEEE Senior Member. He served on the ACM Europe Council and on the ACM/SIGGRAPH Specialized Conferences Committee. He has organized 40+ international scientific events, is IEEE VR 2021 and 2022 Conference Co-Chair, was IEEE VR 2020 and Eurographics 2016 papers co-chair, has served on 210+ program committees, and has (co-)authored 300+ publications in international peer-refereed venues. His research interests include AR/VR, medical applications, user interfaces, and 3D visualization.

Rob Lindeman
(Professor/Director, HIT Lab NZ, University of Canterbury, New Zealand)

Title: Comfortable VR: Supporting Long-Term Use of Virtual Reality


Abstract: The use of virtual reality (VR) and related technologies by the general public has seen explosive growth in recent years. While this emerging technology has been designed to enable people to have novel experiences, people have also had to adapt their behaviours due to technological limitations or design decisions. This necessary behaviour change has led to some negative impacts in terms of user discomfort, primarily Fatigue, Cybersickness, and general Worry. New and compelling content has led to people using VR more often and for longer-duration sessions, further increasing the prevalence of this discomfort. In this talk, I will give an overview of my current thinking and research direction around the topic of Comfortable VR.

Bio: Prof Robert W. Lindeman (Rob) is Director of the Human Interface Technology Lab NZ (HIT Lab NZ) at the University of Canterbury, and has been doing research in the field of VR since 1993. His work focuses on immersive, multi-sensory feedback systems for VR, AR, gaming, and long-term VR immersion research, specifically non-fatiguing and non-sickness-inducing experiences. He has also done significant work on the effective use of 360-degree video for science outreach, including work in Antarctica. He is an avid mountain biker, hiker, geocacher and skier.

Heide Lukosch

(Associate Professor, HIT Lab NZ, University of Canterbury, New Zealand)

Title: Games and XR

Abstract: Games and extended reality make an exciting partnership to entertain us. Who doesn’t like to escape into fantasy worlds, become a heroine and defeat enemies, race cars, or collect colourful items? The engaging and motivating characteristics of games can also be combined with the opportunities of XR to help individuals and organisations learn, train, make decisions, and explore complex or dangerous issues. That’s what we call “applied games”. In my talk, I will discuss some examples of applied games and their combination with immersive technologies, what purpose they serve, and how they have been developed and evaluated. I will discuss opportunities and limitations of games and XR, and look forward to a lively discussion!

Bio: Heide is an associate professor and head of the Applied Immersive Gaming Initiative (AIGI) at the HIT Lab NZ, New Zealand. She is interested in the design, implementation, and evaluation of applied immersive technologies and games. Heide’s work focuses on the development of games into an effective instrument for problem solving and learning, and on their combination with other methods and technologies, such as modeling and simulation, Augmented Reality, and Virtual Reality. A special focus lies on the realism of games and its relation to meaning creation, as well as on co-design processes with diverse groups of end-users. Heide lives with her family in the south of Christchurch and enjoys the great outdoors of Aotearoa, her third home country.

Invited Speakers

Mark Billinghurst

(Professor, University of South Australia, and University of Auckland, New Zealand)

Title: Rapid Prototyping for XR Experiences

Abstract: This presentation provides an overview of tools available for rapid prototyping of AR and VR experiences. In recent years a wide variety of tools have been developed that enable people to create AR and VR prototypes with little or no coding required. Tools such as ShapesXR enable designers to prototype XR experiences from within VR, while Snap Lens Studio and Figmin XR (among others) support low code or visual programming development. The goal is to allow XR designers to rapidly reproduce the key user experience elements of their design without needing intensive programming. The presentation will provide an introduction to a range of different tools available, and also discuss future research directions in this space. 

Bio: Mark Billinghurst is Director of the Empathic Computing Laboratory, and Professor at the University of South Australia in Adelaide, Australia, and also at the University of Auckland in Auckland, New Zealand. He earned a PhD in 2002 from the University of Washington and conducts research on how virtual and real worlds can be merged, publishing over 650 papers on Augmented Reality, Virtual Reality, remote collaboration, Empathic Computing, and related topics. In 2013 he was elected as a Fellow of the Royal Society of New Zealand, and in 2019 he received the ISMAR Career Impact Award in recognition of his lifetime contribution to AR research and commercialization. In 2022 he was selected for the ACM SIGCHI Academy for leading human-computer interaction researchers, and for the IEEE VGTC VR Academy for leading VR researchers.

Stephan Lukosch

(Professor, HIT Lab NZ, University of Canterbury, New Zealand)

Title: Evaluating Immersive Experiences


Abstract: The presentation will start by reviewing engagement in interactive systems. It will then explain different methods for evaluating aspects of immersive systems, alongside recent research projects exploring the impact of immersive technology. It will go on to compare these methods and provide guidance on when to use each of them. Methods and concepts covered include the System Usability Scale (SUS), the Game Experience Questionnaire (GEQ), the Situational Awareness Rating Technique (SART), and the NASA Task Load Index (TLX).
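As background for one of the methods mentioned above: the SUS is a ten-item questionnaire answered on a 1–5 Likert scale, scored with a simple formula (odd items contribute response − 1, even items contribute 5 − response, and the sum is scaled by 2.5 to give 0–100). A minimal Python sketch of that standard scoring, not part of the talk itself:

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten 1-5 Likert responses.

    Odd-numbered items (positively worded) contribute (response - 1);
    even-numbered items (negatively worded) contribute (5 - response).
    The summed contributions are multiplied by 2.5, giving a 0-100 score.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS expects ten responses on a 1-5 scale")
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))  # i=0 is item 1 (odd)
    return total * 2.5

# A fully neutral questionnaire (all 3s) yields the midpoint score of 50.
print(sus_score([3] * 10))  # → 50.0
```

Note that SUS yields a single usability score; the GEQ, SART, and TLX instruments each use their own subscales and scoring procedures.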

Bio: Stephan Lukosch is a full professor at the HIT Lab NZ of the University of Canterbury, New Zealand. His current research focuses on human augmentation to enhance our capabilities, for which he combines augmented reality (AR) or virtual reality (VR) with applied games. He studied Computer Science at the Technical University of Dortmund, Germany. Before joining the University of Canterbury in 2019, he worked as Associate Professor at the Delft University of Technology in the Netherlands and as Assistant Professor at the University of Hagen in Germany. Currently, he is one of the general co-chairs of the IEEE Conference on Virtual Reality and 3D User Interfaces. He further serves on the editorial boards of the Springer journal Computer Supported Cooperative Work (CSCW), the Journal of Universal Computer Science (J.UCS), the International Journal of Cooperative Information Systems (IJCIS), and Frontiers in Virtual Reality.

Adrian Clark

(Associate Professor, School of Product Design, University of Canterbury, New Zealand)

Title: The things I wish someone had told me before I started my own AR company

Abstract: Founded in 2012, QuiverVision is a company built around IP developed at the HIT Lab NZ. In the past 10 years, QuiverVision’s products have been downloaded over 22 million times worldwide, won numerous awards, and been featured on TV, print magazines, and countless websites.


Although by some measures QuiverVision has been very successful, the journey wasn’t without its challenges. In this talk, I will present a little bit about how we took QuiverVision from an academic research product to a commercial product used by millions of people, and the various roadblocks, pivots, triumphs and defeats we faced along the way. For anyone who is interested in starting their own company in the XR industry, I will talk about the things we did well, but more importantly the things we could have done better, and some things I would do very differently if I were to do it all again.

Bio: Director of Studies for Applied Immersive Game Design under UC’s School of Product Design, Adrian completed his PhD in Computer Science at the HIT Lab NZ. During his PhD he invented the concept of “Colourable Augmented Reality”, and he later went on to cofound the company QuiverVision, where this technology was used to create entertaining and educational augmented reality mobile applications and games, which have been downloaded by tens of millions of users worldwide.

Ryan McKee

(Game Developer/Programmer, HIT Lab NZ, University of Canterbury, New Zealand)

Title: Introduction to Unity XR development

Abstract: Unity is a modular, multi-platform 3D game engine that utilizes the latest in XR technology to give developers the tools needed to create their own apps quickly and efficiently.  I will be going over some basics of Unity workflow and following up with a brief tutorial on the foundations of XR development.  We will be using Unity 2021 and the native XR packages and components to create a tracking space for the head and hands, XR controller input operation, basic teleport locomotion, and grabbing objects.  We will use the Meta Quest 2 as our HMD of choice. Time permitting, we can explore the native hand tracking features of the headset or follow up with questions or develop our XR app more.

Bio: I am originally from Virginia, USA, with a background in mechanical engineering. I have lived in NZ since 2008, working as a 3D graphic artist until I moved into solo game development in 2014. I shipped my first game, Vector36, on Steam in 2017. I have been working at the HIT Lab NZ as a game developer since 2019, supporting student XR projects, laboratory hardware, and research.

Dr. Tham Piumsomboon
(Senior Lecturer Above the Bar, School of Product Design, University of Canterbury, New Zealand)

Title: Ex-Cit Strategies: Why and How to Disengage Users from Immersive Virtual Environments

Abstract: The future of interconnected Immersive Virtual Environments (IVEs), or the Metaverse, is expected to be highly engaging once the underlying technologies are enabled, and drawbacks addressed. Arguably, the engagement formula has already been mastered in the social and interactive media industries, so it's only a matter of time before the same occurs in IVEs. This talk presents findings from a recent study on techniques to disengage users from IVEs and transition them from VR to AR while still remaining in the immersive system, minimising the impact of break-in-presence. Eleven XR experts participated in an elicitation study, resulting in 132 visualisation and interaction techniques for four different IVE experiences: Narrative-driven, Social-platform, Adventure Sandbox, and Fast-paced Battle. The talk will showcase these techniques and explain how they can be used for strategic disengagement.

Bio: A leader of the QUADRIC research group at the School of Product Design and a collaborator of the Empathic Computing Laboratory (ECL) and HIT Lab NZ, Tham has made contributions to the advancement of Human-Computer Interaction and XR research, specialising in Mixed Reality interaction, perception, and collaboration. As a former PhD candidate at UC researching at HIT Lab NZ and a Research Fellow at ECL, Tham's work in XR began with the design of a multimodal gesture and speech interface for AR. This work led to the opportunity to work with an early prototype sensor for the Microsoft HoloLens at Microsoft Research. His current interests encompass, but are not limited to, human-agent interaction and collaboration, XR visualisation and interaction (e.g., immersive analytics, reality-virtuality transitions), and perception enhancements (e.g., tactile).

Chris Chitty

(Research Laboratory Technical Manager, School of Built Environment, Massey University, New Zealand)

Title: You built “What?”

Abstract: Chris hopes to show you how on earth (and sometimes why on earth) his team made some of these extraordinary things. He will discuss the different perspectives, questions, and scenarios they developed which allowed them to build things that actually worked. His presentation will cover many of the projects his company has completed over the past 40 years, including robotics, physical simulators, and animatronics. Many of Chris's projects were designed to be physically confronting; other physical simulators were designed to develop and maintain psychomotor skills and demonstrate subtle spatial anomalies. Chris's talk will focus on what was needed in each project to achieve cognitive engagement and suspend disbelief. He hopes to inspire others to work some magic and create their own successful projects.

Bio: Chris Chitty is the Research Laboratory Technical Manager for the School of Built Environment at Massey University, Auckland. Over his varied and eventful career in product development, Chris has created all manner of devices, including medical simulators, interactive museum displays, and hyper-realistic robotic animals for motion pictures. He has also featured as “Doctor Robotech” in four series of the television show Let’s Get Inventin’, which developed concepts into products and brought the process alive for young and old. Two of the many highlights of his resume are a BAFTA nomination for Visual Effects for “Babe”, and 1st Prize Scientific Exhibit from the International Anaesthesia Research Society for “Replicant, Dexter”.


Industry Panel

Public Open House (FREE event). Please register: https://www.eventbrite.com/e/531688132517

Adam Hutchinson

(oVRcome, NZ)

Bio: Adam is the Founder of oVRcome, a Christchurch-based startup making mental health treatment more accessible using VR, specifically on the smartphone. Its initial VRET phobia programme has been used in over 30 countries; the company recently published its first clinical trial with the University of Otago and has raised $1.3M to fund further development. This is Adam's second digital startup; his previous startup (CamperMate) was acquired by Tourism Holdings in 2015.


Sakthi Priya Balaji Ranganathan

(JIX Reality, NZ)

Bio: Sakthi is a hands-on entrepreneur with expertise in digital design, cross-functional team development, human-computer interaction (HCI), and extended reality technology. Through creative foresight, drive, and resoluteness, he built JIX Reality as a high-growth startup focused on commercializing extended reality technologies (VR, AR, MR) for the New Zealand market. Sakthi is a ‘maker’ at heart who is not afraid to roll up his sleeves and push the limit of what’s achievable digitally. JIX Reality was born in 2017 when he noticed a void in applying XR tech research to real-world problems. As a self-sufficient innovator, Sakthi is skilled in creating a vision and leading teams from the start of a project to explore new models and implementation approaches while keeping the focus on the return on investment.

Rita Garcia

(Unity, NZ)

Bio: Dr Rita Garcia has over 15 years of experience in the Visual Effects (VFX) industry, working as a Software Engineer at Pixar, Wētā Digital, and Marvel Studios. She is currently a Senior Software Engineer at Unity and is an Adjunct Researcher at Victoria University of Wellington. She continues her research in Computer Science Education, supporting diversity in Computer Science and Software Engineering. Her industry experience motivated her research to study students' collaboration processes during group work. This work aims to help reduce language during group work that might exclude women and underrepresented minorities from pursuing careers in the Tech field.

Simon Yorke

(Innovation Lead, Aurecon)

Bio: For 20 years Simon has been pushing the boundaries to drive a change in how the AEC (Architecture, Engineering & Construction) industry interacts with design for shaping the world around us. His research and application of emerging technologies have led to the development of a range of industry-first solutions, from drones and robotics to immersive virtual and augmented reality experiences. 

 

Standouts from his team’s most recently featured work are the ‘Real-Time Remotely Operated Digger’ that safely cleared debris inside the Christ Church Cathedral, and a fully interactive VR Wheelchair system for assessing accessibility in design. Other key immersive experiences include a VR Bike for Cycleway design, VR Kayak and Paddleboard for public engagement in riverway regeneration, Train Operator simulator for rail design, and numerous AR tools for geologists and engineers to complete their jobs in the field. 

 

As a thought leader across a range of fields with a proven track record of deployed solutions, Simon has been a sought-after speaker at events globally, sharing the innovations developed and delivered here in New Zealand, from opening the inaugural ARVR Dubai Conference and the IMArch (Immersive Architecture) Singapore event to a decade of conferences in the USA for companies like Autodesk and Hexagon.

 

Simon also guest lectures for Civil Engineering, Geography, and Architectural Engineering courses at the University of Canterbury, educating students on the impact technology is having on the industry and the future of work, preparing them for a rapidly evolving world. 


James Hayes

Bio: A New Zealand Innovator of the Year semi-finalist in 2021, and winner of the Microsoft Supreme Prize in 2020 for developing and implementing innovative, world-first Virtual Reality training, James Hayes is the Founder of Virtual Medical Coaching, where he is reinventing the way healthcare students learn their skills. During his time as a lecturer, James could see a more immersive way for students to learn. His teaching experience, combined with his prior clinical roles, led him to create the world’s first EdTech company specializing in Virtual Reality simulation software, big data, artificial intelligence, and adaptive learning. The technology combines VR, big data analytics, and artificial intelligence (AI) to allow students to learn complex or dangerous tasks in a safe, immersive, and realistic environment. James is proud to be bringing together New Zealand’s highest-level developers to pioneer this world-class, student-centric healthcare education platform.

Stuart Ralston

(Trimble, NZ)

Bio: Stuart has been hooked on VR ever since completing his Master's degree for Science Alive! in the late 1990s. He has been working with the global corporation Trimble for over 20 years, busily researching emerging technologies and repurposing them to follow Trimble's mission statement of "Transforming the way the world works". He has been building XR products for the AEC market at Trimble for over 7 years. He wrote early patents on outdoor AR, launched the first commercial MR SketchUp Viewer application in collaboration with Microsoft on the HoloLens platform in 2016, and launched the high-accuracy outdoor AR system SiteVision in 2019. He is currently developing Trimble Virtual World for immersive training and construction sequencing. He also has ongoing collaborations at UC with student projects and emerging startups across multiple departments each year.