Speakers

Keynote Speakers

Tony Parisi

(Unity, USA)

Title: Our Spatial Computing Future


Abstract: For decades, technologists and visionaries have dreamt about a future dominated by spatial computing: immersive 3D interfaces to the world’s information that connect, entertain, teach and inform us. After several fits and starts, this future is here, powered by advances in computing power, networking and artificial intelligence. To kick off the Workshop, industry veteran and Unity innovation lead Tony Parisi will offer his point of view on spatial computing past, present and near future.


Bio: Tony Parisi is a virtual reality pioneer, serial entrepreneur and angel investor. He is the co-creator of 3D graphics standards, including VRML, X3D and glTF, the new file format standard for 3D web and mobile applications. Tony is the author of O’Reilly Media’s books on Virtual Reality and WebGL: Learning Virtual Reality (2015), Programming 3D Applications in HTML5 and WebGL (2014), and WebGL Up and Running (2012). Tony has become one of the leading spokespeople for the immersive industry, speaking on industry trends and technology innovations in virtual and augmented reality at numerous industry conferences. He was recently named in Next Reality’s 30 People to Watch in Augmented Reality in 2020. Tony is currently Head of AR/VR Ad Innovation at Unity, where he oversees the company’s strategy for virtual and augmented reality brand advertising and monetization.



Dr Pedro Lopes
(University of Chicago, USA)

Title: Integrating Interactive Devices with the Body


Abstract: When we look back to the early days of computing, user and device were distant, often located in separate rooms. Then, in the ’70s, personal computers “moved in” with users. In the ’90s, mobile devices moved computing into users’ pockets. More recently, wearable devices brought computing into constant physical contact with the user’s skin. These transitions proved useful: moving closer to users allowed interactive devices to sense more of their user and act more personal. The main question that drives my research is: what is the next interface paradigm that supersedes wearable devices?

The primary way researchers have been investigating this is by asking where future interactive devices will be located with respect to the user’s body. Many posit that the next generation of interfaces will be implanted inside the user’s body. However, I argue that location with respect to the body is not the primary factor; in fact, implanted devices already exist, such as pacemakers and insulin pumps. Instead, I argue that the key factor is how devices will integrate with the user’s biological senses and actuators.

This body-device integration allows us to engineer interactive devices that intentionally borrow parts of the body for input and output, rather than adding more technology to the body. For example, one type of body-integrated device, which I advanced during my PhD, is the interactive system based on electrical muscle stimulation: such systems move their user’s muscles using computer-controlled electrical impulses, achieving the functionality of exoskeletons without the bulky motors. Their smaller size, a consequence of this integration with the user’s body, enabled haptic feedback in scenarios not previously possible with existing devices.

In my research group, we engineer interactive devices that integrate directly with the user’s body. We believe that these types of devices are the natural succession to wearable interfaces and allow us to investigate how interfaces will connect to our bodies in a more direct and personal way.


Bio: Pedro Lopes is an Assistant Professor in Computer Science at the University of Chicago, where he leads the Human Computer Integration lab. Pedro focuses on integrating computer interfaces with the human body—exploring the interface paradigm that supersedes wearable computing. Some of these new integrated devices include: a device based on muscle stimulation that allows users to manipulate tools they have never seen before or that accelerates their reaction time, and a device that leverages the sense of smell to create an illusion of temperature. More: https://lab.plopes.org.


A/Prof Gina Grimshaw
(Victoria University of Wellington, NZ)

Title: Feeling Virtual: Manipulating and Measuring Emotional States in VR

Abstract: Immersive virtual reality can be used to create environments and interactions that produce authentic emotional experiences. In this talk, I will describe the many interactions between psychological science and XR technologies that are furthering our understanding of human emotion, and driving emotional applications in creative, therapeutic, and educational settings. Drawing on prominent theories of emotion, I will address three questions: 1) What are emotions? 2) How can we induce them in virtual environments? and 3) How can we identify and quantify emotional responses?

Bio: A/Prof Gina Grimshaw leads the Cognitive and Affective Neuroscience Lab (CANLab) in the School of Psychology at Victoria University of Wellington. The team uses a range of behavioural and neuroscientific methods to study interactions between emotion and cognition, and virtual reality to study these interactions in “real-world” situations. This work is still in its infancy, but is already changing how we think about the mind. The lab’s research has been funded by the Royal Society of New Zealand Marsden Fund, the Department of Conservation, and the Neurological Foundation of New Zealand. Gina also teaches courses in the Psychology of Emotion and Affective Neuroscience, Cognitive Psychology, and Experimental Research Methods. She is the Editor of Laterality: Asymmetries of Brain, Behaviour, and Cognition, and Associate Editor for Royal Society Open Science.

Professor Joaquim Jorge
(University of Lisbon, Portugal)

Title: Approaches and Challenges to XR in Health Care and Rehabilitation


Abstract: The growing interest in Augmented Reality (AR), together with the renaissance of Virtual Reality (VR), has opened new approaches and techniques for how professionals interact with medical imagery; plan, train for, and perform surgeries; and help people with special needs in rehabilitation tasks. Indeed, many medical specialties already rely on 2D and 3D image data for diagnosis, surgical planning, surgical navigation, medical education, and patient-clinician communication. However, the vast majority of current medical interfaces and interaction techniques remain unchanged, and even the most innovative solutions have not unleashed the full potential of VR and AR. This is probably because extending conventional workstations to accommodate VR and AR interaction paradigms is not free of challenges. Notably, VR- and AR-based workstations, besides having to render complex anatomical data at interactive frame rates, must promote proper anatomical insight, boost visual memory through seamless visual collaboration between professionals, free interaction from being seated at a desk (e.g., using mouse and keyboard) so users can adopt nonstationary postures and walk freely within a workspace, and support a fluid exchange of image data and 3D models, as this fosters productive discussions when solving clinical cases. Moreover, VR- and AR-based techniques must also be designed according to sound human-computer interaction principles, since it is well known that medical professionals can be resistant to changes in their workflow. In this talk, I will survey recent approaches to healthcare, including diagnosis, surgical training, planning, and follow-up, as well as AR/MR/VR tools for patient rehabilitation. I will discuss challenges, techniques, and principles in applying Extended Reality in these contexts and outline opportunities for future research.


Bio: Joaquim Jorge received his Ph.D. from Rensselaer Polytechnic Institute, coordinates the GI research group at INESC-ID, and is a Full Professor of Computer Graphics at Técnico, Universidade de Lisboa. He is Editor-in-Chief of the Computers & Graphics Journal, a Eurographics Fellow, an ACM Distinguished Member and Speaker, and an IEEE Senior Member. He served on the ACM Europe Council and the ACM/SIGGRAPH Specialized Conferences Committee. He has organized 40+ international scientific events, is IEEE VR 2021 and 2022 Conference Co-Chair, was IEEE VR 2020 and Eurographics 2016 papers co-chair, has served on 210+ program committees, and has (co-)authored 300+ publications in international peer-refereed venues. His research interests include AR/VR, medical applications, user interfaces, and 3D visualization.

Dr Hrvoje Benko
(Facebook Reality Labs Research, USA)

Title: The Future of Mixed Reality Interactions


Abstract: The vision of always-on Mixed Reality interfaces that can be used in a continuous fashion for an entire day depends on solving many difficult problems, including display technology, comfort, computing power, batteries, localization, tracking, and spatial understanding. However, solving all of those will not bring us to a truly useful experience unless we also solve the fundamental problem of how to interact effectively in Mixed Reality. I believe that solving the MR interaction problem requires combining approaches from interaction design, perceptual science, and machine learning to yield truly novel and effective MR input and interactions. Such interfaces will need to be adaptive to the user's context, believable, and computational in nature. We are at an exciting point in the technology development curve where there are still few universally accepted standards for MR input, which leaves a ton of opportunities for both researchers and practitioners.

Bio: Hrvoje Benko is a Director of Research Science at Facebook Reality Labs Research, working on novel interactions, devices and interfaces for Augmented and Virtual Reality applications. He currently leads a multi-disciplinary organization that includes scientists and engineers with expertise in HCI, computer vision, machine learning, AI, neuroscience, robotics and cognitive psychology. His interests span AR/VR, haptics, new input form factors and devices, as well as touch and freehand gestural input. He is a world-renowned expert in Human-Computer Interaction (HCI). He has coauthored more than 60 scientific articles and 50 issued patents, and has served as the general chair (2014) and the program chair (2012) of the ACM User Interface Software and Technology conference, the premier technical conference in HCI. He is also an Associate Editor for the TOCHI Journal, the premier journal in the HCI field. Prior to his current role, he was at Microsoft Research, where he had the privilege of working on several projects that were released as Microsoft products or as open-sourced projects, including Microsoft Touch Mouse, Windows Touch Visualizations, Microsoft Surface, and RoomAlive Toolkit. He received his Ph.D. in Computer Science from Columbia University in 2007 and his work has been featured in the mainstream media (including The New York Times and Forbes) and on popular technology blogs.

Sir Ian Taylor
(Animation Research Limited, NZ)

Bio: In 2010 Ian was awarded North & South Magazine’s New Zealander of the Year Award. This is how the editorial announced the award back then.

“In a year of recession when at times there seemed little to celebrate, North & South magazine pays tribute to an inspirational innovator who has put Dunedin on the map – and never stopped believing that Kiwis can take on the world. In 2008, just 18 months ago, Ian Taylor was on the verge of bankruptcy. A world leader in high-tech computer graphics for two decades, he arrived at his Dunedin office to tell his team that the company would have to close. As he walked past his receptionist she showed him the front page of that morning’s newspaper announcing the closure of a local factory with the loss of hundreds of jobs. “I can still remember her saying, ‘God, how awful would that be,’” says Taylor. “And I thought, ‘Well, I can’t do it today.’” In North & South’s January 2010 issue we profile a man whose maverick style and ability to inspire the passion of those around him saw him claw back from the brink to rebuild his company exporting Kiwi ingenuity to the world.”

Over a decade on from facing down that challenge, Ian was named the 2019 Kiwibank New Zealand Innovator of the Year.

Invited Speakers

Prasanth Sasikumar
(University of Auckland, NZ)

Title: Introduction to Unity


Abstract: This talk will cover an introduction to Unity and how to get started. Topics include: installation, editor layout, basic concepts and game objects, positioning/rotation/scaling of objects, introduction to scripting and creating a sample scene.
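To give a flavour of the scripting and transform topics listed above, here is a minimal, hypothetical Unity C# sketch (not taken from the talk itself): a component that sets a game object's position and scale on start and rotates it every frame. The class name `Spinner` and the `speed` field are illustrative choices; the script assumes it is attached to a GameObject inside a Unity scene.

```csharp
using UnityEngine;

// Hypothetical example component: attach to any GameObject in a Unity scene.
public class Spinner : MonoBehaviour
{
    // Exposed in the Inspector; degrees of rotation per second (illustrative default).
    public float speed = 45f;

    void Start()
    {
        // Positioning and scaling via the object's Transform component.
        transform.position = new Vector3(0f, 1f, 0f);
        transform.localScale = Vector3.one * 0.5f;
    }

    void Update()
    {
        // Rotate around the vertical (Y) axis; Time.deltaTime keeps the
        // rotation speed independent of frame rate.
        transform.Rotate(0f, speed * Time.deltaTime, 0f);
    }
}
```

Because Unity calls `Start` and `Update` automatically on components in the scene, no further wiring is needed once the script is attached to an object.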


Bio: Prasanth Sasikumar is a PhD candidate with particular interests in multimodal input for remote collaboration and scene reconstruction. He received his Master’s degree in HCI from the University of Canterbury in 2017. For his Master’s thesis, he worked on incorporating wearable and non-wearable haptic devices in VR, sponsored by MBIE as part of the NZ/Korea Human-Digital Content Interaction for Immersive 4D Home Entertainment project. Prasanth has a keen interest in VR and AR applications and how they may help industry better solve problems. He is currently doing his PhD research under the supervision of Prof. Mark Billinghurst and Dr. Huidong Bai.


Dr Roy Davies
(University of Auckland, NZ)

Title: Developing AR and VR for clients – more than just programming


Abstract: At Imersia, we have worked with many clients over the years. One of the core lessons we have learnt is that AR/VR programming is only a small part of the whole process. In this talk, I will cover the entire journey, from the moment you approach a potential client to the point where you provide ongoing support, and all the steps in between. As a consultant, as part of a larger team, or even in research projects, you will need to understand this delicate dance between stakeholders.


Bio: When Dr. Roy Davies started with VR in the late ‘90s, very few people knew what it was. However, the University of Lund in Sweden was determined to pioneer the use of this promising emerging technology in a variety of human-factors-related applications such as brain injury rehabilitation and participatory design, evolving one of the first Mixed Reality systems ever built. This culminated in Scandinavia’s then-largest VR research centre with several large-scale ‘CAVE’ systems. After returning to NZ, Roy has been a consultant, built a couple of Mixed Reality companies, advised many students and businesses, and developed a unique Mixed Reality, agent-based platform to explore alternatives to the ‘desktop metaphor’. Most recently, Roy has joined the University of Auckland and is looking forward to helping break new research ground.


Dr Simon McCallum
(Victoria University of Wellington, NZ)

Title: When will VR games become mainstream?


Abstract: This presentation will discuss games in VR and some of the inflection points we are about to experience. Limitations of VR experiences have so far prevented VR games from becoming mainstream. We will discuss the current state of VR games, some of those limitations, and when the socio-technical environment might change.


Bio: Dr McCallum has been lecturing for over 20 years, with a focus on game development, XR and AI. He has taught from within VR, both in Norway and remotely from NZ to Norway. He has won awards for teaching innovation and uses a variety of technologies in his teaching. He created the first university-level game course in New Zealand in 2004 and taught Bachelor’s and Master’s degrees in Game Development at NTNU in Norway.





Clovis McEvoy
(Freelance Composer, Germany)

Title: Sound and Music in XR: Current approaches and future possibilities


Abstract: Sound and music in extended reality applications perform many of the same functions they have served in film and 2D gaming. Yet the distinctive properties of XR also open up the possibility of crafting new forms of musical engagement, education and expression. This talk presents XR from a ‘music first’ perspective, examining some of the prevailing approaches, developing trends, and my own experiences as a composer working in this medium.


Bio: Clovis McEvoy is an award-winning composer, sound and visual artist, and researcher currently based in London, England. Clovis’ creative practice is currently focused on embodied experiential works for virtual reality, realised through multi-sensory interactive installations and performances. His works have been presented in over ten countries, including America, France, Germany, Italy, England and South Korea. Throughout 2020, Clovis was artist in residence at the TAKT Institute in Berlin, Germany and at DME-Seia in Portugal. His current project is a large-scale commission from Creative New Zealand to create an interactive oral history of the country’s present and recent past – allowing stories and music to intertwine through the medium of VR.


Dr Stefan Marks
(Auckland University of Technology, NZ)

Title: Deep Dive – Finding the Story in the Data Using Immersive 3D Visualisation

Abstract: With the renaissance of virtual reality technology in 2012, scientific visualisation of complex spatial datasets can now be achieved with consumer-level hardware at a fidelity that was previously reserved for specialised and expensive CAVE facilities. Immersive 3D visualisation enables the user to “dive” into data and opens up opportunities for seeing and observing patterns, connections, and spatial and temporal relations that are difficult to discern in a 2D medium. Selected examples of scientific and educational visualisations implemented at Sentience Lab, Auckland University of Technology, will be presented, including the integration of the research and the VR facilities into under- and post-graduate teaching and interdisciplinary projects.

Bio: Dr. Stefan Marks is a researcher and senior lecturer in Creative Technologies in the School of Future Environments at Auckland University of Technology. His main areas of research are collaborative extended reality (XR) and data visualisation. He combines these interests in his function as director of Sentience Lab, a facility for the development of immersive, multimodal and multisensory data visualisation and interaction.

Stefan has eight years of industry experience as a hardware and software developer, a Diplom of Microinformatics and a Master’s Degree in Human-Computer Interaction from the Westfälische Hochschule, Germany and a PhD from The University of Auckland. He is an HEA fellow, and teaches Programming, Electronics, Computer Graphics, and Extended Reality.

Dr Ruggiero Lovreglio
(Massey University, NZ)

Title: Virtual and Augmented Reality Application for Human Behaviour in Disasters


Abstract: Natural and human-made disasters, such as earthquakes, fires, and terrorist attacks, constantly threaten humans and their built environments, resulting in loss of life and damage to property. To reduce the impact of these disasters, it is essential to design buildings that allow people to respond safely, and to train people in the best response for each type of disaster. Several new technologies have been proposed to achieve these goals, with Augmented and Virtual Reality among the most popular to be adopted. This work provides a review of existing applications and identifies common trends and research gaps.


Bio: Dr Ruggiero Lovreglio (known as Rino) is the SBE Associate Director of Research and a Senior Lecturer at Massey University (New Zealand), where he teaches Digital Construction (BIM, VR, AR and 3D scanning). He received his PhD in Civil Engineering in 2016 from the Scuola Interpolitecnica (Politecnico di Bari, Milano e Torino). His expertise is in human behaviour in disasters, evacuation modelling and simulation, and pedestrian dynamics. To date, he has investigated human behaviour in several disasters, including building fires, earthquakes, and wildfires. His research uses new technologies such as Virtual and Augmented Reality to investigate behaviour and to train people. Dr Lovreglio has published more than 40 journal articles and is the winner of the 2020 Massey Research Medal - Early Career. He is an Associate Editor for Safety Science (Elsevier) and a member of the editorial boards of Fire Technology (Springer) and Fire Safety Journal (Elsevier).

Dr Nadia Pantidi
(Victoria University of Wellington, NZ)

Title: User Experiences in XR


Abstract: A recent survey proposed that by 2025 “Extended Reality (XR) will be as ubiquitous as mobile devices in the consumer market”. This trend suggests that immersive technologies are already becoming increasingly embedded in our everyday lives and highlights the need for XR researchers to intensify investigations that go beyond technical challenges, towards the experience of users. This talk will address current approaches to understanding XR users and their contexts of use and highlight open questions around designing XR experiences, such as: What are the pertinent considerations in XR interaction? Are our current methods adequate for the design and evaluation of XR technologies?


Bio: Dr. Nadia Pantidi (BA, MEng, PhD) is a Lecturer at the Computational Media Innovation Centre (CMIC), Victoria University of Wellington. Her research interests are in the areas of Human Computer Interaction (HCI) and User Experience (UX), with a focus on understanding, evaluating and designing for people’s real-world experiences with technology. Her research leverages participatory and experience-centered design approaches aimed at enhancing community resilience and engagement. Her work is regularly published in high-impact international conferences and journals in the field of HCI, such as the Association for Computing Machinery (ACM) Conferences on Human Factors in Computing Systems (CHI), Designing Interactive Systems (DIS), Communities and Technologies (C&T), and Computer Supported Cooperative Work (CSCW).


Dr Danielle Lottridge
(University of Auckland, NZ)

Title: Evaluating XR Experiences


Abstract: You've created a compelling XR experience: how do you know that it's doing what you want it to do? In this talk, Lottridge will provide an overview of evaluation concepts, practices and techniques used in Silicon Valley industry and academic research, illustrating the possibilities with case studies and examples.



Bio: Dr Danielle Lottridge’s career spans Silicon Valley, where she conducted research for Tumblr and Yahoo and released a videochat app lauded by Apple as one of the “new apps we love”. Lottridge completed a PhD in Human Factors Engineering at the University of Toronto, then moved south for a postdoc at Stanford University in California, where she also pursued a credential in art therapy. Now a Senior Lecturer in Computer Science at the University of Auckland, Lottridge combines these areas of inquiry in new therapeutic AR/VR applications for creativity, social interaction, and mental and physical health.


Dr Nilufar Baghaei
(Massey University, NZ)

Title: Designing Individualised Virtual Reality Applications for Supporting Mental Health


Abstract: Mental health conditions pose a major challenge for individuals, healthcare systems and society – and the COVID-19 pandemic has likely worsened this issue. According to the Mental Health Foundation of New Zealand, one in five people will develop a serious mood disorder, including depression, at some time in their life. Co-designed solutions to increase resilience and well-being in young people have specifically been recognised as part of the National Suicide Prevention Strategy and the New Zealand Health Strategy. Virtual Reality in mental health is an innovative field. Recent studies support the use of VR technology in the treatment of anxiety, phobia, and pain management. However, there is little research on using VR for supporting, treating and preventing depression, and very little work on offering an individualised VR experience to improve mental health. In this talk, we present iVR, a novel individualised VR experience for enhancing people’s self-compassion and, in the long run, their mental health. We describe the design and architecture of iVR and outline the results of a recent feasibility study. Most participants believed that introducing elements of choice within iVR enhanced their user experience and that iVR had the potential to enhance people’s self-compassion. Our contribution can pave the way for large-scale efficacy testing, clinical use, and cost-effective delivery of intelligent individualised VR technology for mental health therapy in future.


Bio: Dr Nilufar Baghaei is a Senior Lecturer in Computer Science and the Director of the Games and Extended Reality Lab at Massey University. Her research interests are game-based learning, Augmented and Virtual Reality, AI in education, and persuasive technology. Nilufar has published extensively in prestigious international journals and conferences and serves as an associate editor, guest editor, editorial board member and programme committee member of several journals and conferences, including the International Journal of Medical Informatics, Frontiers in Digital Health, Games for Health, Frontiers in Virtual Reality and Smart Learning Environments. Before joining Massey, Nilufar held multiple leadership positions in the ITP sector; prior to that, she was a researcher at CSIRO, Australia. She received her PhD in Computer Science from the University of Canterbury in 2008.


Industry Panel

Melanie Langlotz
(Geo AR Games)

Bio: Melanie Langlotz has spent 20 years in the film and TV industry as a VFX artist and online editor. While on the lookout for the next media platform, she decided to move into Augmented Reality and eventually founded her own game design studio, Geo AR Games, in 2015. Geo AR Games’ flagship product is Magical Park, the world’s first AR playground, which gets kids off the couch and physically active outside in a park using mobile games. The studio takes government messages and gamifies them to help with community education in areas like environmental science and emergency management. The studio is based at Grid Auckland and likes to collaborate with other start-ups. One of those collaborations led to Geo AR Games’ involvement in the marine education industry, combining sensor technology, Mixed Reality and animatronics.


Amber Taylor
(ARA Journeys)

Bio: Amber Taylor (Ngāti Whātua, Ngāpuhi, Tainui, Ngāti Mutunga, Te Āti Awa) is the CEO of ARA Journeys, a company she co-founded in 2018. Inspired by and drawing on indigenous knowledge, ARA's digital platforms utilise AR, MR, XR and AI to achieve the key objectives of connection, education, and exploration. Their work spans five key sectors: Environment, Health (physical & mental), Education, Tourism and Information Technology. ARA was a finalist in the 2019 NZ Hi-Tech Awards and a Highly Commended finalist in the Callaghan CPrize Challenge.

Jessica Manins
(Beyond, NZ)

Bio: Jessica is a creative technology producer making VR experiences that make you smile. Having worked in both the arts and tech for 20 years, she started working in VR five years ago. Since jumping head first into this emerging sector, she has built New Zealand's R&D centre for VR, made a bunch of 360 films, and led the development of a VR therapy application that saw her become a 2017 Wellingtonian of the Year finalist for science and technology. For the past three years she has focused on social VR gaming at Beyond. Their first VR location-based entertainment (LBE) game, Oddball, is out now at Two Bit Circus in LA and in beautiful Wellington, NZ. Beyond measures success by the number of times people laugh out loud during a game; so far, it’s been a big success. Jessica is passionate about forward-thinking leadership, loves being involved in start-ups and thrives in a supportive culture that gets a thrill from change, problem-solving and high growth.


Dr Roy Davies
(University of Auckland, NZ)

Bio: See Dr Davies’ bio under Invited Speakers above.


Sakthi Priya Balaji Ranganathan
(JIX Reality, NZ)

Bio: Sakthi is a hands-on entrepreneur with expertise in digital design, cross-functional team development, human-computer interaction (HCI), and extended reality technology. Through creative foresight, drive, and resoluteness, he built JIX Reality as a high-growth research startup focused on commercializing extended reality technologies (VR, AR, MR) for the New Zealand market. Sakthi is a ‘maker’ at heart who is not afraid to roll up his sleeves and push the limits of what’s achievable digitally. JIX Reality was born in 2017, when he noticed a gap in applying XR research to real-world problems. As a self-sufficient innovator, Sakthi is skilled in creating a vision and leading teams from the start of a project, exploring new models and implementation approaches while keeping the focus on return on investment.



James Everett
(NZXR)

Bio: James is a Co-founder and Director of Wellington-based NZXR Ltd, a collective of XR experts who focus on solving hard problems with great experiences on the latest hardware and software. A game designer for nearly 20 years, he has worked on a wide range of titles, from the award-winning brick-breaker Shatter to the AAA blockbuster Splinter Cell Blacklist. Prior to co-founding NZXR Ltd, James spent five years at Magic Leap working alongside a talented Weta Workshop team, making world-first experiences like Dr. Grordbort’s Invaders and Boosters for the Magic Leap One.