Extended Reality (XR)

Holographic AR/MR/XR for Human-System Interfacing

Augmented reality (AR) has been an integral part of our research in CPS for human-system interfacing, including human-on-the-loop (HoL) and human-in-the-loop (HiL) MRI-based robot control. Indeed, AR was used extensively in the MIROS CPS project, which first introduced the concept of updating the AR scene (spatiotemporally co-registered renderings and the actual MR images) based on MR images as they were collected, i.e., on-the-fly.


During the MR-Bots CPS project, our team further developed AR interfaces by incorporating computer-generated holograms (CGH). These CGHs offer unique opportunities for immersing the operators of a CPS in 3D or 4D information. With the advent of head-mounted displays (HMD) and AR-enabled handheld devices (HHD), such multi-dimensional and multi-modal immersion has an ever-growing role in the radiological sciences and in surgical/interventional practices.


Our work is based on our in-house developed software framework FI3D, and we investigate holographics on HMDs using the HoloLens 2, as well as on HHDs running iOS (iPads, iPhones) and Android, using Unreal Engine.


Vision & Motivation: We pursued computer-generated holograms for the sole purpose of effective and intuitive interfacing to sensors, data, processes, and HiL/HoL control of effectors of image-based CPS.

Are these true holograms? No, of course not (in the eyes of this Physicist, who was taught laser-generated holograms)! However, while these computer-generated holograms are not true holograms, one may claim that they do fit the original meaning of the word hologram. Indeed, the word originates from the combination of two Greek words: ὅλος/ὅλο (*) (entire/entirety; the same root as the words “holistic” and “whole”) and γράμμα/γραφή (writing or recording). Since these computer-generated pseudo-holograms enable the user to have a 3D appreciation of a 3D structure, we can use the term hologram to describe them.


Endowing our CPS human-system interfaces with interactive holographic means has resulted in the themes below, which we are actively pursuing.

(*) I am an absolute devotee of the polytonic system, and indeed of the euphony of the Greek language…. (The lockdown was not cool)

Multi-Site & Multi-User Shared Holograms

Motivated by the "Covid-tech fever" and inspired toward "co-actions" and synergy:

  • improving STEM education

as well as with our Trans-Atlantic colleagues:

  • co-processing 3D/4D imaging data

  • co-operating MRI scanners and

  • co-servoing robotic devices

we have embarked on yet another journey: using our FI3D framework to enable users/operators/trainees/educators in different locations and on different platforms to share and interact with the same holographic scene.
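Conceptually, such multi-user sharing can be modeled as a hub that keeps one authoritative scene state and re-broadcasts each client's interaction to all connected peers. The sketch below is a minimal, hypothetical illustration of that pattern; the class and message names are assumptions for this example and are not the actual FI3D API.

```python
# Minimal sketch of a shared-scene hub: one authoritative scene state,
# every client update is applied centrally and re-broadcast to all peers.
# Hypothetical illustration only -- NOT the actual FI3D protocol.

class SceneHub:
    """Keeps the authoritative holographic scene state for all clients."""

    def __init__(self):
        self.state = {}      # object id -> transform (e.g. position/rotation)
        self.clients = {}    # client id -> callback receiving (object_id, transform)

    def connect(self, client_id, callback):
        self.clients[client_id] = callback
        # New clients receive the full current scene so all views match.
        for obj_id, transform in self.state.items():
            callback(obj_id, transform)

    def update(self, sender_id, obj_id, transform):
        """Apply one client's interaction and broadcast it to every peer."""
        self.state[obj_id] = transform
        for cid, cb in self.clients.items():
            if cid != sender_id:
                cb(obj_id, transform)
```

In a deployment each callback would be a network send (e.g., over a socket) to a HoloLens, iOS, or Android client; here it is a plain function call so the broadcast logic stands alone.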

The videos below show tests of the multi-user & multi-site FI3D module for the synergistic interactive visualization of a holographic AR scene that contains cardiac CINE MR together with renderings of the LV endocardium and epicardium extracted from this CINE set. (Left) 1xHoloLens 2 and 1xiOS; note how the operator's hands are rendered in the HoloLens view. (Right) Three users at three different physical locations, with 1xHoloLens, 1xiOS and 1xAndroid, sharing and interacting with the same hologram.

Cardiac Multi-User.mp4
HoloMulti-All3.mov

Tele-Scanning: Interactive remote MRI-scanner control

via Holographic Augmented Reality

Under the highly evangelized and, indeed quite meritorious, ever-growing practice of enhanced point-of-care, we further investigate the multi-site deployment of imaging and interventional systems. In this light, in collaboration with Dr. Kirsten Koolstra and Prof. Andrew G. Webb, we recently developed and present a version of

Interactive remote MRI-scanner control via Holographic AR Interface

whereby two-way communication is established between the scanner and the radiologist or interventionalist for data processing and scanner control on-the-fly (while the patient is inside the MR scanner). Using the FI3D framework, an operator in Houston receives the images as collected at an MR scanner in Leiden, The Netherlands, processes and segments them, and updates a rendering of the structure; based on it, the operator commands the scanner to collect complementary data (changing the orientation of slices, contrast, etc.). Check the video below.
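The round trip described above (image in, processing, command out) can be sketched as a simple loop. The function names and the shape of the command are hypothetical stand-ins for this illustration, not the actual FI3D/scanner message protocol.

```python
# Sketch of the bi-directional tele-scanning loop: the remote operator
# receives each image as acquired, processes it, and replies with an
# updated acquisition command. Names are hypothetical placeholders,
# not the actual FI3D/scanner protocol.

def operator_step(image, segment, plan_next):
    """One round trip: image in -> segmentation -> next-scan command out."""
    mask = segment(image)        # e.g. delineate the target structure
    return plan_next(mask)       # e.g. new slice orientation / contrast

def tele_scan(images, segment, plan_next):
    """Drive the scanner with a command derived from each incoming image."""
    commands = []
    for image in images:
        commands.append(operator_step(image, segment, plan_next))
    return commands
```

The point of the structure is that the scanner never idles waiting for a full study review: every acquired image immediately yields the next acquisition command.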

Bi-directional data and command flow between the MR site in Leiden (The Netherlands) and Houston (USA). The operator in Houston uses a HoloLens to both visualize the reconstructed AR scene and control the scanner (grasp & voice).

2022ISMRM_HoloMRI.mp4

MRI-based Surgical Planning with Holographic Augmented Reality

We developed holographic AR interfaces, initially on the HoloLens 1 and currently on the HoloLens 2, using Unreal Engine for the AR scenery, with all processing, data management and communication based on our framework FI3D. Early versions developed for the MIROS CPS clearly illustrated the potential, as well as the limitations, of the available HMD and HHD technologies. For our CPS we needed, on the fly, to:

  • interact with the MR scanner and generated data,

  • customize and interactively use conventional and ML-powered gadgets to process them,

  • generate & update AR scenes with renderings and images,

  • interactively plan interventions and

  • run robots.

All these processes are performed via a HoloLens 2 that is used as the I/O device for the sought human-centric interface.

We have developed and investigated such holographic AR interfaces with HoloLens for visualization of

  • Neurological Interventions

  • Cardiac CINE MRI with LV renderings based on CINE images segmented using fully convolutional neural networks (FCNN) and U-Nets.
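One post-processing step in this pipeline is turning a per-frame binary segmentation mask (as a U-Net would output) into a contour that can be rendered in the AR scene. A minimal sketch of that step, assuming a plain 2D binary mask and 4-connectivity (our actual rendering pipeline is not shown here):

```python
# Hypothetical post-processing step: turn a binary LV segmentation mask
# into boundary pixels that could be rendered as an endocardial contour
# in the AR scene. Illustrative only; not the actual FI3D pipeline.

def mask_boundary(mask):
    """Return (row, col) pixels of the mask that touch the background."""
    rows, cols = len(mask), len(mask[0])
    boundary = []
    for r in range(rows):
        for c in range(cols):
            if not mask[r][c]:
                continue
            for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                rr, cc = r + dr, c + dc
                # Outside the image, or a background 4-neighbor => boundary.
                if not (0 <= rr < rows and 0 <= cc < cols) or not mask[rr][cc]:
                    boundary.append((r, c))
                    break
    return boundary
```

Repeating this per CINE frame yields a time-resolved contour, which is what makes the 4D (3D + time) holographic rendering possible.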

Cardiac AR.mp4
Cardiac Demo.mp4

In Silico "MRI-Servoing" of a Prostate Robot

MRI-servoing control of interventional robots was pursued as early as in the MIROS project. However, it was the advent of holographics and the development of the FI3D framework that enabled us to integrate processes and operators in a streamlined way. In close collaboration with Nikhil Navkar, we developed modules for robot control and performed in silico user studies for MRI-guided prostate interventions.
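Schematically, image-based servoing closes a loop between imaging feedback and robot motion: at each imaging update the target is localized and the robot is stepped toward it. The sketch below uses a simple proportional law for illustration; it is an assumption for this example, not the controller used in the studies.

```python
# Schematic "MRI-servoing" loop: at each imaging update the target is
# localized and the robot tip is stepped toward it with a proportional
# control law. Illustrative sketch only, not the actual controller.

def servo_step(robot_pos, target_pos, gain=0.5):
    """One proportional-control step toward the imaged target position."""
    return tuple(p + gain * (t - p) for p, t in zip(robot_pos, target_pos))

def servo(robot_pos, target_pos, steps=20, gain=0.5, tol=1e-3):
    """Iterate image -> control updates until the tip reaches the target."""
    for _ in range(steps):
        robot_pos = servo_step(robot_pos, target_pos, gain)
        # Stop once every coordinate is within tolerance of the target.
        if max(abs(t - p) for p, t in zip(robot_pos, target_pos)) < tol:
            break
    return robot_pos
```

In the in silico studies the "imaging update" is the MR-derived target pose and the step is a command to the robot controller; here both are plain tuples so the loop structure is self-contained.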

ProstateBiopsy AR.mp4

The different aspects of this work were supported in part by the National Science Foundation Grant CNS-1646566. All opinions and conclusions or recommendations expressed in this web page reflect the views of the authors and not our sponsors.