Monica Gori (Italian Institute of Technology)
We live in a multisensory environment in which vision, audition, and touch collaborate to improve the quality of our interaction with the world. Multisensory signals are, moreover, integrated to enhance the quality of our perception. During development, sensory signals interact and are calibrated against each other. In this talk, I’ll present our results on how multisensory development and technology can support children and adults in the fields of rehabilitation, education, and gaming.
Georgina Powell (University of Cardiff, UK)
Multi-Sensory Environments (MSEs) are widely used to support adults and children with autism and intellectual disability, and are commonly found in schools, hospitals, and community care settings. They are adaptive spaces containing a range of equipment designed to stimulate the senses. Despite their widespread use, very little research has explored the benefits and limitations of these rooms or provided evidence-based guidance on best practice. At the Wales Autism Research Centre, we are conducting a programme of research exploring practitioners’ beliefs about and experiences of using MSEs, and combining this with observational and experimental studies within a custom-built sensory room. We have found that practitioners believe MSEs provide a number of benefits to autistic pupils that could support learning. These include increased positive mood and attention, a reduction in challenging behaviours, and the facilitation of social interaction and relationship building. Our experimental and observational studies found that a child being in control of the sensory changes within the room was important for facilitating some of these positive outcomes. In ongoing research, we are investigating the use of sensory spaces and equipment within supported living environments for adults with autism and intellectual disability. We are interested in whether smart technologies could be used to support sensory needs and wellbeing within a home environment, building on an ongoing programme of work trialling the use of smart technology in social care settings.
Irene Valori (University of Padova, Italy)
Neurodiverse bodily self-experiences are early markers of heterogeneous neurodevelopmental conditions (such as autism) and seem to be under-targeted in research and clinical approaches. Multimedia technologies and virtual reality offer unique possibilities to manipulate the sensory, motor, interpersonal, and cognitive processes contributing to body ownership, location, and agency. I will present some practical examples of multimedia and virtual reality activities we have designed and tested with children in recent years, aimed at fostering exploration and learning through multisensory and bodily experiences, which are crucial for building a coherent sense of self and laying the foundation for interacting with the external world.
Giacomo Vivanti (A.J. Drexel Autism Institute, Philadelphia, USA)
Children diagnosed with autism are capable of learning vast amounts of material in specific areas - yet, they often show learning delays across multiple domains. Additionally, they show an intact ability to learn from the outcomes of their own actions, but difficulties learning from others’ actions and communication. We will address the nature of these puzzling phenomena by examining recent research on early learning processes in autism. Using innovative methodologies, research is unveiling the mechanisms through which early learning in typical development is built on the child’s self-directed engagement with their social environment, and how in autism this process is altered by differences in early emerging attentional, motivational, and cognitive processes that support engagement with people and objects. Implications for adapting clinical and educational interventions to the unique learning style of children diagnosed with autism will also be discussed.
Merle Fairhurst (Universität der Bundeswehr, Germany)
In this talk, I will discuss how humans can feel connected either in space, through physical touch such as a handshake or a hug, or in time, through coordinated activity such as group music making. In both cases, an exchange of rich sensory information allows interacting individuals to derive a sense of co-presence. In the first part of the talk, I will present my account of co-presence and, in particular, how it relates to sensory experiences in virtual reality (VR). VR technology offers unique ways to run empirically controlled experiments that approximate rich, multisensory social exchanges. In the second part of the talk, I will therefore present my recent work using VR to investigate the dynamics of group interactions (in this case, marching or walking together). In particular, I will detail how we can measure both objective and subjective coordination at the group level. Moreover, I will share how variations of this design could extend previous work investigating interaction dynamics in autistic people. I will then present our recent work using VR to investigate affective touch and, in particular, how this approach is helping us probe the factor of context (e.g. touch within a medical context, or from a family member or friend). I will conclude by discussing how VR can be used to promote a sense of connectedness through co-presence, as well as the potential of these approaches to better understand the ways in which autistic people can feel connected to others.
Sean Lynch (University of Tokyo, Japan)
Motor contagion refers to involuntary behaviour produced as a result of perceiving the behaviour of another individual. Here we propose a technical approach for studying the sense of body ownership and the presence of motor contagion, using a humanoid robot together with virtual reality head-mounted displays. Preliminary findings are presented from fourteen participants who mimicked movements of the robot congruently or incongruently in different orientations (face-to-face and first-person). In the congruent condition, participants followed the motion of the robot; in the incongruent condition, they produced a movement in the orthogonal direction. We measured the variance of the motion made by the participants during these tasks. The results showed greater variance while embodied in the robot during incongruent motion compared to the face-to-face incongruent condition. This study presents a novel approach for future research, including the investigation of motor contagion in autism while additionally considering multisensory integration from a first-person perspective.
Letizia Della Longa (University of Padova, Italy)
In a virtual reality (VR) environment, social interaction occurs between representations of others made accessible to the senses via technological devices. Being primarily visual and auditory, the VR experience is usually impoverished of touch, a communication channel with unique potential for social interaction and connection. In recent years, researchers have tried to bring touch into VR, raising interesting questions about the impact of virtual tactile interactions on the perceived virtual version of one’s identity and on the capacity to create meaningful self-other connections. In particular, a new challenge is to develop multisensory and more immersive social VR platforms to reshape sensory and bodily experiences and social connection for people with atypical development and unique sensory functioning, such as autistic individuals, for whom real and virtual experiences might have distinct sensory implications. Autistic individuals show reduced sensitivity to visual manipulations of the bodily self, which would limit the possibility of intervention through the visual channel, while they seem to rely heavily on somatosensory cues, of which touch is particularly powerful in connecting the self and the other. The potential of leveraging tactile inputs in VR to foster bodily illusions and stimulate social connectedness in autistic people represents a new and challenging field of investigation. Future research could explore innovative ways to adapt tactile stimuli to an individual’s functioning and needs in order to re-shape sensory thresholds, bodily perceptions, and social engagement, with long-lasting effects that generalise to the real world.
Jane Aspell (Anglia Ruskin University, UK)
There is some evidence that disordered self-processing in autistic people is linked to the differences in social behaviour characteristic of the condition. To investigate whether bodily self-consciousness is altered in autism spectrum disorders as a result of multisensory processing differences, we tested responses to a VR-generated full body illusion in 22 autistic adults and 29 neurotypical adults. In the full body illusion set-up, participants wore a VR head-mounted display showing a view of their ‘virtual body’ being stroked synchronously or asynchronously with respect to felt stroking on their back. After stroking, we measured the drift in perceived self-location and self-identification with the virtual body. The results showed that participants with autism spectrum disorders were markedly less susceptible to the full body illusion, not demonstrating the illusory self-identification and self-location drift shown by neurotypical participants. Strength of self-identification was negatively correlated with severity of autistic traits and contributed positively to empathy scores. These results suggest that bodily self-consciousness is altered in autistic participants due to differences in multisensory integration, and this may be linked to differences in social functioning.
Alice Tennant (Ulster University, UK)
Overwhelming sensory experiences are a barrier to autistic people’s participation in daily-living activities. It has been identified that the neurological differences of autistic people lead to distinct sensory experiences compared to those of non-autistic people. In short, this is not something we should attempt to cure or modify. Rather, we should aim to understand, accept, facilitate, and integrate autistic people into our society by making it more accessible. Barriers to participation in daily-life activities are significant, as they can impact an individual’s ability to successfully access education, healthcare, travel, employment, and independent living. Little is known about cognitively experienced sensory barriers in the environment, or how they could be managed to enable access to the same products and services that the non-autistic population use with ease.
The autistic community shows great skill and talent, as well as a great desire to be more active members of the population; however, the employment rate in the autistic population is just 16%. Prioritising access to public services and ways to improve life skills, such as the ability to ‘prepare for things that scare me in the real world’ or ‘go to places that I am unsure of in real life’, are things autistic people want help to achieve, with sensory environments cited as one of their main obstacles.
At the time of writing, no interactive tools that allow an autistic individual to preview and understand a sensory environment have been found. Existing sensory studies aim to assist with language, travel planning, contextual explanation, sensory regulation, and the education of the non-autistic population about autistic sensory experiences. None, however, provides a tool for the autistic user to plan and practise their activity within varying sensory environments. As each experience is distinct for each person and each situation, exposure to high-fidelity cognitive load in a controlled way is required. While accessible design is well established, there is very little research into cognitive accessibility within the design and implementation of immersive interfaces and experiences.
This research hypothesises that, through human-centred design, a mixed-methods approach, and participation from the autistic community, an appropriately designed tool to support access to daily-living activities can be realised. This work will aim to establish: a temporal database of information about sensory stimuli within an environment over a period of time; feedback on sensory barriers identified by autistic people; a framework for surveying the sensory environment; and a guide to cognitively accessible interaction design within the immersive context.
Matthew Schmidt (University of Florida, USA)
As a major effort in promoting participatory and inclusive design that involves autistic individuals throughout an iterative design and evaluation process, Project PHoENIX (Participatory, Human-centered, Equitable, Neurodiverse, Inclusive, eXtended Reality) focuses on autistic adults who are transitioning from secondary education into adult life. In the current study, we seek to explore autistic adults' perceptions of commercial, off-the-shelf virtual reality tools and how autistic users believe these tools might be useful for them and responsive to their needs and preferences. The PHoENIX VR environment was designed using learning experience design (LXD) methods and processes (i.e., empathy interview, empathy mapping, personas), and is undergoing a process of continuous refinement to promote inclusion, relevance, and positive learning experiences. The current version of the VR environment consists of three individual spaces, namely, Mozilla Hubs Training for Autistic Users, The VR for Autism Gallery, and The Social VR Tools Gallery. Each of these spaces has unique features and objectives aligned with project goals.
Franca Garzotto’s group (Politecnico Milano, Italy)
Autism refers to a broad range of conditions characterized by differences in social skills, repetitive behaviors, speech, and nonverbal communication. To address the challenges these differences can pose and improve the quality of life of autistic persons, we developed 5A. 5A provides innovative tools based on smartphones and wearable headsets that integrate Virtual Reality (VR), Augmented Reality (AR), and Conversational Agents (CA). These applications help autistic persons to understand the environmental and socio-organizational characteristics of everyday life scenarios, especially those related to urban mobility, and to perform related tasks correctly. Experiences in VR allow users to simulate typical tasks in such scenarios, e.g., taking the subway. When people find themselves in similar situations in real life, AR helps them generalize the skills acquired in VR by superimposing interactive multimedia elements on the view of the surrounding physical world. In both VR and AR, a personalized Virtual Conversational Assistant provides prompts and personalized feedback. The first use cases developed focused on autonomy in urban mobility, and here we will present the design and development process of the metro scenario.
Emily Isenstein (University of Rochester, USA)
Tactile hyper-sensitivity is a common feature of autism, wherein physical contact can be overwhelming and aversive. The physical, cognitive, and social implications of this sensory difference are not well understood, particularly the differentiation between how active, self-initiated touch is processed compared to passive, externally-initiated touch. While external tactile input is often undesirable or even painful, many autistic people seek out self-stimulatory tactile sensations, such as finger-tapping or head-banging. Autistic adults report that these predictable, self-generated actions play a role in their self-regulation of overwhelming emotional and sensory experiences, but these behaviors face substantial stigma and remain understudied. Little research has objectively investigated how the type of somatosensory stimulation affects the neural mechanisms underlying tactile sensitivity. Using a combination of electroencephalography (EEG) and virtual reality (VR), this project will assess whether the source and predictability of tactile stimulation influence the cortical sensitivity to that sensory input.
Elliot Millington (University of Glasgow, UK)
Participatory research is the deliberate inclusion of autistic people in the research process. While there are degrees of participation, the most powerful projects include autistic people from the project’s conception to its dissemination. Engaging in participatory practices reduces the power imbalances present in academia, encouraging better quality research that better fits the priorities of the autistic community. Virtual Reality is a novel technology, bringing with it plenty of excitement. This has led to interdisciplinary teams developing applications intended to intervene in the lives of autistic people, but without lessons learned from autistic voices. Virtual Reality projects are perfect for participatory methods as accessibility issues common for autistic people are novel and may require novel solutions, as well as rethinking research directions.
Sarune Savickaite (University of Glasgow, UK)
Virtual Reality is now transitioning from a novel to an established technology. This has been accompanied by many researchers trying to use VR, with varying degrees of effectiveness, and autism research has been no exception. Sarune will be joining us to talk through an upcoming review she has written on VR in autism research, describing where research efforts are being directed and how the studies could be improved.
Sonny Russell (Children’s Hospital of Philadelphia, USA)
In the USA, one fifth of adolescents with ASD will be stopped and questioned by police before their early twenties. Providing opportunities to practice this kind of encounter is essential as police interactions are challenging, unexpected, include unusual sensory stimulation, require novel problem solving, and necessitate rapidly processing social situations in real time. The COVID-19 pandemic has rendered it near-impossible to safely implement in-person behavioral interventions. Virtual reality (VR) is uniquely poised to smoothly translate to a remote administration and can be a useful tool for practicing police encounters in a safe, simulated environment. In this talk, I will describe how the Floreo Police Safety VR intervention was implemented via teletherapy and provide preliminary evidence for the feasibility of this approach with autistic adolescents and adults.