Chapter 4: Audiences & Accessibility

by Sara K. Johnson, 27 April 2022

(Above: the wheel that allows players to select different emotes in Fortnite)

In 2020, at the onset of the COVID-19 pandemic, the Monterey Bay Aquarium began streaming Nintendo’s wildly popular Animal Crossing: New Horizons (ACNH) on its Twitch.tv and YouTube channels to expand its virtual offerings to visitors. It invited aquarium staff, scientists, and marine biologists to join its stream sessions and use the video game to have conversations about the marine life already part of ACNH in comparison with real-world aquarium collections. “While playing ACNH, a member of our social media staff noticed that many of the fish players collect are species found in Monterey Bay or have connections to our conservation work – and had the idea that they would make a perfect virtual exhibit hall of the aquarium during our closure [during the pandemic]” (Allen-Greil & Simpson, 2021, p. 100). Although streaming a video game doesn’t qualify as participating in the metaverse, the Aquarium’s community engagement practices for its streams offer insights into how to manage synchronous online communities.

Community Guidelines

The metaverse, or at least where it seems to be headed, is largely unmoderated and somewhat of a free-for-all, depending on the platform. If a museum wants to create a metaverse experience, it may be in its best interest, for the safety of its virtual visitors, to establish some parameters for public discourse, conversation, and interaction. While hosting its Twitch.tv streams, the Aquarium established rules for people joining the live chat, closely linked to the Aquarium’s Social Media Community Guidelines, “to be sure that people would feel safe and included (and so we would feel comfortable enforcing those rules)” (ibid, p. 103). Monterey Bay Aquarium also recruited several volunteer channel subscribers to serve as moderators during livestreams. “Having real human moderators in tandem with [an automated one] helped us to engage with our audiences more directly during streams and helped create a welcoming and inclusive environment on our channel” (ibid, p. 104).

In some cases, creating community moderation may require working directly with the platform that is hosting your museum metaverse experience. One of the features that metaverse users enjoy is being able to use animated emotes with their representative avatars. This could take the form of a simple gesture like a hand wave, or something more expressive like dancing, playing air guitar, or waving a banner. However, not all emotes are appropriate for all contexts.

When the March Through Time experience first launched in Fortnite (read more on this in Chapter 2), players were able to use a set of eight new event-specific emotes that Epic Games created, such as a sitting emote and a protest emote. Unfortunately, Epic Games also made every other emote in the game available at launch, such as the “Gangnam Style” dance, making it possible for players to dance around the Lincoln Memorial Reflecting Pool in Washington, D.C., while dressed as Rick Sanchez from the popular cartoon Rick and Morty, as Dr. King gave his “I Have a Dream” speech on the large screens in the game.

Twenty-four hours after the launch of the event, Epic Games announced that it was disabling all emotes except the eight event-specific ones from the March Through Time experience. “It should be noted that Epic actually disabled some toxic emotes, including the ability to toss tomatoes, right at the launch of the event. So it seems the publisher was aware that players using emotes distastefully was a potential problem, making it odd that the company even waited 24 hours before deactivating all non-approved emotes” (Zweizen, 2021).

Accessibility

Companies and developers from the tech side of the metaverse are also concerned with accessibility options, especially as hardware such as VR headsets often has accessibility issues. Using a device capable of bringing a user into the metaverse requires gestures, pinching, waving, and other physical hand movements that someone with motor or dexterity impairments may find difficult or impossible. Accessibility in the metaverse mirrors the concerns of creating accessible spaces in the physical realm. “Even an immersive metaverse experience with lively quests and 3D replicas of your products should have simpler options, like a basic ecommerce website with photos and text descriptions” (Harder & Pinas, 2022).

Avatars are another topic to consider with accessibility in the metaverse. “What of disability identity in a world where someone can morph into any avatar of their choosing? Will people with disabilities shun their real-world identity and cast away their chairs and crutches to assume a more dominant athletic form or loudly and proudly announce their identity as an individual with a disability?” (Alexiou, 2022).

“One challenge will be that the metaverse, by its very nature, is intended to be a visually richer world than what we have today. This will pose further complexities for blind and low vision users. One feature that may be able to offset some of these limitations is the smart use of touch and haptics” (Alexiou, 2022). Audio is an important factor for metaverse visitors who are low-vision or visually impaired. Being able to move and look around in 360 degrees means that the experience must include localized, binaural sound – something that is not currently possible on some platforms, such as Decentraland.org (DCL).
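To give a sense of what “localized, binaural” sound actually involves, the sketch below computes the two basic cues a spatial audio engine derives from a source’s direction: the interaural time difference (using Woodworth’s spherical-head approximation) and a crude interaural level difference. This is an illustrative model only – not how Decentraland or any particular engine implements audio – and the function names and constants are assumptions for the example.

```python
import math

SPEED_OF_SOUND = 343.0   # m/s, in air at room temperature
HEAD_RADIUS = 0.0875     # approximate human head radius in meters

def binaural_cues(azimuth_deg):
    """Return (left_gain, right_gain, itd_seconds) for a sound source.

    azimuth_deg: angle of the source relative to where the listener
    faces (0 = straight ahead, 90 = directly to the right).
    """
    az = math.radians(azimuth_deg)
    # Woodworth's ITD approximation: (r / c) * (sin(az) + az), valid
    # for sources in the frontal half-plane (|az| <= 90 degrees).
    itd = (HEAD_RADIUS / SPEED_OF_SOUND) * (math.sin(az) + az)
    # Crude level difference: attenuate the ear facing away from the source.
    right_gain = 0.5 * (1.0 + math.sin(az))
    left_gain = 1.0 - right_gain
    return left_gain, right_gain, itd
```

A source directly ahead yields equal gains and no time difference; as it moves to the listener’s right, the right ear receives the signal slightly earlier and louder. Rendering these cues per-listener, per-frame is what platforms without positional audio support cannot do.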

On December 9, 2021, the Toledo Museum of Art (TMA) launched an immersive experience on a metaverse platform called Decentraland (DCL). The experience coincides with an installation in the real-world galleries of Stan Douglas’s 2019 film Doppelgänger. Curator of Contemporary Art Jessica Hong discusses the installation in an accompanying video.

In Doppelgänger in Decentraland, “avatars are immersed in a starfield as seen in the film, then teleport into, and move in and out of its architectural and cosmological spaces, including the meticulously detailed observation room, Alice’s examination room and mission control. The complex narrative arc is revealed in layers, transitioning between the two versions and rewarding multiple viewings” (“Toledo Museum of Art Offers Viewers New Experience of Artist Stan Douglas’ Film Doppelgänger in the Metaverse,” 2021). The DCL Doppelgänger experience will only be live until May 15, 2022.

In March 2022, I spoke with Nate Sloan, Senior Technical Support Specialist, and Jessica Hong, Curator of Modern & Contemporary Art at the Toledo Museum of Art, both of whom worked on Doppelgänger. Hong and Sloan helped spearhead the project within the museum, and collaborated with the artist, Stan Douglas, the creative development studio Parameta, and the metaverse building firm Polygonal Mind.

Sloan and Hong explained in our interview that the collaborators and TMA had to come up with some creative back-end solutions to create an immersive soundscape for the exhibit, since other projects in DCL do not include sound options in this way. One important feature of Doppelgänger is that depending on which side of the exhibit you’re on, you will hear different lines of dialogue even though the two sides of the experience are mirror images of each other. Without being able to hear the dialogue, the impact and context of the experience are lost, as the “sets” on each side are near-identical and don’t provide context clues on their own.
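The side-dependent dialogue described above amounts to a zone check: which half of the mirrored exhibit an avatar occupies determines which version’s audio plays. A minimal sketch, assuming a mirror plane at a fixed coordinate (the track names and layout here are illustrative, not TMA’s actual implementation):

```python
def dialogue_track(avatar_x, mirror_plane_x=0.0):
    """Pick which version's dialogue an avatar should hear.

    The exhibit is mirrored across a central plane: avatars on one
    side hear one version of the film's dialogue, and avatars on the
    other side hear the alternate version.
    """
    if avatar_x < mirror_plane_x:
        return "version_a_dialogue"
    return "version_b_dialogue"
```

The key design point is that the selection happens per-avatar on the client, so two visitors standing on opposite sides of the same near-identical set hear different lines at the same moment.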

Unfortunately, Decentraland doesn’t offer closed-captioning, and it would be difficult to add multiple languages to captioning even if it were available. Captioning in a metaverse-like environment is possible, as evidenced by the captions of Dr. Martin Luther King Jr.’s speech in Fortnite’s March Through Time (MTT) experience, which always appear regardless of which way the player is looking. However, it is not clear whether the captioning in MTT is available in other languages.

A screenshot of the entrance to Doppelgänger in Decentraland.org. There is a huge sphere at the top of the screen that looks like an orange sun. In the background is blue sky and a line of cartoon trees. A sign in the far distance reads "GOLFCRAFT". Underneath the sun shape is a pillar of green light. Wrapped around the pillar are the words "Toledo Museum of Art, Doppelgänger by Stan Douglas"

The entrance to Doppelgänger in Decentraland.org. Users approach the green pillar and can click a floating orb to enter the exhibit experience.

An avatar in a blue shirt and black and pink track pants floats in a galaxy of stars. There is a small white sphere at the center of the screen, with an icon above that says "Enter..."

The first scene of the Doppelgänger experience, representing the main character, Alice, and her journey through outer space. Users click the small sphere at the center of the screen to teleport into the next scene.

An avatar with blue shirt and black and pink track pants stands near a cube-shaped building with two door openings. There are two colored orbs floating in front of the avatar. On the left of the screen are rows of computer desks and chairs, with large tv screens showing a rocket launch on the wall of the command center.

Avatars can visit two sides of the room; each side representing a different version of the film. Entering the rooms on the right, users can hear voiceover of the two hospital scenes from the film. The command center on the left has two mirrored halves, with different voiceover. Users use the orbs at the center of the screen or a button in the command center to return to the start of the experience, where they can choose to visit the mirrored side.

Polyarc Games’ 2019 VR game, Moss, has a character that uses simple American Sign Language to communicate with players, and captions are available that float on screen. Can I Play That?, a website dedicated to video game accessibility topics, reviewed the game in 2019 and was impressed with the full captioning for speech. However, “The one and only area we wish Polyarc had done better is in providing full captioning and not just subtitles. . . we wish they’d taken that one final step to allow for full immersion for deaf and [hard of hearing] players. The subtle buzz of flies in the forest, the sound of bird wings flapping as crows take flight, the movement in the brush when you encounter a deer and it reacts to your presence. It’s these little things that make this game what it is, and . . . we do wish these things were captioned” (Deaf Game Review - Moss, 2019). But the captioning, and the decision to design a character who uses ASL to communicate, show that these efforts toward accessibility can be integrated into virtual experiences.

With current technologies, the metaverse as it exists today is not equitable for all. Haptic technologies such as gloves that create sensations of touching or holding virtual objects and textures are on the way. “The haptics market is expected to grow by $15.84 billion between 2021 and 2025,” according to a 2021 analysis by Technavio (The Best Haptic VR Devices and Innovations for 2022, 2022). But will these haptic devices have all users in mind? Will haptic gloves be designed for users with hand differences, or for amputees? Will the price point of haptic devices be affordable and accessible?

Who Has Your Data?

“As the technology continues to evolve to even greater levels of sophistication and ubiquitous-ness, the volumes of very personal data that can be collected and stored is almost unfathomable. This also means that the amount of data that can be accessed and compromised will be not only increasingly vast but increasingly vulnerable as well” (Buyer, 2022). While I won’t dive into Big Data in this project, other important issues to consider in the metaverse include data privacy (or the lack thereof); AI and algorithms; sharing data with and disclosing it to third parties; tracking users even when they are not using the app or product; the abundance of advertising; and the sharing of eye-tracking and attention data.

For further reading, I recommend this article from ITP.net by Lara Abouelkheir.