PARALLELS: Analog vs. Digital

1. A game which crosses the boundary between digital and analog

Hidden from our eyes is a digital world that we cannot see, where viruses roam freely. However, the border between the digital world and the physical world is slowly eroding. A virus has managed to escape and is starting to invade the analog world. As humans, we have to work together in order to stop the virus from taking over the world.

Figure 1: Digital vs Analog logo originally used for the teaser on the NMNT website.

The analog era is often looked back on with a certain romanticism: it was a time when things were simpler for many. We want people to experience the dissonance and the parallels between the digital and the analog world, between the real and the virtual environment.

The constant rise of digitalization in the world around us is seen as a threat by many people. In our project we want to enable people to fight this threat by working together in the real world. Ironically, to do this, we are using cutting-edge digital technology.

By constantly growing towards the player, the virus becomes a metaphor for this digitalization process threatening them and their ship. By combining the real, analog world with a digital game in one project, we have tried to capture the tension between these parts, as well as the constant need to fight for a healthy balance between the two components of our modern life.

2. Interactions at the exposition

During the event we received many visitors who were enthusiastic to play our game. People were eager not only to try the digital game, but also to interact in the physical space.

Figure 2: Overview of the exposition setup

One interesting interaction happened when a group of children started playing the game on the second day. They were quick to pick up the digital game and rapidly got rather good at it. More interestingly, the children started playing highly collaboratively, asking each other for help in the physical space while they were playing, working together to reach the highest scores. They even invented gameplay patterns which were not actually supported, for example by running closely around the lights with their hands nearby. And their methods worked: at the end of the exposition, 7 of the top 10 scores were attained by this same group of children.

Figure 3: A child playing the game.

3. How the game was built

The system was essentially composed of four separate components: the game itself, the light controllers, the person-tracking system, and a web server that enabled communication between these parts. In the sections below we go into considerable detail on each of them. If anything is unclear, we would like to point you to the accompanying GitHub repository: https://github.com/MGelein/humansvsaliens.

Each of the parts has a separate folder in that repository, and most of the code is commented. All of the components communicated over a wireless network that we set up using a router. On that router the server had a static IP address, which allowed the other components to always find it; this made our job easier, since we could simply hardcode the server address.

3.1 Playing a digital game

The digital game could be considered the primary interaction, since without it none of the other parts would have a function. However, it should be noted that without the other components the game would be just that: a game. A retro 8-bit aesthetic was chosen to capture the nostalgic feeling that most people have towards old analog equipment.

Figure 4: The opening screen. Normally this would be filled with the top 5 highest scores.

This screen was meant to lure people in and demand attention from a large distance.

The entire game was coded in Processing, using nothing more than the Sound library that Processing itself provides. Both the engine and the content were written by us. Let’s talk about the subsystems of the game. The most obvious one is the virus mechanic. To simulate a growing virus we used a cellular automaton algorithm that we designed and tuned ourselves. The algorithm went through multiple phases; some of its buggy or unintended behaviour turned out to produce a great glitchy effect and was kept for future iterations. Tuning this algorithm is a very finicky process, since there are multiple variables that all depend on each other and may also have unforeseen side effects.
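The exact rules and tuning live in the repository; as a rough illustration of the idea only, a health-based cellular automaton step could look like the following Python sketch (the grid representation, probabilities, and decay value are all invented for this example):

```python
import random

# Minimal health-based cellular automaton, illustrating the idea only:
# every living cell may spread to an empty neighbour with a probability
# proportional to its health, and its health slowly decays towards death.
# Grid values: 0 = empty, >0 = virus health (0..100).
def step(grid, spread_chance=0.3, decay=1):
    h, w = len(grid), len(grid[0])
    nxt = [row[:] for row in grid]
    for y in range(h):
        for x in range(w):
            health = grid[y][x]
            if health <= 0:
                continue
            # chance to procreate into a random empty neighbour, scaled by health
            if random.random() < spread_chance * (health / 100):
                dx, dy = random.choice([(-1, 0), (1, 0), (0, -1), (0, 1)])
                nx, ny = x + dx, y + dy
                if 0 <= nx < w and 0 <= ny < h and grid[ny][nx] == 0:
                    nxt[ny][nx] = health
            # health decays each update; the cell dies when it hits zero
            nxt[y][x] = max(0, health - decay)
    return nxt
```

Even in this toy version the finickiness shows: `spread_chance` and `decay` pull against each other, and a small change to either tips the simulation from dying out to overrunning the grid.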

Figure 5: A still of the game. The ship is currently mirror-controlled. In the virus layer you can see two dots, signifying tracked players in the playing field.

Each of the cells in the simulation mentioned above had a certain amount of ‘health’. This influenced its speed of procreation, as well as its likelihood of dying. To implement the person tracking as well as the shooting of the spaceship, another invisible layer was added on top of the virus, which we called the salt layer. This was an offscreen buffer that we could draw to. Every update, each virus cell would check the brightness value of its corresponding pixel in the salt buffer and subtract that value from its health. This made it possible to render explosions into the salt buffer, which would kill the virus at that spot. The input from the person tracking was also drawn into this layer.
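The interplay between the two layers can be sketched like this (a simplified Python stand-in for the Processing buffers; the array representation and function names are ours):

```python
# Simplified stand-in for the salt-layer mechanic: the virus grid and the
# salt buffer are parallel 2D arrays. Each update, every virus cell loses
# health equal to the brightness of the corresponding salt pixel, so
# bright blobs drawn into the salt buffer kill the virus at that spot.
def apply_salt(virus, salt):
    for y, row in enumerate(virus):
        for x, health in enumerate(row):
            if health > 0:
                row[x] = max(0, health - salt[y][x])
    return virus

def draw_explosion(salt, cx, cy, radius, brightness=255):
    # Paint a filled square "explosion" into the salt buffer.
    for y in range(cy - radius, cy + radius + 1):
        for x in range(cx - radius, cx + radius + 1):
            if 0 <= y < len(salt) and 0 <= x < len(salt[0]):
                salt[y][x] = brightness
    return salt
```

The nice property of this design is that every damage source, ship shots and tracked people alike, becomes just another brush drawing into the same buffer.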

An important mechanic of the game was the virus taking over control points, making it harder for the player to fight back. Every control point that was taken over gave the player a nasty effect. These effects always appeared in the same order, allowing the player to learn them and perform better next time. In order, they were: slow shooting, mirrored controls, static screen noise, and screen shake. Once all control points were taken over by the virus, the game was lost and the accumulated high score was sent to the server for registration.
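The fixed ordering means the set of active effects is fully determined by how many control points have fallen; a sketch (the function names are ours, and five control points is the count used elsewhere in this text):

```python
# Effects unlock in a fixed order as control points fall, per the text.
EFFECTS = ["slow shooting", "mirrored controls",
           "static screen noise", "screen shake"]

def active_effects(control_points_lost):
    """Return the effects currently afflicting the player."""
    return EFFECTS[:control_points_lost]

def game_over(control_points_lost, total_points=5):
    # The game is lost once every control point is taken over.
    return control_points_lost >= total_points
```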

The GUI, the game-state engine, and most other components do not warrant much discussion. While a lot of code is needed to make these processes run smoothly, not much of it is groundbreaking, new, or even slightly exciting. The entire game is rendered onto an offscreen buffer, which we scale up in the final output, causing the nostalgic pixelated look. Key input is parsed by a separate homemade library, sound is loaded and played through a manager class, and the Ship class implements a simple physics-based control scheme for the player-controlled ship at the bottom of the screen. A simple network module requests data from and posts data to the server using plain GET requests. The responses were received as CSV and parsed into the appropriate data, such as person coordinates and high scores.
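The CSV responses could be parsed along these lines (a Python stand-in for the Processing network module; the exact field layout of the real responses is an assumption here, shown as one "x,y" pair per line):

```python
# Hypothetical parser for a CSV person-coordinate response.
# Assumed layout: one "x,y" pair per line, both normalised to 0..1.
def parse_people_csv(body):
    people = []
    for line in body.strip().splitlines():
        if not line.strip():
            continue  # skip blank lines
        x_str, y_str = line.split(",")
        people.append((float(x_str), float(y_str)))
    return people
```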

As a small bonus component we created a high-score viewer: a simple Processing application that rendered the top 10 high scores in the same style as the game. It was merely a readout from the server and thus contained very little code beyond rendering. Since this was a bonus component, it was not an integral part of the experience, just a nice-to-have in case we happened to have a spare laptop available to run another component.

3.2 Showing progress using lights

Since we wanted to cross the boundary between physical and digital, we needed a way for people in our physical playing field to follow along with the game. That is where our control-point lights come in. We had five identical modules, which we describe in this section. Each light would slowly turn from green (good) to blue (virus color) as the virus progressed from the back of the playing field all the way to the front, where the main player was playing the game behind a screen. The five modules gave the players in the field a reference to where the virus currently was, and also drew people into our project with an interesting, attention-grabbing design.

Figure 6: The original model options of the lights.

Since every beacon was identical, we built the following design five times. Every light was controlled by a NodeMCU with an ESP8266 module, which gave it network access. Using this access, the light would periodically (every 500 ms) ask the server what it should look like. The server would respond with a number from 0 to 1, indicating how ‘virusy’ the light should be, i.e. how far along the gradient from green to blue it was. Each of the lights had a separate ID, and the server calculated the virus factor for each independently.
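Mapping the server's 0–1 virus factor onto the green-to-blue gradient amounts to a linear interpolation between the two endpoint colors; a sketch (the RGB endpoints are placeholders, not the exact colors we used):

```python
# Linearly interpolate between the 'healthy' green and the 'virus' blue.
# t = 0 -> fully green, t = 1 -> fully blue (endpoint values illustrative).
GREEN = (0, 255, 0)
BLUE = (0, 0, 255)

def virus_color(t):
    t = max(0.0, min(1.0, t))  # clamp the server value, just in case
    return tuple(round(g + (b - g) * t) for g, b in zip(GREEN, BLUE))
```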

Each NodeMCU was powered by a small portable USB power bank. By making the project completely wireless we had great freedom in placing our beacons: technically we could follow any interesting contour of the space the piece was placed in, allowing it to really fit the chosen environment.

The lights themselves were rings of 12 NeoPixel LEDs, controlled by the NodeMCU. The microcontroller eased between color values to make the slow update rate less apparent. The whole assembly was fitted in a laser-cut wooden box that was spray-painted black. Special care was taken to make sure the lights would match the unique retro feel of the game.
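The easing between the 500 ms server updates can be done with simple exponential smoothing, sketched here in Python (the real code ran on the NodeMCU; the smoothing factor is an arbitrary choice for the example):

```python
# Exponential smoothing: each frame, move the displayed color a fixed
# fraction of the remaining distance towards the target color, which
# hides the coarse 500 ms polling rate behind a continuous fade.
def ease(current, target, factor=0.1):
    return tuple(c + (t - c) * factor for c, t in zip(current, target))
```

Called every frame, this converges towards the latest target without ever jumping, so a new server value simply redirects the fade in flight.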

Figure 7: The prototypes of the lights.

During development we encountered some frustrating networking bugs. For some reason the lights were never fully stable: after running for somewhere between 30 seconds and 3 minutes, a light would simply stop communicating with the server. This behaviour was not acceptable, and after much debugging we decided to code in a software reset whenever a light lost connection for more than five consecutive attempts. A careful observer might have seen one of the lights flash briefly during the game, indicating a reboot of that specific beacon. Luckily, after this fix the control points worked flawlessly throughout the weekend.
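The workaround boils down to a software watchdog: count consecutive failed requests and reboot once a threshold is hit. A Python sketch of that logic (the real code runs on the NodeMCU, where the reset callback would be `ESP.restart()`; class and method names are ours):

```python
# Software-watchdog logic as used for the beacons: after five consecutive
# failed server requests, trigger a reboot; any success resets the count.
class Watchdog:
    def __init__(self, max_failures=5, reset=lambda: None):
        self.max_failures = max_failures
        self.failures = 0
        self.reset = reset  # ESP.restart() on the real hardware

    def report(self, request_ok):
        if request_ok:
            self.failures = 0
        else:
            self.failures += 1
            if self.failures >= self.max_failures:
                self.reset()
                self.failures = 0
```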

3.3 Enabling people to interact in the physical space

The person-tracking system combined two models: object detection and depth estimation. To figure out where people were in the game area, we ran a neural network trained to detect people and other similar classes. Specifically, we used a model based on the Single Shot MultiBox Detector [1] architecture, which provides a great tradeoff between accuracy and inference speed, in the form of the Repulsion Loss SSD implementation [2] in PyTorch.

Secondly, we wanted to determine the distance at which people were located. For this, we used a monocular depth estimation model: the Monodepth model developed by Niantic Labs [3] takes plain RGB images, without depth information, and estimates the depth value of every point in the image.

These two models were then combined: the position of each detected person along the x-axis was scaled to the range 0 to 1 and used as the x value, while the depth value at the top-left pixel of each detection was inverted, also scaled to 0 to 1, and used as the y value. This gave us a near-real-time 2D map of the people in the game area. The pipeline ran on a MacBook Pro’s i7 CPU at about 1 FPS, which was fast enough for our purposes.
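Combining the two model outputs is essentially the following transform (a sketch with assumptions: the detection box format, the use of the box centre for x, and a fixed maximum depth for normalisation are all ours):

```python
# Turn an SSD detection box plus a per-pixel depth map into a normalised
# (x, y) map position: x from the horizontal box position scaled to 0..1,
# y from the inverted depth at the box's top-left pixel, scaled to 0..1.
def to_map_position(box, depth_map, image_width, max_depth):
    left, top, right, bottom = box           # pixel coordinates
    x = ((left + right) / 2) / image_width   # horizontal position, 0..1
    d = depth_map[top][left]                 # depth at the top-left pixel
    y = 1.0 - min(d, max_depth) / max_depth  # invert: nearer -> larger y
    return x, y
```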

We spent a significant amount of time trying to make the models run on another laptop with a GPU, which would theoretically have increased the frame rate dramatically. However, we could not make this work before the expo started, so we decided to stick with the method that had proven to work reliably.

3.4 The Communication Hub: Our Server

To make all the components described so far work together, we needed one central hub to arrange the communication. For this purpose we ran a small Node.js server on the same laptop that ran the tracking algorithm, since the processing requirements of the server were minimal to begin with. The server provided several endpoints that all other components could use to post or retrieve data.

The functionalities of this web server included retrieving the progress of the current game, translating this into states for the lights, storing the high-score values, storing the detected-people state and sharing it with the game, and caching all of this on the file system to make sure we did not lose any data.
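Sketched as plain functions, the hub's state and endpoint logic might look like this (the actual server was Node.js; this Python stand-in only mirrors the logic, and the progress-to-light mapping, names, and caching format are all invented for the sketch):

```python
import json

# Minimal in-memory stand-in for the communication hub. The real server
# was Node.js; everything here is illustrative, not the actual code.
class Hub:
    def __init__(self):
        self.progress = 0.0    # 0 = virus at the back, 1 = at the front
        self.people = []       # latest tracked (x, y) positions
        self.high_scores = []  # list of (name, score)

    def post_progress(self, value):
        self.progress = max(0.0, min(1.0, value))

    def get_light_state(self, light_id, n_lights=5):
        # Hypothetical mapping: each light covers one band of the field,
        # and its 'virus factor' ramps 0..1 as the front crosses its band.
        band = 1.0 / n_lights
        t = (self.progress - light_id * band) / band
        return max(0.0, min(1.0, t))

    def post_score(self, name, score, path=None):
        self.high_scores.append((name, score))
        self.high_scores.sort(key=lambda s: -s[1])
        if path:  # cache to disk so a crash loses no data
            with open(path, "w") as f:
                json.dump(self.high_scores, f)

    def top_scores(self, n=10):
        return self.high_scores[:n]
```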

References

[1] W. Liu et al., “SSD: Single Shot MultiBox Detector”, https://arxiv.org/abs/1512.02325

[2] Repulsion Loss SSD, PyTorch implementation, https://github.com/bailvwangzi/repulsion_loss_ssd

[3] Monodepth2, Niantic Labs, https://github.com/nianticlabs/monodepth2