3. Parallels Expo: Geminoid Fish

by Marianne Bossema, Lilian Toonstra & Martijn Wester

Parallel worlds: Real living fish in the underwater world are mirrored by a geminoid fish robot in our terrene world. The two worlds are explored in parallel by the movements of both creatures: a symbiotic relationship between fish and bot.

Worlds and Actors

The actors in this piece are three goldfish, Jet, Yet and Yeti, and an autonomous vehicle called Jetbot. Their parallel worlds are a fish tank and a race track with the same proportions, each with an outer border and divided into 10 areas marked with purple and yellow lines. The three fish are filmed by a webcam above the tank and continuously detected by an OpenFrameworks application. The area in which they were last seen is written to a text file every three seconds.
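As a sketch of how a detected fish position could be reduced to one of the 10 areas and written to the shared text file (the function names, frame width, and file path here are illustrative assumptions, not our actual OpenFrameworks code):

```python
NUM_AREAS = 10        # the tank is divided into 10 areas (assumed: along its width)
FRAME_WIDTH = 640     # assumed webcam frame width in pixels

def area_for_x(x_px, width=FRAME_WIDTH, areas=NUM_AREAS):
    """Map a horizontal pixel position to an area index 0..areas-1."""
    return min(int(x_px * areas / width), areas - 1)

def publish_area(x_px, path="fish_area.txt"):
    """Write the last-seen area to the file that Jetbot polls every three seconds."""
    with open(path, "w") as f:
        f.write(str(area_for_x(x_px)))
```

In the real setup this runs once every three seconds, so the file always holds the most recent area index.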

Data collection and position determination

Jetbot has learned not to cross the white lines that define its world and, to keep track of its own position, to recognise the alternating yellow and purple lines. This was done with the image-classification neural network bundled with Jetbot. We trained the model under different lighting conditions and in several locations, including the expo location. When Jetbot is in an even-numbered area and crosses a purple line, its position increases by 1; when it crosses a yellow line, it decreases by 1. When Jetbot is in an odd-numbered area, it is the other way around. Every three seconds Jetbot reads the text file to learn the position of the fish in the tank, and adapts its behavior to follow them.
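The line-crossing bookkeeping can be sketched as follows (a minimal illustration with assumed names; the real Jetbot code also drives the motors and runs the line classifier):

```python
def update_area(area, line_color):
    """Update Jetbot's area index after crossing a colored line.

    In an even-numbered area a purple line increases the position by 1
    and a yellow line decreases it by 1; in an odd-numbered area the
    colors are reversed, because the lines alternate along the track.
    """
    if area % 2 == 0:
        return area + (1 if line_color == "purple" else -1)
    return area + (-1 if line_color == "purple" else 1)

def read_fish_area(path="fish_area.txt"):
    """Read the fish's last-seen area from the shared text file."""
    with open(path) as f:
        return int(f.read().strip())
```

Jetbot then compares its own area with the area from the file and steers toward the fish.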

Fish detection

As one can see in the second video on this page, all fish are detected. However, there is only one robot, so only one fish is actually followed and marked with a blue rectangle. The position of this fish is mapped to the position of the robot. The upper part of the video shows the last detected fish position as an orange fish drawing. The fish position is only updated when the probability of the detection exceeds a certain threshold, to prevent misdetections. The detection probability depends on multiple variables, such as the detected fish surface (minimum and maximum fish size), the last detected fish position, and the time since the last detection.
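A heuristic like the following could combine those variables into a single acceptance decision (an illustrative sketch; the thresholds, names, and weighting in our actual OpenFrameworks code differ):

```python
import math

MIN_AREA, MAX_AREA = 200, 5000   # assumed blob-size bounds in pixels
BASE_JUMP_PX = 120               # assumed plausible movement per second

def detection_probability(blob_area, pos, last_pos, seconds_since_last):
    """Confidence in [0, 1] that a blob is the tracked fish."""
    if not (MIN_AREA <= blob_area <= MAX_AREA):
        return 0.0                       # implausible fish size
    dist = math.hypot(pos[0] - last_pos[0], pos[1] - last_pos[1])
    # the longer the fish has been unseen, the farther it may have moved
    allowed = BASE_JUMP_PX * max(seconds_since_last, 1.0)
    return max(0.0, 1.0 - dist / allowed)

def accept_detection(blob_area, pos, last_pos, seconds_since_last,
                     threshold=0.5):
    """Only update the fish position when the confidence is high enough."""
    return detection_probability(
        blob_area, pos, last_pos, seconds_since_last) >= threshold
```

A blob of the wrong size, or one that would require an implausibly large jump from the last known position, is rejected rather than treated as the fish.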

Some more technical aspects

For the fish detection, the video of the aquarium was analyzed for orange blobs; other colors weren't picked up, so that spectators' hands and other moving objects wouldn't be detected as fish. The detection worked by computing the pixel-wise RGB differences between each current frame and a calibrated start frame (the aquarium background). With blob detection, possible fish were marked as candidates (visualised with blue and purple rectangles). Because the background couldn't be calibrated without fish in the tank, an option was added to partially recalibrate the system by manually selecting the original fish positions. Communication with the robot ran through a localhost FTP server that updated the instructions file (read by the robot every three seconds). A couple of functions to manually control the robot via OpenFrameworks were also added.
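The background-difference and color check might look roughly like this in NumPy terms (our implementation was in OpenFrameworks; the RGB thresholds for "orange" here are illustrative assumptions):

```python
import numpy as np

def fish_candidate_mask(frame, background, diff_thresh=30):
    """Boolean mask of pixels that both differ from the calibrated
    background frame and fall in an assumed orange color range."""
    # per-pixel RGB difference against the calibrated start frame
    diff = np.abs(frame.astype(int) - background.astype(int)).sum(axis=2)
    moving = diff > diff_thresh
    # crude "orange" test: strong red, medium green, little blue
    r, g, b = (frame[..., i].astype(int) for i in range(3))
    orange = (r > 150) & (g > 60) & (g < 180) & (b < 100)
    return moving & orange
```

Connected regions of this mask would then be grouped into blobs and checked against the size and position heuristics described above.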

Problems encountered

Initially, our plan was to let Jetbot follow a single fish based on its exact position in the tank. However, we soon discovered that the physical properties of the underwater world and of the fish themselves allowed them to move with a speed and virtuosity our robot could not mirror in the terrene world. Moreover, although we could compute exactly which way Jetbot had to go between two coordinates, the power the robot delivered to each wheel was so inconsistent that we could not steer it accurately. We also ran into many technical problems and a shortage of online documentation during the data collection phase. We therefore opted for the final approach to position mirroring: Jetbot keeps track of its own position using the lines.

Lastly, we also discovered that Jet, the goldfish we bought first, liked to have some company; two more fish ensured a better appetite and more playful activity. The decision to have three fish and let Jetbot follow them only approximately also increased the freedom of the robot. Instead of exactly mirroring the behavior of a single goldfish, Jetbot could now follow them, but stay in a certain area a little longer and perform the given task at its own pace and in its own manner. In the videos one can see how Jetbot sometimes makes an extra turn before heading toward the area it is directed to.

Autonomy

During the weekend of the exposition we found that we had created an autonomously moving robot. We found this autonomy interesting in the context of the Parallels theme: a fish is already considered an autonomous creature, whereas most robots are not. The robot was able to learn and adapt, but also to explore its environment freely.

Imagine, what if..?

In our project, we only implemented one-way communication: Jetbot knows about the fish, but the fish don't know about Jetbot. We thought of ways to enhance the communication and make it bidirectional, for example by showing the fish images recorded via Jetbot's camera. What would happen if fish could have more information about the terrene world? What would happen to their brains, would they evolve differently? What if robots could learn from fish and vice versa? Could a transmission of cultures emerge from that? We find these inspiring questions for future research and learning.

The source code of our project can be found here.