Time is not invisible; it is a relentless accumulation of white dust, falling like sand from a broken hourglass and slowly filling the void of the universe. The Voyager is an organic, ethereal creature that consumes time: it drifts through the current, swallowing the white particles like a black hole. Sound represents human civilization: our attempts to communicate, to declare that we are here. The Voyager responds to these signals, yet it never stops feeding. Against deep time, human struggle becomes only a tiny ripple, brief and fragile. The Voyager is an interactive project created to deliver a message to future humanity: in the endless river of time, human beings are unimaginably small.
Design and Composition
The project began with an idea quite different from what it has become: a reimagining of what the Voyager Golden Record might include in the future. I wanted users to be able to engrave their own records by recording their voices, or even capturing gestures and movements as a form of inscription. At the same time, the work invites reflection: if something we leave behind were to be received by an extraterrestrial intelligence, what kind of message could truly represent humanity, and what would we choose to send?
The video recordings below document a rough early version of the concept. At the time, I was inspired by an irisgram program, so I experimented with adding visualizations that would make the interface feel more like a vinyl record player.
However, I soon realized that the two parts I designed were extremely demanding for users: without sufficient musical knowledge, it was difficult to produce sounds that felt coherent or pleasant. This limitation pushed me to rethink the project in a less demanding, more accessible direction. Around that time, I connected the idea to the virtual creature from Project A, which led me to a new question: could the message I wanted to convey be delivered through the user’s interaction with a living, creature-like entity instead of through complex sound-making skills?
I still wanted to preserve sound as the core interaction. So I needed to evolve the concept and adjust the message I was trying to communicate. Starting from the Golden Voyager Record, where humanity once chose sound as a vessel for civilization, I decided to keep that metaphor: audio as a condensed trace of who we are. In my reimagined version, the creature becomes an extraterrestrial being that receives human signals, listening to what we send into the unknown.
Influenced by the visual language of The Matrix (its cold green glow, cascading code, and the sense of an unseen system operating behind reality), my first direction for the creature was a matrix-like artificial intelligence. In this version, the creature would not simply hear sound but compute it, treating human voices and signals as input streams to be decoded, classified, and absorbed.
Another idea was to design a virtual creature. While watching Oscilloscope Music visualizations, I became fascinated by oscillating wave patterns—when I imagined translating those patterns into 3D, the way they sink and fold inward felt almost like a mouth opening and closing. At the same time, the silhouette also reminded me of a jellyfish, which led to the creature’s final form: soft, flowing, and semi-organic, yet clearly generated by signal-like structures.
In the interaction, its “mouth” continuously absorbs small white particles. As the particles approach the mouth, they gradually slow down, as if being pulled into a stronger and stronger gravitational field. On the sound-interaction layer, the creature responds to the microphone input: based on the FFT analysis, it briefly shifts its color and the amplitude of its movement, rising and falling more intensely for a moment, before inevitably returning to its original state. This return is intentional: it represents the creature’s unstoppable feeding, and the user’s powerlessness in the face of it, echoing the idea that human effort is only a temporary disturbance against something vastly larger and indifferent.
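This "respond, then inevitably return" behavior can be sketched as a simple easing rule. The function names, the decay rate, and the 0..1 excitement scale below are all assumptions for illustration, not the project's actual code:

```javascript
// Linear interpolation, as in p5's lerp().
function lerp(a, b, t) { return a + (b - a) * t; }

// excitement: the creature's current response level (0..1)
// vol: microphone volume for this frame (0..1)
// Sound pushes the excitement up, but every frame it is eased back
// toward 0 -- the creature always returns to its feeding state.
function updateExcitement(excitement, vol, decay = 0.05) {
  const boosted = Math.max(excitement, vol); // respond to the input
  return lerp(boosted, 0, decay);            // inevitable return to rest
}

let excitement = 0;
excitement = updateExcitement(excitement, 0.8); // loud input: spikes up
excitement = updateExcitement(excitement, 0.0); // silence: decays back
```

Called once per frame, this guarantees that no matter how loudly the user speaks, the disturbance fades, which is exactly the powerlessness the piece is after.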
Technical Process
The image on the left shows the overall skeleton of my code. I relied heavily on object-oriented programming and particle systems, since they make it easy to duplicate an object thousands of times while keeping its movement pattern manageable.
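As a structural sketch (the grid size, star count, and stub methods here are assumptions, not the sketch's real values), the skeleton amounts to two classes and two containers:

```javascript
// Each visual element is a class instance, so thousands of duplicates
// share a single movement rule.
class Box {
  constructor(x, y) { this.x = x; this.y = y; this.angle = 0; this.z = 0; }
  update(vol) { /* advance wave phase, recompute height z */ }
}

class Star {
  constructor(x, y, z) {
    Object.assign(this, { x, y, z, vx: 0, vy: 0, vz: 0 });
  }
  update() { /* drift toward the creature's mouth */ }
}

// boxes is a 2D grid; stars is a flat particle array.
const boxes = [];
for (let i = 0; i < 40; i++) {
  boxes[i] = [];
  for (let j = 0; j < 40; j++) boxes[i][j] = new Box(i, j);
}
const stars = Array.from({ length: 500 }, () =>
  new Star(Math.random() * 600, Math.random() * 600, Math.random() * 600));
```

In the draw loop, every frame simply iterates both containers and calls each object's `update`, which is what keeps thousands of moving parts manageable.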
In the draw function, two lines of code map these audio values to visual properties: vol (volume) affects the rotation speed and the amplitude of the waves, while spectrum (frequency) affects the individual height and color of each box.
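A minimal sketch of that mapping, assuming vol comes from p5's amplitude analysis (0..1) and spectrum from p5's FFT (0..255 per bin); the output ranges here are placeholders, not the project's actual constants. p5's map() re-expresses a value from one range into another:

```javascript
// Equivalent of p5's map(): rescale value from [inMin, inMax] to [outMin, outMax].
function map(value, inMin, inMax, outMin, outMax) {
  return outMin + (outMax - outMin) * (value - inMin) / (inMax - inMin);
}

// vol drives the global motion: louder input -> faster spin, taller waves.
const vol = 0.5; // stand-in for a p5.Amplitude reading
const rotationSpeed = map(vol, 0, 1, 0.001, 0.02);
const waveAmplitude = map(vol, 0, 1, 10, 120);

// spectrum drives each box individually: per-bin height offset and hue shift.
const spectrum = [0, 128, 255]; // stand-in for a p5.FFT analyze() result
const heights = spectrum.map(s => map(s, 0, 255, 0, 30));
const hues    = spectrum.map(s => map(s, 0, 255, 0, 90));
```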
boxes[i][j] is a 2D array of Box objects.
Each Box object knows its own (x, y) position, its wave phase angle, and its current height z.
Each Box also stores freqLevel and the current color it should use (currentHue).
Inside the Box object's update function, the code first determines how fast and how high the box should move based on the overall loudness of the music:
waveSpeed: This is calculated by multiplying the base speed by the volume (vol). As a result, higher volume makes the grid oscillate faster.
this.angle: This is incremented by the waveSpeed to progress the sine wave animation over time.
waveAmplitude: This is determined by the scale (scl) and volume. Louder sound input creates taller, more dramatic peaks in the wave.
Beyond overall volume, specific sound frequencies (freqLevel) provide visual adjustments:
freqBoost: This maps the frequency level (ranging from 0 to 255) to a height offset (0 to 30), allowing individual boxes to oscillate higher in response to specific sounds.
hueShift: This shifts the box's color along the HSB spectrum based on the frequency, causing the grid to change colors in sync with the input sound's pitch.
The final z position of the box is calculated using a sine wave formula.
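The steps above can be condensed into a sketch of Box.update(); baseSpeed, scl, the baseHue, and the phase offset derived from (x, y) are assumptions standing in for the original constants:

```javascript
class Box {
  constructor(x, y) {
    this.x = x;
    this.y = y;
    this.angle = 0;     // wave phase
    this.z = 0;         // current height
    this.freqLevel = 0; // 0..255, from the FFT spectrum bin for this box
    this.currentHue = 0;
  }

  update(vol, scl = 50, baseSpeed = 0.05, baseHue = 180) {
    const waveSpeed = baseSpeed * vol;   // louder -> faster oscillation
    this.angle += waveSpeed;             // advance the sine wave over time
    const waveAmplitude = scl * vol;     // louder -> taller peaks

    // Frequency-specific adjustments: 0..255 mapped to a 0..30 height
    // offset and a hue shift along the HSB spectrum.
    const freqBoost = (this.freqLevel / 255) * 30;
    const hueShift = (this.freqLevel / 255) * 90;
    this.currentHue = (baseHue + hueShift) % 360;

    // Final height: a sine wave scaled by amplitude, lifted by freqBoost.
    this.z = Math.sin(this.angle + (this.x + this.y) * 0.1)
             * waveAmplitude + freqBoost;
  }
}

const b = new Box(0, 0);
b.freqLevel = 255;
b.update(1.0); // one frame at full volume
```

Phase-offsetting each box by its (x, y) position is what turns many independent sine waves into one traveling wave across the grid.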
stars[] is an array of Star objects.
Each Star object is a moving point in 3D with position (x,y,z) and velocity (vx,vy,vz).
Instead of picking random X, Y, and Z values, I sample Perlin noise at a shifting time variable t. The noise values are mapped to angles and then passed through sine and cosine functions to project each star into its 3D position. This makes the placement of the stars feel more organized than purely random selection.
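A sketch of that placement logic follows. p5's noise() is not available outside the sketch, so smoothNoise() below is a hypothetical stand-in that returns smooth 0..1 values; in p5.js these calls would be noise(t):

```javascript
// Stand-in for p5's Perlin noise: a smooth, continuous 0..1 signal.
function smoothNoise(t) {
  return 0.5 + 0.5 * Math.sin(t * 1.7) * Math.cos(t * 0.9);
}

// Sample two noise values at time t, map them to angles, then project
// onto a sphere with sin/cos to get a 3D star position.
function placeStar(t, radius = 300) {
  const theta = smoothNoise(t) * Math.PI * 2;    // azimuth
  const phi   = smoothNoise(t + 1000) * Math.PI; // inclination
  return {
    x: radius * Math.sin(phi) * Math.cos(theta),
    y: radius * Math.sin(phi) * Math.sin(theta),
    z: radius * Math.cos(phi),
  };
}

// Because noise is continuous, consecutive t values give nearby samples:
// successive stars land in related positions instead of scattering randomly.
const s1 = placeStar(0.00);
const s2 = placeStar(0.01);
```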
The movement logic is very similar to the Box object's. One special modification: to emphasize the concept of absorbing, I calculate the distance between each star's current position and the center of the canvas, and once the star gets close enough to the mouth, I remove it from the array using the index i that tracks the star's position in the array.
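A minimal sketch of that absorption step, under stated assumptions (the threshold value and function name are hypothetical). Iterating backward matters: splice() shifts later elements down, so a forward loop would skip the star after each removal:

```javascript
// Remove every star within `threshold` of the mouth at (cx, cy, cz).
function absorbStars(stars, cx, cy, cz, threshold = 10) {
  for (let i = stars.length - 1; i >= 0; i--) {
    const s = stars[i];
    const d = Math.hypot(s.x - cx, s.y - cy, s.z - cz);
    if (d < threshold) {
      stars.splice(i, 1); // the creature has absorbed this star
    }
  }
}

const stars = [
  { x: 0, y: 0, z: 0 },   // at the mouth: absorbed
  { x: 5, y: 0, z: 0 },   // within threshold: absorbed
  { x: 200, y: 0, z: 0 }, // far away: survives
];
absorbStars(stars, 0, 0, 0);
// stars now holds only the far star
```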
Reflection and Future Development
The project underwent a significant conceptual shift during development. Originally envisioned as a futuristic Voyager Golden Record, the goal was to allow users to "engrave" their own voices and gestures into a permanent record for extraterrestrial life. However, early testing showed that creating coherent sound was too demanding for users without musical training. To make the experience more accessible, I shifted the focus of the interaction from "human as the record-maker" to "human as a signal source" for an extraterrestrial entity.
As for the results, I am satisfied with the visualization of the imagined creature, as well as the coherence of the stars' and boxes' movement with the sound input. I also tried to add more storytelling through strong metaphors. Regarding future development, I have proposed three aspects:
More design around the concept of time decay.
Richer visualization of the human sound input.
A change in camera to amplify the creature’s colossal scale, reflecting the viewer's sense of insignificance.
Feedback from playtesting as well as the final presentation was also helpful:
The core concept is too abstract; explicit on-screen guidance and textual descriptions should be considered to clarify the idea.
Put more focus on designing the HTML page, so that interacting with the project feels more immersive.
References:
A p5js sketch for irisgram: https://editor.p5js.org/tinyturingmachine/sketches/GLOvLhqRC
Music Visualization with FFT: https://www.youtube.com/watch?v=8O5aCwdopLo&t=2258s
A sketch for oscillating wave pattern: https://editor.p5js.org/pattvira/sketches/MrusyEiBe
Oscilloscope Music: https://www.youtube.com/watch?v=R9jOWIhZZCE
Thanks to Xiao and Gohai for their help, and to all the testers for their feedback!