In my project #5525, I created an interactive music game with five clickable characters, each representing a member of the band Mayday and each with its own dedicated audio and animation effects. Here are the main features of the project:
Interface and characters:
The game interface consists of five characters: Ashin (Chen Xinhong), Masa (Tsai Shengyan), Monster (Wen Shangyi), Ming (Liu Yanming), and Stone (Shih Jinghang), who are displayed at different positions on the screen. When clicked, each character plays its corresponding sound file and rotates at a set speed, making the interface feel dynamic.
Audio Interaction:
Different sound files (e.g. JL, LKDQBM, GL, and FKSJ) enhance the game's sound and interactivity: clicking a character triggers the corresponding audio playback. For example, clicking on Ashin's area plays the audio JL.m4a.
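The click-to-audio mapping can be sketched as a simple hit test over circular character areas. This is a minimal sketch: the positions, radii, and all file pairings except Ashin's JL.m4a (which the description names) are placeholder assumptions, and `hitCharacter` is a hypothetical helper name.

```javascript
// Hypothetical character layout: each character occupies a circular hit area
// and owns one audio file. Only the Ashin -> JL.m4a pairing comes from the
// project description; the other pairings and all coordinates are assumed.
const characters = [
  { name: "Ashin",   x: 100, y: 200, r: 50, audio: "JL.m4a" },
  { name: "Masa",    x: 250, y: 200, r: 50, audio: "LKDQBM.m4a" },
  { name: "Monster", x: 400, y: 200, r: 50, audio: "GL.m4a" },
  { name: "Ming",    x: 550, y: 200, r: 50, audio: "FKSJ.m4a" },
  { name: "Stone",   x: 700, y: 200, r: 50, audio: "stone.m4a" }, // assumed
];

// Return the character whose circular area contains (mx, my), or null.
function hitCharacter(chars, mx, my) {
  return chars.find(c => (mx - c.x) ** 2 + (my - c.y) ** 2 <= c.r ** 2) ?? null;
}

// In the p5.js sketch, this would be called from mousePressed(), and the
// matched character's preloaded p5.SoundFile would then be played.
```

Keeping the hit test as a pure function makes it easy to test outside the sketch and to reuse for both click handling and hover detection.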
Visual Effects:
Each character's rotation is driven by the mouse position: when the cursor hovers over a character, that character rotates at a set speed, providing an interactive and immersive effect.
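The hover-driven rotation can be sketched as a per-frame update. This is a minimal sketch under assumed data shapes: the character object fields, the default speed value, and the helper names are all hypothetical.

```javascript
// True when (mx, my) falls inside the character's circular area.
function isHovered(c, mx, my) {
  return (mx - c.x) ** 2 + (my - c.y) ** 2 <= c.r ** 2;
}

// Per-frame update: the character only spins while the mouse hovers over it.
// "speed" is radians per frame; 0.05 is an illustrative value.
function updateAngle(c, mx, my, speed = 0.05) {
  const next = isHovered(c, mx, my) ? c.angle + speed : c.angle;
  return { ...c, angle: next % (2 * Math.PI) }; // keep the angle bounded
}

// In draw(), each character would be rendered with something like:
//   push(); translate(c.x, c.y); rotate(c.angle); /* draw shape */ pop();
```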
Spectral Analysis:
p5.FFT performs spectral analysis, which generates frequency-based visual effects (such as spectrum bars) and modulates each character's vertical position, making the characters appear to "jump" or "float" in time with the music.
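The "jump" effect amounts to mapping a spectrum reading to a vertical offset. A minimal sketch: p5.FFT's `analyze()` and `getEnergy()` do return amplitudes in the 0–255 range, but the offset range and the `bounceOffset` helper are my own illustrative choices.

```javascript
// Map a spectrum reading (0-255, as returned by p5.FFT analyze()/getEnergy())
// to a vertical offset so louder frequencies push a character upward.
// maxBounce is an illustrative value in pixels.
function bounceOffset(energy, maxBounce = 60) {
  const clamped = Math.min(255, Math.max(0, energy));
  return -(clamped / 255) * maxBounce; // negative = upward in p5's coordinates
}

// Sketch of per-frame use in p5.js (assuming fft = new p5.FFT()):
//   fft.analyze();
//   const y = c.y + bounceOffset(fft.getEnergy("bass"));
```

Clamping the input keeps the characters on screen even if an out-of-range value slips through.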
Character customization:
Each character has its own unique colors and features, such as Ashin's pink circle, Masa's yellow theme, and Monster's red. Varying each character's colors and shapes personalizes the visual experience and enriches the game's presentation.
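One way to keep per-character styling manageable is a single theme table. A minimal sketch: the pink/yellow/red pairings come from the description above, while the exact RGB values, the remaining characters' colors, the shape names, and the `themeFor` helper are placeholder assumptions.

```javascript
// Per-character theme table. Pink (Ashin), yellow (Masa), and red (Monster)
// are named in the project description; everything else here is assumed.
const themes = {
  Ashin:   { color: [255, 150, 200], shape: "circle" }, // pink circle
  Masa:    { color: [255, 220, 0],   shape: "circle" }, // yellow theme
  Monster: { color: [220, 40, 40],   shape: "circle" }, // red
  Ming:    { color: [120, 180, 255], shape: "circle" }, // assumed
  Stone:   { color: [150, 150, 150], shape: "circle" }, // assumed
};

// Look up a character's theme, falling back to a neutral default.
function themeFor(name) {
  return themes[name] ?? { color: [255, 255, 255], shape: "circle" };
}

// In draw(), the theme would feed fill(...themeFor(c.name).color) before
// drawing the character's shape.
```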
Reflecting on the project, I also identified several areas that could be improved to deliver a more immersive experience for the audience.
First of all, I realized that the project's goal needed to be clearer. Although I was designing an interactive experience around the atmosphere of Mayday's music, I needed to clarify whether it was centered on immersion in the music itself or placed greater emphasis on audience interaction with the content. With a clearer goal, my design and development direction would be more focused and better able to convey the emotional resonance that Mayday brings to listeners.
I also saw room for improvement in the user experience design. As Professor Moon said, it is important to experience the whole process from the user's perspective. I want every step of the interaction to be smooth and easy to understand, so I plan to add simple guidance or visual cues that help viewers quickly learn the controls and settle naturally into the experience without being distracted by unfamiliar operations. Small details like these can noticeably improve the user's sense of flow.
In addition, I focused on the technical implementation of the project. I used libraries such as p5.js along with hardware interaction controllers, but in practice I occasionally ran into lag or latency. If such problems occur during a presentation, they diminish the audience's immersion. In the future, I will optimize the code structure so that the system runs stably and efficiently, minimizing the impact of technical obstacles on the audience experience.
Finally, I want to explore the possibility of multi-sensory interaction. Beyond sight and sound, I am considering adding haptic feedback, such as introducing vibration or lighting effects in appropriate contexts. This would give the audience a richer sensory experience, making them feel present during the interaction and letting them feel even more of the power and emotion of Mayday's music.