My Partner: Jiaxi (Amelia) Zhang
At left is her original portrait.
Her portrait features a crying face, so she recounted the frustrating experience behind it: a small bug kept flying in front of her, she tried to swat it away but accidentally slapped herself, and she cried because it was so painful.
My p5.js project brings a moment of comical frustration to life. A red dot symbolizes the hand, allowing the user to simulate a slap by clicking and holding the mouse. The intensity of the character's crying directly correlates with the duration of this "slap," visualized by the accelerating speed of the falling tear blocks. For a more dynamic and engaging experience, the mouth continuously tracks the cursor's position. Upon impact, the character's expression switches from a smile to a frown, capturing the sudden, painful surprise of the moment.
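The crying intensity comes down to a single velocity value that keeps growing for as long as the mouse is held, so a longer "slap" means faster-falling tears. Below is a stripped-down sketch of just that mechanic; the shapes and numbers are simplified stand-ins, not the exact ones from the full code at the end of this post.

let tearY = 175;      // current height of the single tear block
let tearVelocity = 1; // falls faster the longer the mouse stays pressed

function setup() {
  createCanvas(400, 400);
}

function draw() {
  background(255, 200, 199);
  if (mouseIsPressed) {
    rect(100, tearY, 45, 126); // the falling tear block
    tearVelocity += 0.2;       // accelerate while the "slap" continues
    tearY += tearVelocity;
    if (tearY > height) {
      tearY = 175;             // wrap back to the top so the crying loops
    }
  } else {
    tearY = 175;               // releasing the mouse resets the crying
    tearVelocity = 1;
  }
}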
Based on my partner's experience, the feedback was largely positive, with the interactive mouth that follows the cursor being highlighted as a particularly fun and engaging feature. The main area for improvement was the red dot, which was perceived as too abstract and disconnected from the narrative. The suggestion was to enhance the storytelling by replacing the dot with a more specific visual, such as an illustration of the small bug that initially caused the incident.
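If I were to follow that suggestion, the change could be as small as swapping the circle for a little bug drawing that follows the cursor. This is only a rough sketch of the idea, not part of the current project; drawBug and its shapes are placeholders I made up for illustration.

function setup() {
  createCanvas(400, 400);
}

function draw() {
  background(255, 200, 199);
  drawBug(mouseX, mouseY); // follows the cursor the way the red dot does now
}

// a tiny bug: a dark body with two translucent wings
function drawBug(x, y) {
  noStroke();
  fill(60);
  ellipse(x, y, 14, 10);
  fill(230, 230, 255, 180);
  ellipse(x - 5, y - 6, 10, 6);
  ellipse(x + 5, y - 6, 10, 6);
}

In the actual project, drawBug(mouseX, mouseY) would simply replace the existing circle(mouseX, mouseY, 20) call inside the mouseIsPressed branch.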
During the coding process for this project, I encountered two main challenges. First, determining the appropriate conditional logic led to multiple nested if statements. This nesting reduced the code's readability and made debugging more difficult, as it was challenging to pinpoint the exact location of bugs within the complex structure. Second, I initially found it hard to anticipate the number of variables needed. This resulted in frequently hard-coding the same numerical value multiple times, only to later refactor the code by replacing these numbers with a single, unified variable for better maintainability.
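The second challenge is easiest to see with the repeated value 30 that appears throughout the sketch at the end of this post. One possible refactor, shown on a couple of representative shapes, pulls it into a single named variable so the whole drawing can be rescaled from one place.

let unit = 30; // shared grid size that the mouth and tear blocks are scaled from

function setup() {
  createCanvas(400, 400);
}

function draw() {
  background(255, 200, 199);
  rectMode(CENTER);
  // was: rect(width / 2, height / 2 - 100 / 2, 4 * 30, 1.5 * 30);
  rect(width / 2, height / 2 - 100 / 2, 4 * unit, 1.5 * unit);
  // was: rect(100, 175, 30 * 1.5, 30 * 4.2);
  rect(100, 175, unit * 1.5, unit * 4.2);
}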
Reflection Prompts
What have you learned throughout the process of using the setup and draw functions, as well as variables and conditionals?
Throughout this process, I learned how fundamental variables and conditionals are to creating dynamic code. Variables proved essential for convenience and readability; by assigning a descriptive name to a value, I could easily reuse it throughout my code and immediately understand its purpose, which greatly improves maintainability. Conditionals, on the other hand, unlocked a new layer of interactivity. In my project, their primary role was to manage state by switching between different visuals based on whether the mouse was pressed. This kind of dynamic, responsive behavior would be impossible to achieve without conditional logic.
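A minimal illustration of that state management: one conditional on mouseIsPressed decides which of two visuals gets drawn each frame. Plain rectangles stand in here for the smiling and frowning mouths of the real sketch.

function setup() {
  createCanvas(400, 400);
}

function draw() {
  background(255, 200, 199);
  rectMode(CENTER);
  fill(194, 71, 71);
  if (mouseIsPressed) {
    rect(width / 2, 250, 120, 45); // pressed: draw the "pained" version
  } else {
    rect(width / 2, 150, 120, 45); // released: draw the "resting" version
  }
}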
Reflect on the language your partner used to describe the memory/dream and your interpretation of it using code.
My interpretation deconstructs her description into key elements: the action (the slap), the consequence (the crying), and a few abstractions. I translated the physical act of "slapping" into a direct user interaction: clicking and holding the mouse. This makes the user the agent of the action, not just a viewer. The simple statement "she cried" became the falling "tear blocks," which turn an emotional result into a tangible, visible output. I chose to represent the "hand" as an abstract red dot and the "tears" as blocks; this stylization moves the interpretation away from a literal recreation and toward a more symbolic, game-like experience.
What are your thoughts on digital interactions and real-world interactions? After using mouse and keyboard interaction, describe an interactive device — one that doesn’t exist yet — that could let you better recreate the memory/dream in p5.js
Digital interactions are symbolic abstractions of real-world actions, while real-world interactions are multi-sensory and embodied. Using a mouse and keyboard translates intent into a limited set of inputs (clicks and key presses), while real life involves a mix of tactile feedback, spatial awareness, and nuanced physical motion. For this memory, the sound of the bug flying back and forth and the itch on her face can hardly be expressed through digital interaction alone. An ideal device could translate digital triggers from a p5.js sketch into localized, physical sensations on the user's face and head, conveying those two sensations directly to the user.
let tear_y;       // vertical position of the falling tear blocks
let tearVelocity; // how fast the tears fall; grows the longer the mouse is held
let toneX;        // horizontal position of the inner mouth shape that follows the cursor

function setup() {
  createCanvas(400, 400);
  tear_y = 100 + 30 * 2 + 15; // start the tears just below the eyes
  tearVelocity = 1;
}

function draw() {
  noStroke();
  background(255, 200, 199);

  // let the inner mouth shape track the cursor, clamped near the center of the face
  if (mouseX > width / 2) {
    toneX = min(width / 2 - 20 + mouseX * 0.1, width / 2 + 15);
  } else {
    toneX = max(width / 2 - 20 + mouseX * 0.1, width / 2 - 15);
  }

  // eyes
  fill(153, 51, 51);
  ellipse(100, 100, 80, 15);
  ellipse(width - 100, 100, 80, 15);

  rectMode(CENTER);

  // mouth
  fill(194, 71, 71);
  rect(width / 2, height / 2 - 100 / 2, 4 * 30, 1.5 * 30);

  if (mouseIsPressed) {
    // "slapped" state: the inner mouth shape sits low, reading as a pained frown
    fill(255, 170, 199);
    rect(toneX, height / 2 - 1.2 * 30, 3 * 30, 0.6 * 30);

    // the red dot representing the slapping hand follows the cursor
    fill(255, 0, 0);
    circle(mouseX, mouseY, 20);

    // tear blocks fall and accelerate for as long as the mouse is held
    fill(153, 255, 255); // tear blue
    rect(100, tear_y, 30 * 1.5, 30 * 4.2);         // left tear
    rect(width - 100, tear_y, 30 * 1.5, 30 * 4.2); // right tear
    tearVelocity += 0.2;
    tear_y += tearVelocity;
    if (tear_y > height) {
      tear_y = 100 + 30 * 2 + 15; // wrap the tears back up so the crying loops
    }
  } else {
    // resting state: reset the tears and draw the inner mouth shape higher, as a smile
    tear_y = 100 + 30 * 2 + 15;
    tearVelocity = 1;
    fill(255, 170, 199);
    rect(toneX, height / 2 - 1.2 * 54, 3 * 30, 0.6 * 30);
  }
}