Link to the project: Dream-Self-Portrait
Link to the original self-portrait
This is a mini project that turns my partner Mike's self-portrait into an interactive version. The original sketch is shown on the left.
The video shows the interactive version of Mike's self-portrait. He described his dream as having an orange-yellow vibe, with a butterfly flying around and catching his attention. I interpreted this dream through two main elements: the background and the butterfly.
I first added a butterfly on top of the original self-portrait, adjusting its transparency so the original scene remains visible. To create the "flying" movement, the butterfly's wings flap automatically. When I click elsewhere on the canvas, the butterfly heads toward that spot, and Mike's eyes follow its movement as well. For the warm dusk vibe, I change the background color: it is mapped from red to yellow according to the butterfly's height, which is also controlled by the mouse.
Mike said he liked the interactive version of his self-portrait and that it captured what he meant. He especially liked the moment when the butterfly flies between his eyes and the self-portrait goes cross-eyed.
Below is the code of the interactive self-portrait. Most of the face and body components come from Mike's original code; my adjustments are explained in the comments.
// Butterfly
let bx = 320; // butterfly position
let by = 180;
let targetX = bx; // target position, set wherever the mouse clicks
let targetY = by;
function setup() {
  createCanvas(400, 400);
  rectMode(CENTER);
  noStroke();
}
function draw() {
  // flap drives the automatic wing flapping, oscillating with frameCount;
  // jitter adds small random offsets so the butterfly does not fly
  // in a perfectly straight line
  let flap = sin(frameCount * 0.2) * 2;
  let jitter = 2;
  // The 5-pixel buffer lets the butterfly settle near the target;
  // without it, the jitter would keep it twitching forever
  if (abs(bx - targetX) > 5) {
    bx += (targetX - bx) * 0.05; // ease 5% of the remaining distance per frame
    bx += random(-jitter, jitter);
  }
  if (abs(by - targetY) > 5) {
    by += (targetY - by) * 0.05;
    by += random(-jitter, jitter);
  }
  // Background is controlled by the butterfly's height: the green channel
  // rises from 0 to 255 as by increases, shifting the color from red to yellow
  background(255, by / height * 255, 80);
  // Shirt
  fill(0);
  ellipse(width / 2, height, 400, 150);
  // Hair
  fill(0, 0, 0);
  arc(width / 2, height / 2 - 70, 260, 250, PI, TWO_PI, CHORD);
  rectMode(CORNER);
  rect(70, height / 2 - 70, 20, 55);
  rect(330 - 20, height / 2 - 70, 20, 55);
  rectMode(CENTER);
  // Ears
  fill(253, 229, 202);
  ellipse(width / 2 - 120, height / 2, 40, 75);
  ellipse(width / 2 + 120, height / 2, 40, 75);
  // Face
  fill(253, 229, 202);
  ellipse(width / 2, height / 2, 250, 300);
  // Eyes
  fill(255);
  ellipse(120, height / 2, 30); // left eye white
  ellipse(270, height / 2, 30); // right eye white
  // Pupils follow the butterfly
  let pupilRadius = 6; // maximum offset of a pupil from its eye center
  let pupilSize = 20; // pupil diameter
  // left eye: offset the pupil toward the butterfly
  let dxL = bx - 120;
  let dyL = by - height / 2;
  let distL = sqrt(dxL * dxL + dyL * dyL);
  if (distL > 0) {
    // normalize (dxL, dyL) to a unit vector, then scale it to pupilRadius
    dxL = dxL / distL * pupilRadius;
    dyL = dyL / distL * pupilRadius;
  }
  fill(0);
  ellipse(120 + dxL, height / 2 + dyL, pupilSize);
  // right eye: same computation relative to the right eye's center
  let dxR = bx - 270;
  let dyR = by - height / 2;
  let distR = sqrt(dxR * dxR + dyR * dyR);
  if (distR > 0) {
    dxR = dxR / distR * pupilRadius;
    dyR = dyR / distR * pupilRadius;
  }
  ellipse(270 + dxR, height / 2 + dyR, pupilSize);
  // Nose
  stroke(255, 200, 188);
  strokeWeight(10);
  line(width / 2, height / 2 + 30, width / 2 + 15, height / 2 + 30);
  noStroke();
  // Mouth
  stroke(0);
  strokeWeight(3);
  noFill();
  arc(width / 2, height / 2 + 80, 50, 20, 0, PI / 2);
  noStroke();
  // Eyebrows
  fill(0, 0, 0);
  beginShape();
  vertex(105, 131);
  vertex(160, 154);
  vertex(154, 168);
  vertex(99, 145);
  endShape(CLOSE);
  fill(0);
  beginShape();
  vertex(229, 154);
  vertex(284, 131);
  vertex(290, 145);
  vertex(235, 168);
  endShape(CLOSE);
  // Butterfly
  // wings: the flap offset moves the wing centers in and out each frame
  fill(255, 150, 150, 200); // pink, semi-transparent
  ellipse(bx - 10 - flap, by - 10, 25, 20); // upper left
  ellipse(bx - 10 - flap, by + 10, 25, 20); // lower left
  fill(255, 102, 0, 200); // orange, semi-transparent
  ellipse(bx + 10 + flap, by - 10, 25, 20); // upper right
  ellipse(bx + 10 + flap, by + 10, 25, 20); // lower right
  // body
  fill(80, 40, 20);
  rectMode(CENTER);
  rect(bx, by, 6, 25, 3);
  // wing pattern
  fill(255, 255, 255, 180);
  ellipse(bx - 10, by - 10, 8, 8);
  ellipse(bx + 10, by + 10, 8, 8);
}
// Set a new target position wherever the mouse is clicked
function mousePressed() {
  targetX = mouseX;
  targetY = mouseY;
}
One part worth explaining separately is the mathematical computation for the pupils:
  // left eye: offset the pupil toward the butterfly
  let dxL = bx - 120;
  let dyL = by - height / 2;
  let distL = sqrt(dxL * dxL + dyL * dyL);
  if (distL > 0) {
    dxL = dxL / distL * pupilRadius;
    dyL = dyL / distL * pupilRadius;
  }
  fill(0);
  ellipse(120 + dxL, height / 2 + dyL, pupilSize);
  // right eye: same computation relative to the right eye's center
  let dxR = bx - 270;
  let dyR = by - height / 2;
  let distR = sqrt(dxR * dxR + dyR * dyR);
  if (distR > 0) {
    dxR = dxR / distR * pupilRadius;
    dyR = dyR / distR * pupilRadius;
  }
  ellipse(270 + dxR, height / 2 + dyR, pupilSize);
This part was written with ChatGPT's assistance for the specific math formula. Dividing (dxL, dyL) by its length distL turns it into a unit vector pointing from the eye center toward the butterfly; scaling that unit vector by pupilRadius then places the pupil a fixed distance from the center in the butterfly's direction, which keeps the geometric movement smooth.
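For reference, the same normalize-and-scale step could be written with p5.js's p5.Vector class, which provides mag() and setMag(). A minimal sketch of the left-eye version, assuming the same bx, by, pupilRadius, and pupilSize variables from above:

  // Equivalent pupil offset using p5.Vector (same behavior as the manual math)
  let offsetL = createVector(bx - 120, by - height / 2);
  if (offsetL.mag() > 0) {
    offsetL.setMag(pupilRadius); // normalize the vector, then scale it to pupilRadius
  }
  fill(0);
  ellipse(120 + offsetL.x, height / 2 + offsetL.y, pupilSize);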
During this minilab, I learned about the mousePressed() function. It lets a single mouse click trigger an action once, rather than only exposing an ongoing state like mouseIsPressed.
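A minimal sketch of the difference, using a hypothetical click counter: mouseIsPressed is a state that stays true on every frame while the button is held, whereas mousePressed() is an event that runs exactly once per click.

let clicks = 0;

function setup() {
  createCanvas(200, 200);
}

function draw() {
  background(220);
  // mouseIsPressed is true on every frame while the button is held down
  fill(mouseIsPressed ? 'red' : 'black');
  text('clicks: ' + clicks, 20, 100);
}

// mousePressed() runs exactly once per click
function mousePressed() {
  clicks += 1;
}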
My partner's language for the scene is descriptive, creating an atmosphere that has to be imagined. I pretended I was in a similar environment and thought about what I would discover and pay attention to, then picked out the particular elements I needed to include and build in code. Moreover, real-life description and interaction can be far more complicated than the coded version, with details such as a gradient sky or physically accurate movement; I found programming these details challenging given the time limit. One such detail, a gradient sky, is sketched below.
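For instance, a gradient sky could be approximated by drawing one horizontal line per row and interpolating between two colors with p5.js's lerpColor(). A minimal illustrative sketch, not part of the project code:

  // Illustrative gradient sky (assumes it runs inside a p5.js draw loop)
  for (let y = 0; y < height; y++) {
    let c = lerpColor(color(255, 80, 80), color(255, 220, 80), y / height);
    stroke(c);
    line(0, y, width, y);
  }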
Real-life interaction is multisensory, drawing on imagination, vision, and touch for a complete experience. By contrast, digital interaction is more like giving the computer instructions, translating our physical signals into electronic ones without that real-world sensory richness. Another interactive device I can imagine is a Heartbeat Pen that transforms emotional and physiological states into digital art. Instead of relying on traditional mouse or keyboard input, it would capture signals from the body itself, and different shapes and colors would appear depending on the data it detects.