Video Documentation
Conception and Design
The project, an interactive glove symbolizing infinite power, is inspired by the Infinity Gauntlet from the Marvel films. However, unlike Thanos's Gauntlet, which is designed to destroy, our glove leaves users with a profound dilemma: "If you had infinite power, would you choose to destroy or to create?"
Our most crucial design choice was to create interactive actions for both the creation and destruction modes. For the creation mode, we originally intended to use light strips and sound effects to create the illusion of a river, lift our hands to make trees grow, and slightly bend our fingers to produce lifelike butterflies and birds. For the destruction mode, the main action is pounding our fists to produce lightning and demonic noises. The relationship between the two modes is parallel: there is no chronological sequence between creation and destruction. They can intersect at will, which also reflects the interlaced, alternating relationship between the two. The final physical form of our work is a gauntlet combining a 3D-printed shell and a knitted glove, along with a projection screen, so the user can easily understand that the first step is to put on the glove and observe the effects on the screen.
During the user test, we found that users felt a little confused after putting on the glove and did not know what to do with it. Therefore, we created a cardboard sign with the main interactive actions written on it as instructions. This turned out to be very helpful in showing users how to interact with the gauntlet, as evidenced by our observations during the IMA show. In addition, we found that our flex sensor was not stable enough to produce the expected butterflies and birds at the expected time, so after adjusting the threshold value to no avail, we replaced it with a pressure sensor, whose readings are more accurate and stable.
Fabrication and Production
Overall Idea: The Arduino sensors serve as the primary input end of our work, while Processing's visual and audio effects serve as the output end. Before choosing sensors with the appropriate functionality, we first defined the interactive behaviors we hoped to accomplish, such as snapping, hand lifting, and finger bending.
1. Arduino
Muscle Sensor:
We initially attempted to use an EMG muscle sensor to detect arm muscle contractions caused by finger snapping. However, the sensor's value changes were too small to reliably capture the snapping motion. Additionally, muscle movements unrelated to snapping often triggered false detections. The sensor also required adhesive attachment to the arm, which proved impractical for repeated use as the adhesive would lose its stickiness. We then tried a different sEMG sensor secured with velcro, but it still produced insufficiently significant value changes. Ultimately, we abandoned the snapping interaction and decided not to use the muscle sensor.
From ADXL345 Accelerometer to MPU-6050 Accelerometer:
We considered using an accelerometer since we wished to identify fist-punching and hand-lifting motions. At first, we employed the ADXL345, which is limited to detecting acceleration. However, if punching and hand-raising are detected only from the combined acceleration in three directions, the two detections easily conflict. We therefore moved to the MPU-6050, which can measure both acceleration and angular velocity on all three axes at the same time. We use the angular velocity on the y-axis together with the acceleration on the z-axis to detect the hand-raising action, and we use the acceleration on the x- and y-axes to identify the fist-punching motion that takes place on the horizontal plane. In this way, detection of both actions is combined into a single sensor.
ADXL345 Accelerometer
MPU-6050 Accelerometer
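In simplified form, the two detection conditions look like the sketch below, condensed from our full Arduino code in the Appendix:
// Condensed detection logic; see the full Arduino sketch in the Appendix.
sensors_event_t a, g, temp;
mpu.getEvent(&a, &g, &temp); // acceleration in m/s^2, angular velocity in rad/s

// Hand lifting: z acceleration between -5 and 5 m/s^2 plus rotation about the y-axis,
// sustained for several consecutive readings
if (a.acceleration.z > -5 && a.acceleration.z < 5 && g.gyro.y > thresholdLiftGyroY) {
  liftCounter++;
} else {
  liftCounter = 0;
}
int liftDetected = (liftCounter >= sustainCountThreshold) ? 1 : 0;

// Punching: sudden change in the acceleration magnitude on the horizontal x-y plane
float xyMagnitude = sqrt(a.acceleration.x * a.acceleration.x + a.acceleration.y * a.acceleration.y);
int punchDetected = (abs(xyMagnitude - preXY) > thresholdPunch) ? 1 : 0;
preXY = xyMagnitude;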
From Flex Sensor to Pressure Sensor:
Our original goal was to make butterflies, birds, and other creatures appear on the screen by bending our fingers. However, the flex sensor we used proved unstable, as its values fluctuated even without being touched. To address this, we switched to a pressure sensor and replaced the finger-bending motion with a two-finger pinch.
Pinch
Punch
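On the Arduino side, the pinch detection is simply a threshold on the pressure sensor's analog reading plus a rising-edge counter, as in this condensed excerpt of the code in the Appendix:
// Pinch detection: threshold the force reading and count rising edges.
forceValue1 = analogRead(forcePin1);           // pressure sensor on A0
val = (forceValue1 > threshold1) ? HIGH : LOW; // threshold1 = 800 in our sketch
pinch = (val == HIGH);                         // 1 while the fingers are pinched
if (preVal == LOW && val == HIGH) {
  count++;                                     // one new pinch; Processing uses count to cycle images
}
preVal = val;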
Neopixels:
Originally, we planned to integrate Neopixels for dynamic light effects to enhance the overall visual experience and extend the effects of creation and destruction into the physical environment. For example, my design was that when the flex sensor (later replaced by a pressure sensor) was bent, the Neopixel lights would light up one by one, simulating the flow of water in a river. My teammate Yanny designed the lightning effect for Destruction to match the filters on the screen. However, incorporating the Neopixel code caused significant delays in the data transfer between Arduino and Processing, confusing the order of the other existing effects. For these reasons, we ultimately excluded the Neopixels.
Initial design for Creation Neopixels
Initial code for Destruction Neopixels
2. Processing
We used serial communication from Arduino to Processing, with changes in the Arduino sensor values controlling the video, audio, and image effects in Processing.
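Concretely, Arduino prints one comma-separated line per sample (lift, punch, pinch, count), and Processing splits it into an array, as in this condensed excerpt of getSerialData() from the Appendix:
// Parse one serial line such as "0,1,0,3" into arduino_values[].
String in = serialPort.readStringUntil(10); // 10 = '\n'
if (in != null) {
  String[] serialInArray = split(trim(in), ",");
  if (serialInArray.length == NUM_OF_VALUES_FROM_ARDUINO) {
    for (int i = 0; i < serialInArray.length; i++) {
      arduino_values[i] = int(serialInArray[i]);
    }
  }
}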
Integration:
The biggest challenge was integrating the code for multiple effects in the creation and destruction modes without conflicts or confusion. We used millis() to track start and end times, ensuring independent timelines operated without interference. Each effect, such as video playback and butterfly animations, was coded and annotated separately, making debugging and future adjustments more efficient. To enhance the overall experience, we implemented a state system with a keypress to provide a clear and cohesive starting point.
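The skeleton of that state system is small; the sketch below is a simplified excerpt of the Processing code in the Appendix:
// Pressing 'k' starts the introduction video (state 1), which hands off to the main scene (state 2).
void draw() {
  getSerialData();
  if (state == 0) drawState0(); // idle
  if (state == 1) drawState1(); // introduction video
  if (state == 2) drawState2(); // creation/destruction scene
}

void keyPressed() {
  if (key == 'k') {
    state = 1;
    introStartTime = millis(); // each effect keeps its own millis() timestamp
  }
}

void drawState1() {
  if (myMovie0.available()) myMovie0.read();
  image(myMovie0, 0, 0, width, height);
  myMovie0.play();
  if (millis() - introStartTime > myMovie0.duration() * 1000) {
    myMovie0.stop();
    state = 2; // intro finished, switch to the interactive scene
  }
}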
Butterfly and Birds:
The technique for making the butterflies appear to fly across the screen is to set the size and position of the PNG images with the random() function and to redraw the background video in each draw loop. This ensures that the previous butterfly images are covered, creating the illusion of motion.
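In practice, each draw loop redraws the video frame first and then stamps one randomly sized, randomly placed image, condensed here from drawState2() in the Appendix:
// Redraw the background video, then place one image at a random position and size.
if (myMovie.available()) myMovie.read();
image(myMovie, 0, 0, width, height); // covers last frame's butterflies
if (butterflyFlying) {
  if (count % 3 == 0) {
    image(butterflyImage1, random(1, 1200), random(0, 400), random(100, 150), random(100, 150));
  } else if (count % 3 == 1) {
    image(butterflyImage2, random(1, 1200), random(0, 400), random(100, 150), random(100, 150));
  } else {
    image(birdsImage3, random(1, 1200), random(0, 400), random(100, 150), random(100, 150));
  }
}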
Lightning:
The lightning effect is achieved through a rapid sequence of filter transitions driven by a counter. When a punch is detected, the flash counter starts at zero and triggers a series of filter changes; once the maximum number of flashes is reached, it resets, ready for the next punch.
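The flash logic alternates between an inverted frame and the plain video frame for a fixed number of flashes, as in this condensed version of ActivateLightning() from the Appendix:
// Alternate INVERT flashes with the plain frame until maxFlashes is reached.
if (lightningActive && flashCount < maxFlashes) {
  if (((millis() - lightningStartTime) / flashInterval) % 2 == 0) {
    filter(INVERT); // flash: invert the current frame
  } else {
    image(myMovie, 0, 0, width, height); // restore the normal frame
  }
  if (millis() - lightningStartTime > flashInterval * (flashCount + 1)) {
    flashCount++; // advance to the next flash
  }
} else {
  lightningActive = false; // sequence finished until the next punch
}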
3. Fabrication
Our glove consists of a 3D-printed gauntlet shell and a knitted gauntlet. We laser-cut a box for the Arduino that sits on the back of the hand. To symbolize destruction and creation, the shells are printed in white, polished with sandpaper, and then painted in either red or dark blue. At first we skipped the sanding and painted directly with acrylic paint, but the paint flaked off easily after it dried.
Conclusion
This project creates an immersive experience for users to explore the transformative power of creation and destruction, provoking reflection on humanity's relationship with power and responsibility. Many users commented that the dual experience of creating life (e.g., growing trees) and invoking destruction (e.g., summoning lightning) vividly demonstrated the intricate relationship between creation and destruction. The interaction largely worked as intended, with minor variations, such as users raising their hands too quickly and triggering destruction instead of creation. Interaction is defined as 'a dynamic process where subjects exchange and process information, leading to specific responses.' In our design, user actions elicited distinct effects with immediate feedback, motivating further exploration. For instance, trees would stop growing after five seconds, prompting users to either persist or try alternative actions. To enhance interactivity, I would replace the current system with one that assigns effects to specific finger gestures, add Neopixel lights to mimic flowing rivers, and map punch velocity to varied destruction levels, from lightning strikes to burning trees to ash.
From this project, I learned two important lessons from success and one from failure. First, when working with multiple sensors and effects, testing each component individually before integration significantly improves success rates and debugging efficiency. Second, for complex coding logic, breaking it down into smaller parts, testing them separately, and saving every version of the code while summarizing resolved and unresolved issues greatly enhances coding efficiency. From failure, I learned to embrace trial and error as an integral part of the process. Spending hours resolving a single issue or experimenting with a sensor all afternoon, only to abandon it, taught me to view experimentation as a valuable learning experience rather than merely a means to an end.
Disassembly
Appendix
TreeSound
https://youtu.be/HkHL0DIGOfs?si=m32IKLF7R9uQl6a-
Evil Laugh
https://youtu.be/0SQazmrMKqU?si=l4NrZhLwrE2BH_aA
TreeMovie
https://youtu.be/oc3sKCGO-H0?si=b4N2wLcQESEy2LeU
Gauntlet
https://www.thingiverse.com/thing:2505737
Butterfly and Birds
Bubble sound effect
Circuit Diagram
Arduino Code
#include <Adafruit_MPU6050.h>
#include <Adafruit_Sensor.h>
#include <Wire.h>
Adafruit_MPU6050 mpu;
//MPU 6050 sensor
float ax, ay, az, gy;
const float thresholdPunch = 5.0; // Threshold for XY acceleration magnitude (punch detection)
// const float thresholdLiftAccelZ = 2.5; // Threshold for upward acceleration (hand lifting)
const float thresholdLiftGyroY = 1; // Threshold for gyroscope Y (hand lifting)
const int sustainCountThreshold = 3; // Number of consistent readings to confirm lift
float preXY = 0.0; // Previous XY magnitude
int liftCounter = 0; // Counter for sustained lifting motion
unsigned long previousTime = 0;
const unsigned long interval = 20; // Sampling interval in ms
//force sensors
int forcePin1 = A0;
int forceValue1 = 0;
int threshold1 = 800;
int val;// whether the values of the force sensor are above the thresholds
int preVal = LOW;
int count = 0;// count the times of clenching
bool pinch = 0;
int punchDetected = 0;
//Neopixels
#include <FastLED.h>
#define NUM_LEDS 60 // How many LEDs in your strip?
#define DATA_PIN 3 // Which pin is connected to the strip's DIN?
CRGB leds[NUM_LEDS];
// int next_led = 0; // 0..NUM_LEDS-1
// byte next_col = 0; // 0..2
// byte next_val[3]; // temporary storage for next HSV value
void setup() {
Serial.begin(115200);
while (!Serial) {
delay(10);
}
if (!mpu.begin()) {
Serial.println("Failed to find MPU6050 chip");
while (1) {
delay(10);
}
}
mpu.setAccelerometerRange(MPU6050_RANGE_16_G);
mpu.setGyroRange(MPU6050_RANGE_250_DEG);
mpu.setFilterBandwidth(MPU6050_BAND_21_HZ);
Serial.println("MPU6050 initialized.");
delay(100);
//Neopixels
FastLED.addLeds<NEOPIXEL, DATA_PIN>(leds, NUM_LEDS);
FastLED.setBrightness(5); // external 5V needed for full brightness
leds[0] = CRGB::Red;
FastLED.show();
delay(3000);
leds[0] = CRGB::Black;
FastLED.show();
}
void loop() {
//forcesensors
forceValue1 = analogRead(forcePin1);
if (forceValue1 > threshold1) {//only use one force sensor now
val = HIGH;
pinch = 1;
}
else {
val = LOW;
pinch = 0;
}
if (preVal == LOW && val == HIGH) {//when you clench, it counts once
count++;
}
preVal = val;
delay(50);
//if (millis() - lastTime > interval2) {
//// lastTime = millis(); //replace delay with millis
// MPU 6050 sensor
unsigned long currentTime = millis();
if(currentTime - previousTime >= interval) {
previousTime = currentTime;
sensors_event_t a, g, temp;
mpu.getEvent(&a, &g, &temp);
ax = a.acceleration.x;
ay = a.acceleration.y;
az = a.acceleration.z; // Include Z-axis acceleration
gy = g.gyro.y;
// Detect sustained hand lifting
int liftDetected = 0;
if (az > -5 && az < 5 && gy > thresholdLiftGyroY) {
liftCounter++;
if (liftCounter >= sustainCountThreshold) {
liftDetected = 1; // Hand lifted
liftCounter = sustainCountThreshold; // Prevent overflow
}
} else {
liftCounter = 0; // Reset counter if conditions not met
}
// Calculate XY-plane acceleration magnitude
float xyMagnitude = sqrt(ax * ax + ay * ay);
// Detect punch
if (abs(xyMagnitude - preXY) > thresholdPunch) {
punchDetected = 1; // Punch detected
}else{
punchDetected = 0;
}
preXY = xyMagnitude;
// MPU6050
Serial.print(liftDetected); // arduino_values[0] = lift detection
Serial.print(",");
Serial.print(punchDetected); // arduino_values[1] = punch detection
Serial.print(",");
//force sensors
Serial.print(pinch);
Serial.print(",");
Serial.print(count);
Serial.println();
} // end of sampling interval check
} // end of loop()
Processing Code
import processing.serial.*;
import processing.video.*;
import processing.sound.*;
Serial serialPort;
int NUM_OF_VALUES_FROM_ARDUINO = 4; // Number of values sent from Arduino
int arduino_values[] = new int[NUM_OF_VALUES_FROM_ARDUINO]; // Array to store Arduino data
int state = 0;
Movie myMovie0;//introduction video
Movie myMovie;
SoundFile TreeSound;
SoundFile Thundering;
SoundFile Butterfly;
//Timers
long introStartTime = 0;
long movieStartTime = 0;
long lightningStartTime = 0;
long ThunderingStartTime = 0;
//booleans
boolean introduction = false;
boolean isMoviePlaying = false;
boolean isTreeSoundPlaying = false;
boolean isThundering = false;
boolean lightningActive = false;
boolean butterflyFlying = false;
//Lightning stuff
int flashInterval = 5; // Interval between lightning flashes (ms)
int maxFlashes = 10; // Total number of flashes per lightning event
int flashCount = 0; // Current flash count
//Butterfly stuff
int bend;
int count;
int number;//the numbers of different pictures
PImage butterflyImage1;
PImage butterflyImage2;
PImage birdsImage3;
void setup() {
size(1600, 900);
printArray(Serial.list());
serialPort = new Serial(this, "COM15", 115200);
//Files
myMovie0 = new Movie(this, "introduction.mov");
myMovie = new Movie(this, "TreeNoSound.mov");
TreeSound = new SoundFile(this, "TreeSound.MP3");
Thundering = new SoundFile(this, "Lightning_and_EvilLaughing.MP3");
Butterfly = new SoundFile(this, "bubble.wav");
butterflyImage1 = loadImage("butterfly1.png");
butterflyImage2 = loadImage("butterfly2.png");
birdsImage3 = loadImage("birds.png");
}
void draw() {
getSerialData();
if (state == 1) {
drawState1();
}
if (state == 2) {
drawState2();
}
if (state == 0) {
drawState0();
}
//println(state);
//call the state function
//println(lightningActive);
bend = arduino_values[2];
count = arduino_values[3];
number = arduino_values[3]%3;//the numbers of different pictures
}
void drawState0() {
fill(0);
}
void drawState1() {
//if (introduction == true) {
// introduction = false;
if (myMovie0.available()) {
myMovie0.read();
}
image(myMovie0, 0, 0, width, height);
myMovie0.play();
if (millis() - introStartTime > myMovie0.duration() *1000) {
myMovie0.stop();
// println("yes");
state = 2;
}
}
void drawState2() {
// Render video frame
if (myMovie.available()) {
myMovie.read();
}
image(myMovie, 0, 0, width, height);
//**Handlifting part
if (arduino_values[0] == 1) {//Handlifting is detected
playTreeMovie();
movieStartTime = millis();
if (!isThundering) {
playTreeSound();
} else {//isThundering = true
isThundering = false;
Thundering.pause();
playTreeSound();
}
}
//Stop the movie 4s after the handlifting
if (millis() - movieStartTime > 4000) {
myMovie.pause();
isMoviePlaying = false;
}
//**Punching part
if (arduino_values[1] == 1) {
pauseTreeMovie();
pauseTreeSound();
playThundering();
ActivateLightning();
if (isThundering) {
isThundering = false;
playThundering();
}
}
//**Restart
if (isThundering && millis()- ThunderingStartTime > Thundering.duration()*1000) {//convert the duration into ms
Thundering.stop();
playTreeSound();
isThundering = false;
}
if (bend == 1) { //does not include variable y yet
butterflyFlying = true;
println( butterflyFlying);
} else {
butterflyFlying = false;
}
//Butterfly1
if (butterflyFlying == true) {
if (count%3 == 0) {
image(butterflyImage1, random(1, 1200), random(0, 400), random(100, 150), random(100, 150));
delay(100);
}
if (count%3 == 1) {
image(butterflyImage2, random(1, 1200), random(0, 400), random(100, 150), random(100, 150));
delay(100);
}
if (count%3 == 2) {
image(birdsImage3, random(1, 1200), random(0, 400), random(100, 150), random(100, 150));
delay(100);
}
falseButterfly();
}
}
void falseButterfly() {
if (butterflyFlying == true && bend == 0) {
butterflyFlying = false;
}
}
void playTreeMovie() {
if (!isMoviePlaying) {
myMovie.play();
isMoviePlaying = true;
}
}
void pauseTreeMovie() {
if (isMoviePlaying) {
myMovie.pause();
isMoviePlaying = false;
}
}
void playTreeSound() {
if (!isTreeSoundPlaying) {
TreeSound.play();
isTreeSoundPlaying = true;
}
}
void pauseTreeSound() {
if (isTreeSoundPlaying) {
TreeSound.pause();
isTreeSoundPlaying = false;
}
}
void playThundering() {
if (!isThundering) {
ThunderingStartTime = millis();
Thundering.loop();
isThundering = true;
}
}
void ActivateLightning() {
lightningActive = true;
lightningStartTime = millis(); // Record lightning start time
flashCount = 0; // Reset flash count
if (lightningActive) {
if (flashCount < maxFlashes) { // Perform flashes
if (((millis() - lightningStartTime) / flashInterval) % 2 == 0) {
filter(INVERT); // Apply lightning effect
} else {
image(myMovie, 0, 0, width, height); // Remove effect by redrawing video
}
// Increment flash count after each interval
if (millis() - lightningStartTime > flashInterval * (flashCount + 1)) {
flashCount++;
}
} else { // End lightning effect
lightningActive = false;
}
}
}
void getSerialData() {
String in = serialPort.readStringUntil( 10 ); // 10 = '\n' Linefeed in ASCII
if (in != null) {
print("From Arduino: " + in);
String[] serialInArray = split(trim(in), ",");
if (serialInArray.length == NUM_OF_VALUES_FROM_ARDUINO) {
for (int i=0; i<serialInArray.length; i++) {
arduino_values[i] = int(serialInArray[i]);
if (arduino_values[2] == 1 && !Butterfly.isPlaying()) {
Butterfly.loop();
} else if (arduino_values[2] == 0 && Butterfly.isPlaying()) {
Butterfly.stop();
}
}
}
}
}
void keyPressed() {
if (key == 'k') {
state = 1;
introStartTime = millis();
}
}
Acknowledgements
Special thanks to my partner Yanny for bringing this Infinity Gauntlet project to life together. I would also like to express my gratitude to Professor Andy for his invaluable guidance in polishing the concept, selecting sensors, and addressing technical challenges.