Sound to Light
What I wanted to learn, and why it interested me: I wanted to explore how light can be used as an interactive medium that responds directly to environmental inputs like sound. I’ve always been curious about experiential lighting: how light can shape mood, perception, and engagement in a space.
Final outcome: The final setup was an LED strip that reacts to sound, cycling through rainbow colors when the audio input passes a threshold and returning to white when the space is quiet. The outcome demonstrates a system in which environmental data (sound amplitude) directly informs a lighting pattern. The result is an engaging audio-to-visual experience, where each clap, voice, or noise creates a visible change in the color of the lights.
Images of final creative/exploratory output
LED strip demonstrating the programmed sequence, shifting from default white to rainbow colors as sound events are detected.
Real-time demonstration of the system: each clap triggers the LED strip to change color, translating sound into light.
Process images from development towards creative/exploratory output
Calibration stage, adjusting the microphone’s quiet baseline and refining the Arduino code to distinguish sound from background noise.
Early testing of sound-to-light mapping, where low thresholds caused the LEDs to change colors too quickly, showing the need for sensitivity tuning.
Process and reflection:
My process began with setting up the microphone sensor and learning how it outputs analog signals that the Arduino can read. From there, I focused on calibrating the input, since the microphone always has a “quiet” baseline that needs to be accounted for before meaningful changes can be detected. Once the input was stable, I moved on to programming the logic that mapped sound events to light changes, designing the system so that a sound would shift the LEDs to a new rainbow color and then return them to white after a short pause. This required several iterations of testing and debugging, as early versions either stayed stuck on white or cycled through the rainbow colors too quickly. Reflecting on the outcome, I realized that the technical side of calibrating signals was just as essential as the creative side of deciding how the lights should behave. It gave me a better appreciation of how physical computing can turn raw environmental data into engaging sensory experiences.
Technical details
Block diagram showing the flow of signals, from sound waves entering the microphone, through Arduino processing, to the LED strip producing visible light.
Electrical schematic showing how the microphone sensor connects to the Arduino and how the Arduino controls the LED strip to create sound-responsive lighting.
/*
60-223 Intro to Physical Computing, fall 2025
Domain-specific Skill Building exercise: Sound to Light

My code runs on an Arduino and connects to two main pieces of hardware:
a microphone sensor and a strip of NeoPixel LEDs. The microphone picks up
sound from the environment and sends an analog signal to the Arduino, which
processes the signal and decides whether the sound is loud enough to count
as an “event.” When a sound is detected, the Arduino tells the LED strip to
change to the next color in a rainbow sequence. If no sound is present, the
LEDs default to white. In simple terms, it turns sound into light by making
the LEDs react whenever you clap, talk, or make noise nearby.

Pin mapping:

Arduino pin | role   | details
------------|--------|------------------
A1          | input  | microphone
3           | output | LED strip (data)

Released to the public domain by the author, September 2025
Sarah Fernandes, sarahfer@andrew.cmu.edu
*/
#include <Adafruit_NeoPixel.h>

#define LED_PIN 3
#define NUM_LEDS 15  // number of LEDs in strip
#define MIC_PIN A1

Adafruit_NeoPixel strip(NUM_LEDS, LED_PIN, NEO_GRB + NEO_KHZ800);

int colorIndex = 0;
int threshold = 49;             // adjust based on mic sensitivity
unsigned long lastBeat = 0;
unsigned long beatDelay = 100;  // ms to ignore false multiple triggers

// Rainbow colors
uint32_t rainbowColors[] = {
  strip.Color(255, 0, 0),     // Red
  strip.Color(255, 165, 0),   // Orange
  strip.Color(255, 255, 0),   // Yellow
  strip.Color(0, 255, 0),     // Green
  strip.Color(0, 0, 255),     // Blue
  strip.Color(75, 0, 130),    // Indigo
  strip.Color(238, 130, 238)  // Violet
};
int numColors = sizeof(rainbowColors) / sizeof(rainbowColors[0]);

void setup() {
  strip.begin();
  strip.show();  // initialize all LEDs off
  Serial.begin(9600);
}

void loop() {
  int micValue = analogRead(MIC_PIN);
  int baseline = 512;  // typical ADC center; value found through ChatGPT
  int amplitude = abs(micValue - baseline);  // distance from center
  Serial.println(amplitude);

  if (amplitude > threshold && millis() - lastBeat > beatDelay) {
    // Sound detected → next rainbow color
    colorIndex = (colorIndex + 1) % numColors;
    showColor(rainbowColors[colorIndex]);
    lastBeat = millis();
  } else if (amplitude <= threshold) {
    // Quiet → white
    showColor(strip.Color(255, 255, 255));
  }
}

// Set every pixel on the strip to one color and push the update out
void showColor(uint32_t color) {
  for (int i = 0; i < NUM_LEDS; i++) {
    strip.setPixelColor(i, color);
  }
  strip.show();
}