Waiter Wick
Sarah Blosser
Instructor: Viola He
Our initial idea came from something we both thoroughly enjoy: eating out. Another point we had in mind when brainstorming was that, instead of making something like a game with various functionalities, we wanted to create something that addressed a pain point, with our artifact as the solution. Our specific pain point is the interaction between the waiter and the diner: the diner either gets interrupted mid-conversation or can't get the waiter's attention. We wanted to create a way for diners to signal the status of their dining experience, not only to waiters but to others around them. The diners are able to set the timeline of their meal in a way that works with the flow of their conversation. An inspiration for the waiter-indication device was the existing call bells some restaurants have on their tables, which play a sound when pressed to get the waiter's attention. We wanted to build on this interaction but refine it so that it didn't disrupt the restaurant's atmosphere with a loud sound; a light indication, we thought, would be more seamless and less disruptive. All in all, we wanted to create an artifact that enhances not only the interaction between the diners and the artifact, but also between the wait staff and the artifact, and between the waiters and the diners.
We knew that we wanted our project to have two signals: one that called the waiter over, and one that told the waiter through the artifact that the diners were mid-conversation. We didn't know whether both of those interactions should be manually set by the diners themselves, or just one or the other. If a signal wasn't set by the diners, we needed another way to show that the waiter should come or stay away. With these thoughts, we decided that the diners would manually call the waiter over, and the indication of whether the waiter should stay away would be set by the sound levels at the table. We chose the sensors accordingly: a touch sensor to call the waiter over, and a microphone sensor to gauge the conversation sound levels of those dining. Each sensor, we decided, would turn on an LED of a different color symbolizing a different scenario. Our original conception also included another element: when the waiter came over to serve the food, the candle, which we had decided would be the shape of our artifact, would move to make space for the served food. We wanted to incorporate some of the linear motion we learned in recitation using a stepper motor, along with the distance sensor we had also used previously in class. In terms of physical construction, we knew the artifact would be made of cardboard, but we also wanted to incorporate more elements we had at our disposal; for example, we put hot glue around the sides of the candle to depict melted wax, and added a vase with real flowers to complete the scene of a dining table.
At the beginning of our process, we had only decided on a candle as our artifact and nothing else. Within this framework, the microphone sensor and the touch sensor would sit outside the artifact; the only things embedded in the candle at this early stage were the LEDs, which would light up green to signal the waiter to come, and red when the waiter should stay away and not interrupt the meal. This meant we didn't yet have any ideas for where to put the two sensors. This is where the User Testing process was highly helpful: our peers and instructors gave us ideas for how to incorporate these sensors and how to expand the artifact past the candle itself. With these ideas, we decided to make a more comprehensive dining artifact, including a table, vase, menu, and the candle. We decided to embed the microphone sensor in the vase, which held real flowers, and embed the touch sensor in the menu. We also wanted to make the artifact more intuitive to use, so on the table we included a legend detailing what the lights mean. Depicted below is a rough sketch of what we wanted the end design to look like.
Moving to the process of actually building the circuit and writing the code, my partner and I split these responsibilities relatively evenly: I was mainly responsible for writing the code, and my partner was mainly responsible for building the circuit. As mentioned before, our initial idea was to include a stepper motor and a distance sensor in the circuit. While writing the code, I knew I wanted it to use state functions, so after the circuit was built, I tested each component separately in different sketches to confirm its functionality, then incorporated them into the state functions. But when testing the distance sensor and stepper motor, I ran into problems with the motor specifically: I couldn't make it go one rotation and back without continuing to repeat the motion. After trying for a couple of days to figure out the code, time constraints caused us to reconfigure and incorporate just the two sensors and the LEDs. Below is the sketch we ended up with, including our unsuccessful incorporation of the stepper motor and distance sensor.
#include <AccelStepper.h>

#define TOUCH_PIN 3      // digital touch sensor pin (in the menu)
#define GREEN_LED_PIN 9  // green LED pin
#define RED_LED_PIN 10   // red LED pin

int state = 1;
int analogPin = A0;  // sound (microphone) sensor analog input

// Pins and variables for the distance sensor and stepper motor we ultimately
// cut. Note: DIR_PIN was originally an active `int DIR_PIN = 3;`, which
// clashed with TOUCH_PIN on pin 3 and would have needed rewiring had we
// kept the stepper.
// int triggerPin = 6;
// int echoPin = 11;
// long distance;
// float smoothing = 0.05;
// float smoothed;
// int DIR_PIN = 7;
// int STEP_PIN = 4;
// int EN_PIN = 5;
// AccelStepper stepper(AccelStepper::DRIVER, STEP_PIN, DIR_PIN);

void setup() {
  Serial.begin(9600);
  pinMode(2, INPUT);               // sound sensor digital output pin (unused)
  pinMode(GREEN_LED_PIN, OUTPUT);  // green LED output pin
  pinMode(RED_LED_PIN, OUTPUT);    // red LED output pin
  pinMode(TOUCH_PIN, INPUT);       // touch sensor input
  // pinMode(echoPin, INPUT);      // distance sensor input; smooth its data
  // pinMode(triggerPin, OUTPUT);
  // pinMode(EN_PIN, OUTPUT);      // stepper motor enable
  // digitalWrite(EN_PIN, LOW);
  // stepper.setMaxSpeed(1000);
  // stepper.setAcceleration(500);
}

void loop() {
  // Originally planned as a state machine:
  // if (state == 1) { state1(); }
  // else if (state == 2) { state2(); }
  // else if (state == 3) { state3(); }
  state1();
  state2();
  // state3();
}

// Green LED: on while the table's conversation is above the sound threshold
void state1() {
  if (analogRead(analogPin) > 400) {
    digitalWrite(GREEN_LED_PIN, HIGH);
  } else {
    digitalWrite(GREEN_LED_PIN, LOW);
  }
}

// Red LED: on while the touch sensor in the menu is pressed
void state2() {
  if (digitalRead(TOUCH_PIN) == LOW) {
    digitalWrite(RED_LED_PIN, HIGH);
    Serial.println("Touch detected, Red LED ON");  // optional debug message
  } else {
    digitalWrite(RED_LED_PIN, LOW);
    Serial.println("Touch released, Red LED OFF");
  }
}

// Unused: would have combined the touch response with the distance-triggered
// stepper motion (moving the candle aside as the waiter approached)
void state3() {
  if (digitalRead(TOUCH_PIN) == LOW) {
    digitalWrite(RED_LED_PIN, HIGH);
  } else {
    digitalWrite(RED_LED_PIN, LOW);
  }
  // *THE FOLLOWING IS FOR THE STEPPER MOTOR FUNCTION*
  // digitalWrite(triggerPin, LOW);
  // digitalWrite(triggerPin, HIGH);
  // digitalWrite(triggerPin, LOW);
  // long duration = pulseIn(echoPin, HIGH, 17400);
  // distance = duration / 29 / 2;
  // smoothed = smoothed * (1.0 - smoothing) + distance * smoothing;
  // Serial.println(smoothed);
  // if (smoothed < 30) {  // the original had a stray semicolon after this if
  //   stepper.runToNewPosition(200);
  //   delay(1000);
  //   stepper.runToNewPosition(0);
  //   delay(1000);
  // } else {
  //   stepper.runToNewPosition(0);
  //   delay(1000);
  // }
}
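Looking back at the stepper problem, the motion kept repeating because the move-and-return code sat directly in `loop()`, which runs forever. A minimal sketch of the one-time motion we were after might look like this (untested on our hardware; the pin numbers here are assumptions for illustration, with DIR moved off pin 3 to avoid the touch sensor):

```
#include <AccelStepper.h>

const int STEP_PIN = 4;
const int DIR_PIN = 7;  // assumed pin, chosen to avoid clashing with TOUCH_PIN
const int EN_PIN = 5;

AccelStepper stepper(AccelStepper::DRIVER, STEP_PIN, DIR_PIN);
bool moved = false;  // guard so the motion runs only once

void setup() {
  pinMode(EN_PIN, OUTPUT);
  digitalWrite(EN_PIN, LOW);     // enable the driver
  stepper.setMaxSpeed(1000);     // steps per second
  stepper.setAcceleration(500);  // steps per second^2
}

void loop() {
  if (!moved) {
    // One rotation out (200 steps on a 1.8-degree motor), pause, and back.
    stepper.runToNewPosition(200);
    delay(1000);
    stepper.runToNewPosition(0);
    moved = true;  // without this flag, loop() would repeat the motion forever
  }
}
```

One caveat we'd have hit anyway: `runToNewPosition()` blocks until the move finishes, so the sketch can't read the distance sensor mid-move; a non-blocking approach using `moveTo()` and `run()` would be needed to combine the two.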
In terms of designing and building the artifact, we made sure to schedule times when we were both available so that this part could be done jointly. Overall, despite the challenges with incorporating the other components and the time constraints, the final product showed great improvement over the one we presented at the User Testing Session, especially in its design.
Our initial goal for the project was to create something that enhanced the experience of diners by solving the problem of awkward interruptions by waiters. I think we achieved that goal: we created an interaction between the diners and the artifact, with the microphone sensing the levels of conversation, as well as an interaction between the diners and the waiter, through the touch sensor. Ultimately the audience interacted with our project just as we expected, even though the microphone sensor was less sensitive than we wanted and the LEDs were relatively dim. If we had more time, we would've prioritized getting the stepper motor working so as to incorporate more lessons learned in class. I also would've liked to make the LEDs brighter somehow, because after soldering them to the wires, their brightness decreased a lot. Through all the setbacks and changes, I think I've learned a lot of lessons, most notably the importance of persistence and problem solving. I was stuck on the stepper motor and distance sensor code for quite a while, and that was taking up time we didn't have, so I had to learn that pivoting due to time constraints is sometimes a necessary part of design. Another large takeaway for me was the importance and value of feedback. An added perspective helps you take a step back, because when you've been inside a project for some time, it's hard to contextualize it, and that is where feedback provides its greatest value.