Table Talk
Artist Statement: In a world where dialogue and debate have become so polarized and binary, how can we return to healthy conversation that encourages people to listen and respond to both sides, regardless of position or opinion? Table Talk is a conversational game that invites viewers to write yes/no questions around a living, breathing table, one that forces players to sit at opposite ends from each other, arms reaching away from the center. When players pick a question, they place it on the table and are then invited to press the yes/no buttons on their side (concealed from the audience and from the other person); the table then displays whether they agreed (green) or disagreed (red). Players discuss the outcome of their positions, and the table's reflection of their answers encourages them to acknowledge and talk through each other's position. Subtle interactions, like loading up questions, the table breathing, and even mapping the pressure of the yes/no answers, aim to start dialogue and make players interact with each other and the larger community (through the question cards).
Understanding the Logic
The first part of the physical computing involved understanding my inputs and outputs and what the outcome of each would be, because there were two inputs from two people (FSRs) that led to one output (a NeoPixel strip).
A key part was building the truth table from the inputs and adding more boolean gates to prevent "INVALID" answers (what if someone pressed both yes and no at the same time?). This process made me pay attention to the user's interaction and journey through my code: what if one person presses a button and the other doesn't? What if no buttons are being pressed? Understanding these edge cases helped me build a more cohesive set of interactions that fit together in a single narrative.
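The truth table can be sketched as a small decision function. This is an illustrative reconstruction, not the exact code from my sketch (the real version compares raw analog FSR readings against thresholds, as in the full code at the end of this post):

```cpp
#include <cassert>

// The four possible outcomes of one round, per the truth table.
enum Outcome { WAITING, INVALID, AGREE, DISAGREE };

// One player's FSR readings, already reduced to booleans by a threshold.
struct Player {
    bool yes;
    bool no;
};

// Map both players' button states to a single outcome.
Outcome decide(Player p1, Player p2) {
    // Pressing both buttons at once is invalid.
    if ((p1.yes && p1.no) || (p2.yes && p2.no)) return INVALID;
    // Wait until both players have answered.
    if (!(p1.yes || p1.no) || !(p2.yes || p2.no)) return WAITING;
    // Same answer means agree; different answers mean disagree.
    return (p1.yes == p2.yes) ? AGREE : DISAGREE;
}
```

Collapsing the input combinations into a few named outcomes made the edge cases (both buttons, no buttons, one player still deciding) much easier to reason about.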
Prototyping the Interaction with People
After understanding the underlying logic, I wanted to start user testing to see what changes I should make, whether to the coded interaction or the fabrication. In a test run with Petra and Evie, I observed the experience when both players closed their eyes, when they couldn't see each other for the entire conversation, and with eyes open. It was clear that seeing which answer the other person was going for would influence the outcome and make the exchange somewhat uncomfortable. This led to the fabrication choices of concealing the buttons from each player and putting distance between them, so that the interaction wasn't so close and open that both players could see.
Prototyping Fabrication
Above are scaled maquettes I made to simulate the table at a 3:1 scale. This was a rough blueprint for understanding the fabrication involved (all the parts), the assembly, and the need to solder different elements together.
Combining Interaction with Physical
Above, I incorporated the IR sensor to detect the question card and started coding interactions that allowed for the longevity of the device (it isn't a one-and-done interaction, but one that sets and resets every time it is activated, respecting that user journey). This was on top of thinking about the loading animation and how the answers come together for a dramatic reveal.
One key thing I needed to understand was the use of timers: how to time interactions for things that were almost animation-like.
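The pattern I kept reaching for can be sketched as a non-blocking interval timer built on the millis() idea. This standalone version takes the current time as a parameter so it can run off-Arduino; the struct and its names are my own, not from the final sketch:

```cpp
#include <cassert>

// A non-blocking interval timer: ask it on each loop() pass whether the
// interval has elapsed, instead of calling delay() and freezing everything.
struct IntervalTimer {
    unsigned long last = 0;   // timestamp of the last firing, in ms
    unsigned long interval;   // how often to fire, in ms

    explicit IntervalTimer(unsigned long ms) : interval(ms) {}

    // Returns true (and rearms) once each time the interval elapses.
    bool ready(unsigned long now) {
        if (now - last >= interval) {
            last = now;
            return true;
        }
        return false;
    }
};
```

On the Arduino itself, `now` would be `millis()`; giving each animation (breathing, loading) its own timer keeps any one of them from blocking the others.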
The Build
The build process took much longer than expected. I couldn't solder the wires until I was sure of the proportions from the physical 1:1 scale model. I wanted the structure to look seamless, but there was no wood available in a size large enough to cut a full side of my table, meaning I had to make each side in halves. This meant cutting the whole table in halves on the laser cutter and then gluing and sanding it down. It was a very tedious process, as the wood glue stained the wood and required a lot of sanding. Additionally, for the table top, I needed to clamp the wooden frame to the acrylic with epoxy, and I needed to cut out supports to hold the two sides together and keep the structure from collapsing.
Reflection
Ultimately, I'm quite pleased with how the project turned out (and that it turned out at all). The night before, a small debug I had planned got turned upside down and even ended up frying an Arduino board. I had tried using a while loop alongside timers and counters and overall had very disorganized code. Thanks to the help of Vishal (TA for Zach's previous class) and ChatGPT, I was able to clean up the code and figure out the problems. On the software side, while the underlying logic gate was simple, it was the interactions that took a lot of time to solidify. Given more time, I would have loved a display interaction for the final outcome where the color moves in gradients. Ultimately, I need to pseudocode my interactions well in advance and not be afraid of cleaning up code regularly.
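As a sketch of that gradient idea (hypothetical, not in the final code), the outcome color could be linearly interpolated across the strip from red at one end to green at the other:

```cpp
#include <cassert>
#include <cstdint>

// Linearly interpolate one 8-bit channel: t = 0 gives a, t = 255 gives b.
uint8_t lerp8(uint8_t a, uint8_t b, uint8_t t) {
    return (uint8_t)(a + (int)(b - a) * t / 255);
}

// Compute the red/green channels for pixel i of a strip of numPixels,
// sweeping from pure red (i = 0) to pure green (i = numPixels - 1).
// On the Arduino this would feed strip.setPixelColor(i, r, g, 0).
void gradientColors(int i, int numPixels, uint8_t &r, uint8_t &g) {
    uint8_t t = (uint8_t)((long)i * 255 / (numPixels - 1));
    r = lerp8(255, 0, t);
    g = lerp8(0, 255, t);
}
```

The same two-line interpolation could also be driven by time instead of pixel index, fading the whole strip from one color to the other for the reveal.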
On the fabrication side, I was also not completely satisfied with my output. What made me enjoy it were the small details (laser-etched text on each side, the branding I made for the question cards, presenting the project under certain lighting, etc.). However, the overall structure's craft is questionable, and I think understanding how the wiring fits inside the table would help a lot (I had wires "pushed into place" in some areas, and it was very trial-and-error to build internal structures that held the table together while letting the wires run).
Overall, I am satisfied with the project. Considering I started this class with zero knowledge of physical computing, being able to regularly take out an Arduino, write code, change it, and prototype is insane. For future goals, I want to follow this pattern of perhaps simpler logic gates while focusing on the tiny detail interactions that bring the experience together.
ChatGPT Convos
https://chat.openai.com/share/85008659-935e-4f92-ba61-be5520d806e0
https://chat.openai.com/share/07f70bc7-4525-45b5-9238-58409bdf0a4d
#include <Arduino.h>
#include <Adafruit_NeoPixel.h>
// PINS
const int no1 = A1;
const int yes1 = A0;
const int yes2 = A2;
const int no2 = A3;
const int proximity = A5;
const int neoPixelPin = 9;
// STRIP + TIMERS
const int numPixels = 50;
const int delayInterval = 1000;
const unsigned long loadingDelay = 100;
// THRESHOLDS
const int YES1_THRESHOLD = 70;
const int NO1_THRESHOLD = 30;
const int YES2_THRESHOLD = 30;
const int NO2_THRESHOLD = 30;
const int PROXIMITY_THRESHOLD = 400;
// GLOBAL VARIABLES
uint8_t r = 0;
uint8_t g = 0;
int yes1Amount = 0;
int no1Amount = 0;
int yes2Amount = 0;
int no2Amount = 0;
bool finished = false;
// INTERACTIONS
float waitAnimation = 25;
float wInc = 0.1;
// Construct the strip at global scope so its pixel buffer lives for the whole
// program (assigning a temporary inside setup() can leave a dangling buffer)
Adafruit_NeoPixel strip(numPixels, neoPixelPin, NEO_GRB + NEO_KHZ800);
void setup() {
// Initialize serial communication
Serial.begin(9600);
// Set pin modes
pinMode(yes1, INPUT);
pinMode(no1, INPUT);
pinMode(yes2, INPUT);
pinMode(no2, INPUT);
pinMode(proximity, INPUT);
// Initialize NeoPixel strip
strip.begin();
}
void loop() {
clearStrip();
readInputs();
if (checkQuestion()) {
if (answersIn()) {
clearStrip();
checkAgreement();
} else {
// Signal both players to input answers: light a few pixels at each end
for (int i = 0; i < numPixels / 7; i++) {
strip.setPixelColor(i, 255, 255, 255);
strip.setPixelColor(numPixels - 1 - i, 255, 255, 255); // last valid index is numPixels - 1
}
}
strip.show();
} else {
// Idle "breathing" animation while no question card is on the table
displayDecisionColor(waitAnimation, 0, 100 - waitAnimation);
waitAnimation += wInc;
if (waitAnimation < 25 || waitAnimation > 100) {
wInc = -wInc;
}
strip.show();
}
}
void checkAgreement() {
loading();
// Invalid: a player pressed both yes and no at once
if ((yes1Amount > YES1_THRESHOLD && no1Amount > NO1_THRESHOLD) ||
(yes2Amount > YES2_THRESHOLD && no2Amount > NO2_THRESHOLD)) {
clearStrip();
Serial.println("INVALID");
return;
}
// Map the combined pressure (the four summed FSR readings, capped at 1980) to 0-255
int total = yes1Amount + no1Amount + yes2Amount + no2Amount;
int more = constrain(map(total, 0, 1980, 0, 255), 0, 255);
if ((yes1Amount > YES1_THRESHOLD && yes2Amount > YES2_THRESHOLD) ||
(no1Amount > NO1_THRESHOLD && no2Amount > NO2_THRESHOLD)) {
// AGREE
r = 255 - more;
g = more;
displayDecisionColor(r, g, 0);
Serial.println("AGREE");
} else if ((yes1Amount > YES1_THRESHOLD && no2Amount > NO2_THRESHOLD) ||
(no1Amount > NO1_THRESHOLD && yes2Amount > YES2_THRESHOLD)) {
// DISAGREE
g = 255 - more;
r = more;
displayDecisionColor(r, g, 0);
Serial.println("DISAGREE");
} else {
clearStrip();
Serial.println("NEITHER");
}
// Hold the result on the strip until the question card is removed
while (checkQuestion()) {
strip.show();
}
finished = false;
}
void loading() {
unsigned long startTime = millis();
unsigned long currentTime = millis();
while (!finished && (currentTime - startTime) < delayInterval) {
int inc = (255 / (numPixels / 2));
for (int i = 0; i <= (numPixels / 2) && !finished; i++) {
strip.setPixelColor(i, 255 - (inc * i), 255 - (inc * i), inc * i);
strip.setPixelColor(numPixels - 1 - i, 255 - (inc * i), 255 - (inc * i), inc * i); // mirror pixel from the far end
strip.show();
delay(loadingDelay);
readInputs(); // Keep reading inputs during loading
if (i >= numPixels / 2) {
finished = true;
}
}
currentTime = millis();
}
}
void readInputs() {
yes1Amount = analogRead(yes1);
no1Amount = analogRead(no1);
yes2Amount = analogRead(yes2);
no2Amount = analogRead(no2);
}
bool answersIn() {
return (player1Pressed() && player2Pressed());
}
bool player2Pressed() {
return (yes2Amount > YES2_THRESHOLD || no2Amount > NO2_THRESHOLD);
}
bool player1Pressed() {
return (yes1Amount > YES1_THRESHOLD || no1Amount > NO1_THRESHOLD);
}
bool checkQuestion() {
return (analogRead(proximity) > PROXIMITY_THRESHOLD);
}
void clearStrip() {
for (int i = 0; i < numPixels; i++) {
strip.setPixelColor(i, 0, 0, 0);
}
strip.show();
}
void displayDecisionColor(uint8_t red, uint8_t green, uint8_t blue) {
// Light the whole strip, or only a centered band when no question is present
int startPixel = 0;
int endPixel = numPixels;
if (!checkQuestion()) {
startPixel = (numPixels / 2) - 5;
endPixel = (numPixels / 2) + 5;
}
for (int i = startPixel; i < endPixel; i++) {
strip.setPixelColor(i, red, green, blue);
}
}