ECE153B Project
Group Members: Andrew Chen, Aaron Sin
Project Demo Video
Code can be found at the Github page: https://github.com/sqwertyl/ece153b_final_project
Proposal
The goal is to build a drink-dispensing robot that can move around a room to dispense drinks. The robot also senses the distance from itself to the cup to make sure it doesn't leave a mess everywhere. It will be controlled by a Wii Nunchuk, using the joystick to move and the buttons to dispense.
Block Diagram
Peripherals
Ultrasonic Sensor
Bluetooth Module
Motors
Servos
Temperature Sensor
LCD Panel
Protocols
UART: Bluetooth
I2C: Temperature sensor
SPI: LCD Panel
PWM: Motors, Servos, Ultrasonic Sensor
Details
Movements
Movement inputs are received via Bluetooth from a computer.
Movements are carried out by four motors acting as wheels.
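As a rough sketch of how a joystick reading could become a motor command, the function below maps one Nunchuk joystick axis (an 8-bit value, centred near 128) to a signed PWM duty in percent. The centre value and dead-zone width are assumptions, not measured calibration data, and the duty would still have to be loaded into a timer compare register on the actual board.

```c
#include <stdint.h>

/* Assumed joystick calibration: centre ~128, small dead zone so the
 * robot does not creep when the stick is released. */
#define JOY_CENTRE   128
#define JOY_DEADZONE 8

/* Map a raw axis reading (0-255) to a signed duty cycle in percent
 * (-100..100).  Negative values would drive the motors in reverse. */
int joystick_to_duty(uint8_t axis)
{
    int offset = (int)axis - JOY_CENTRE;
    if (offset > -JOY_DEADZONE && offset < JOY_DEADZONE)
        return 0;                       /* inside dead zone: stop */
    /* Scale the remaining travel to -100..100 and clamp. */
    int duty = (offset * 100) / (JOY_CENTRE - JOY_DEADZONE);
    if (duty > 100)  duty = 100;
    if (duty < -100) duty = -100;
    return duty;
}
```

For example, a centred stick returns 0, full deflection returns ±100, and half deflection returns roughly ±50.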
Dispensing
When the ultrasonic sensor detects a distance less than a threshold D (to be tuned later), the robot dispenses the water.
The water is dispensed by turning the servo to open the cap.
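The distance check above can be sketched as follows, assuming an HC-SR04-style sensor where the echo pulse width is measured with a timer (sound covers roughly 58 µs per centimetre of round trip). The 10 cm threshold is a placeholder for D, not the final tuned value.

```c
#include <stdint.h>
#include <stdbool.h>

/* Placeholder for the dispense threshold D; to be tuned on the robot. */
#define DISPENSE_THRESHOLD_CM 10U

/* Convert an HC-SR04 echo pulse width in microseconds to centimetres.
 * The round trip of sound (~343 m/s) works out to ~58 us per cm. */
uint32_t echo_to_cm(uint32_t echo_us)
{
    return echo_us / 58U;
}

/* Dispense only when a cup is close enough to catch the drink. */
bool should_dispense(uint32_t echo_us)
{
    return echo_to_cm(echo_us) < DISPENSE_THRESHOLD_CM;
}
```

On the board, `should_dispense()` would gate the servo command that opens the cap.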
Display
The LCD panel displays a smiley robot face.
It also shows the drink temperature.
Structure
All interfacing code will be written in C, using Keil uVision to prototype and produce the final product. UART will be used to communicate with the robot over Bluetooth. I2C will fetch readings from the temperature sensor, which measures the temperature of whatever fluid is stored. SPI will drive the LCD panel, along with whatever libraries we need to display custom images. PWM will control the motors and servos so the robot can move and dispense the fluid.
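To illustrate the I2C temperature path, the helper below converts the two raw bytes read from the sensor into tenths of a degree Celsius. It assumes a TMP102-style 12-bit register layout (MSB first, left-justified, 0.0625 °C per LSB); the actual sensor in the build may use a different format.

```c
#include <stdint.h>

/* Convert the two raw bytes read over I2C into deci-degrees Celsius,
 * assuming a TMP102-style 12-bit, left-justified, MSB-first layout.
 * Integer result avoids pulling floating point onto the MCU. */
int16_t raw_to_decicelsius(uint8_t msb, uint8_t lsb)
{
    /* Sign-extending arithmetic shift recovers the signed 12-bit value. */
    int16_t raw = (int16_t)(((uint16_t)msb << 8) | lsb) >> 4;
    /* 1 LSB = 0.0625 C, so tenths of a degree = raw * 5 / 8. */
    return (int16_t)((raw * 5) / 8);
}
```

For example, a reading of 0x19 0x00 corresponds to 25.0 °C (250 deci-degrees), which could then be formatted onto the LCD.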
Responsibilities
Aaron Sin: SPI & PWM protocols
Andrew Chen: UART & I2C protocols, will help with programming graphics onto LCD display
We will both design the robot structure where the components will be placed.
We will source components from whatever we have on hand and purchase the rest as needed.
Potential Improvement
As the project is in progress, here are a few improvements that may or may not be implemented, depending on time:
The robot could move around without a person manually controlling it. When it detects a cup in the environment, it would move toward it and dispense a drink. This requires some sort of visual/image sensor.
Movement inputs may also be received from an Android phone app. The app could even implement features such as restart, reset, lock, unlock, and sleep.
Updates and Setbacks
Due to time constraints, we had to cut back on some objectives, such as Bluetooth control and the LCD screen. Instead, we opted to control the robot through the Wii Nunchuk and use UART to send debug information.
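Since the Nunchuk became the sole controller, decoding its 6-byte I2C report is central. The sketch below uses the commonly documented byte layout (joystick X/Y in bytes 0 and 1, C and Z buttons active-low in bits 1 and 0 of byte 5, valid after the "unencrypted" init sequence); the struct and function names are our own, not from a library.

```c
#include <stdint.h>
#include <stdbool.h>

/* Decoded view of one Nunchuk report; accelerometer bytes are ignored. */
typedef struct {
    uint8_t joy_x;
    uint8_t joy_y;
    bool    c_pressed;
    bool    z_pressed;
} nunchuk_state_t;

/* Decode a 6-byte Nunchuk report read over I2C.  Byte layout assumed:
 * [0] = joystick X, [1] = joystick Y, [5] bits 1/0 = C/Z (active low). */
nunchuk_state_t nunchuk_decode(const uint8_t buf[6])
{
    nunchuk_state_t s;
    s.joy_x     = buf[0];
    s.joy_y     = buf[1];
    s.z_pressed = (buf[5] & 0x01) == 0;  /* bit clear = pressed */
    s.c_pressed = (buf[5] & 0x02) == 0;
    return s;
}
```

The decoded joystick values feed the motor control, and the buttons trigger dispensing.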
Aaron Sin
aaronsin@ucsb.edu
Andrew Chen
andrew234@ucsb.edu