My project is an eight-legged walking robot inspired by the Theo Jansen mechanism, controlled through a mobile phone. Unlike traditional wheeled robots that struggle on rough terrain, this design walks in a way that feels more natural and adaptable. What makes it even more exciting is that it won’t just move mechanically—it will have a sense of personality, showing expressions on a screen and reacting with sounds, making the interaction playful and engaging.
I care about this project because legged motion has always fascinated me. It brings together engineering and art, creating something that feels alive—more like a creature than a machine. At the same time, I wanted to reimagine what a remote-controlled toy could look like. Instead of just another car with wheels, why not a robot that walks on eight legs? That complexity can spark curiosity, especially in children, by introducing them to richer mechanisms early on. I believe exposure like this can broaden their imagination, fuel creativity, and inspire them to explore how things work.
The inspiration came from Theo Jansen’s Strandbeests, those mesmerizing kinetic sculptures that stride gracefully across beaches without using wheels. Adapting that concept into a playful, controllable robot feels like bringing together functionality, creativity, and imagination into one project.
Software
Software Used: Fusion 360
Fusion 360 was the main tool for designing the robot. I used it to create the 3D models of the structure, simulate the Theo Jansen mechanism, and ensure that the moving parts had proper clearances. It also allowed me to assemble the full robot digitally before starting the physical build, saving time and avoiding design errors.
Website
Website Used: GrabCAD
GrabCAD provided access to accurate CAD models of electronic components such as the Arduino Uno, ultrasonic sensor, and DC yellow motors. Importing these models into Fusion 360 helped me check proper fit, spacing, and integration of electronics with the mechanical frame, ensuring everything would align correctly in the final assembly.
Source of Inspiration
System Parts
1) Theo Jansen Mechanism
The walking mechanism is composed of several link assemblies that combine to form the Theo Jansen leg system. Each part was modeled in Fusion 360 using precise link lengths to ensure a smooth and stable walking motion. Below are the links' names and dimensions:
After extruding each sketch to a thickness of 3 mm and completing the assembly, the following structure is obtained:
2) Theo Jansen Spur Gear Package
Each gear was designed with a thickness of 6 mm. Since the laser cutter works with 3 mm sheets, each gear was cut twice and the two layers were stacked. An additional 3 mm extension was added, along with a 2 mm 3D-printed spacer, to ensure proper alignment and spacing.
Module (m): 2.914 mm
Number of Teeth (z): 14
Tool Used: “Spur Gear” generator add-in in Fusion 360
After assembling these parts along with the Theo Jansen Mechanism, we obtain the following:
3) Main Gear + System Holder
Module (m): 2.914 mm
Number of Teeth (z): 9
Tool Used: “Spur Gear” generator add-in in Fusion 360
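As a quick sanity check using the standard spur-gear relation (pitch diameter = module × number of teeth, assuming unshifted gears): the 14-tooth gear has a pitch diameter of 2.914 × 14 ≈ 40.8 mm, the 9-tooth main gear has 2.914 × 9 ≈ 26.2 mm, so the pair meshes at a center distance of (40.8 + 26.2) / 2 ≈ 33.5 mm.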
4) Main Base
5) Enclosure
6) Top
7) Holders
8) Other Parts
9) Other Materials
M3 Bolts and Nuts with different lengths
(x4 Bolts + x8 Nuts)
6mm Metal Rod (x1)
606ZZ 6 x 17 x 6 mm Deep Groove Ball Bearing (x4)
Final Assembly
Machines Used
Software Used: Ultimaker Cura
Printer Model: Prusa i3 Mk3/Mk3s
Material Used: PLA filament
I sliced the model using Ultimaker Cura, then uploaded the G-code to a Prusa i3 Mk3/Mk3s for printing, using PLA filament as the material.
Software Used: RDWorks
Machine Model: El Malky ML149 CO₂ Laser Cutter
Material Used: 3 mm Plywood
I used RDWorks for design, the El Malky ML149 CO₂ Laser Cutter for cutting, and 3 mm plywood as the material—it’s sturdy, clean to cut, and perfect for detailed designs.
Laser Cutting Fabrication Process
For Laser Cutting Preparation
After exporting the part sketches in DXF format, I imported the files into RDWorks. I set the appropriate parameters for each line and adjusted the cutting settings to a speed of 30.0 mm/s and a power of 50%. Finally, I exported the design as a .ai file and sent it to the El Malky ML149 CO₂ Laser Cutter for fabrication.
Laser Cutting Process and Result
To implement the laser cutting part of the assignment, we first downloaded the design file to the laser machine and selected it from the interface. Before starting the cut, we made sure to attach the stabilizer and carefully adjust the distance between the laser head and the board—a small detail that makes a big difference in the final result.
Once everything was set up, we started the cutting process, and the machine followed the design precisely. After the piece was cut, I used brown spray paint to give it a more polished and finished look.
3D Printing Fabrication Process
For 3D Printing Preparation
I exported the model as an STL file, then used Ultimaker Cura to prepare it for printing with the following settings:
Layer Height: 0.2 mm
Infill: 100% (since the strength of the components was crucial)
Adhesion: Enabled
Support: Normal – Touching Buildplate
Estimated Weight: 55 g
Estimated Print Time: 6 hours and 16 minutes
I also adjusted the heat settings based on the lab instructor’s recommendations.
Once everything was ready, I exported the .gcode file and copied it to the printer’s SD card — and it was ready to go!
3D Printing Process
To start the implementation, we first exported the G-code for our design and copied it to the 3D printer’s SD card. After inserting the card into the printer, we selected the correct file from the menu.
Before hitting start, we made sure the printer bed was clean and clear of any debris. Once everything was set, we pressed Start, and the printer began working—building the design layer by layer based on the G-code instructions. In the end, we were able to see our digital model turned into a real, physical object.
It was a simple but satisfying process, especially seeing the final result take shape in front of us.
Components
Ultrasonic Sensor (HC-SR04)
Function: Measures distance to obstacles by sending ultrasonic pulses and calculating the echo time.
Role: Provides real-time feedback to avoid collisions.
Bluetooth Module (HC-05)
Function: Wireless communication interface that receives commands from a smartphone or computer.
Role: Allows the user to control or configure the robot remotely.
Arduino Uno
Function: Central controller that processes sensor input and executes the decision-making logic (program code).
Role:
Reads sensor data (ultrasonic distance, Bluetooth commands).
Makes decisions (move forward, stop, step back, turn, play sound, etc.).
Sends control signals to the action components (motors, buzzer, LEDs).
Integration: Acts as the bridge between inputs and actions — it is the “brain” of the smart system.
DC Motors
Function: Provide motion to the robot.
Role: Move forward, backward, or turn depending on control signals.
Driven by: Motor driver.
Motor Driver (L298N)
Function: Acts as a power interface between the Arduino and the DC motors.
Role: Amplifies low-power Arduino control signals into high-current signals that drive motors.
Buzzer
Function: Produces sound alerts or fun tones.
Role: Indicates events (like obstacle detected, idle sound, or startup greeting).
LEDs
Function: Visual indicators for system states (e.g., ON/OFF status, obstacle detected, Bluetooth connected).
Integration: Arduino turns LEDs on/off or blinks them depending on conditions.
Software Used
Function: Used to design and document the electronic circuit of the system.
Description: Fritzing is an open-source software that allows you to create circuit diagrams, breadboard views, and PCB layouts in a visual and beginner-friendly way. In this project, Fritzing was used to represent the wiring of Arduino with sensors, motors, and other components for clear documentation and prototyping.
Circuit
I started with the Arduino Uno and a 9V battery (later replaced by the lithium-ion pack described below). The battery powers both the Arduino and the motor driver: one connection goes directly to the motor driver for the motors, and another goes in parallel to the Arduino itself. (In Fritzing this parallel link isn’t shown, but in practice it’s just a simple split.)
The motor driver (L298N) controls the two DC motors. Since the Arduino cannot supply enough current on its own, the driver acts as an interface.
The enable pins (ENA and ENB) are connected to Arduino PWM pins, so I can control speed.
The input pins (IN1–IN4) are connected to Arduino digital pins, which control direction.
On a breadboard, I added four LEDs (green, yellow, red, and white). Each one has a 220Ω resistor to prevent damage. They were connected to pins A5, D4, D3, and D2 on the Arduino, with their grounds tied back to Arduino GND. These LEDs act as status indicators.
The buzzer is connected to pin D5 and driven with the Arduino tone() function, so I can program it to play different tones instead of just one flat beep, which makes the robot more interactive.
The Bluetooth module (HC-05) allows wireless communication.
VCC → 5V, GND → GND.
TX → Arduino RX, RX → Arduino TX.
This lets me send commands from my phone app to the robot.
The ultrasonic sensor (HC-SR04) is used for obstacle detection.
VCC → 5V, GND → GND.
Trig → Arduino D12, Echo → Arduino D13.
By sending and receiving sound pulses, the Arduino can calculate how far an object is.
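To keep the wiring straight in the code, here is a pin map mirroring the connections above. It is a sketch: the ENA/ENB and IN1–IN4 pin numbers are my own assumptions, since the write-up only states which type of pin they use.

```cpp
// Pin map for the wiring described above. The ENA/ENB and IN1-IN4 numbers
// are placeholders: the text only says "PWM pins" and "digital pins".
const int ENA = 10, ENB = 11;    // L298N enables (PWM, speed control), assumed
const int IN1 = 6,  IN2 = 7;     // L298N direction inputs, motor A (assumed)
const int IN3 = 8,  IN4 = 9;     // L298N direction inputs, motor B (assumed)
const int LED_GREEN = A5, LED_YELLOW = 4, LED_RED = 3, LED_WHITE = 2;
const int BUZZER = 5;            // buzzer on D5, driven with tone()
const int TRIG = 12, ECHO = 13;  // HC-SR04 trigger and echo
// The HC-05 uses the hardware serial pins (D0/D1), so no constants are needed.
```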
The power source for the project is a rechargeable lithium-ion battery pack (18650 cells with a Battery Management System). This pack was chosen because it provides a higher capacity and longer runtime compared to a regular 9V battery. It can safely power both the Arduino Uno and the motor driver through a parallel connection.
The integrated BMS ensures safety by protecting the cells from overcharging, over-discharging, and short circuits. This makes the system more reliable and reduces the risk of damage to components.
Mobile App Design & Programming
MIT App Inventor is a free, web-based development platform created by the Massachusetts Institute of Technology. It allows users to build fully functional Android applications using a block-based visual programming language. Instead of writing traditional code, developers drag and drop logical blocks to define the app’s behavior. This makes it especially useful for rapid prototyping, educational projects, and IoT/robotics applications where mobile interaction is required.
In this project, I used MIT App Inventor to create a custom Bluetooth control app for the robot. While there are many generic Bluetooth controller apps available, most of them only send a single signal per button press. This is inefficient and unsuitable for our robot because we need continuous control—for example, the robot should keep moving forward as long as the forward button is pressed, not just after one click.
MIT App Inventor allowed us to design an app where:
Continuous signals are sent to the Arduino via Bluetooth while a button is held down.
Multiple dedicated buttons can be customized for movement (forward, backward, left, right) and other actions (stop, LEDs, buzzer).
The interface is simple and intuitive, tailored specifically to our hardware setup.
By choosing MIT App Inventor, I ensured smooth, real-time control of the robot, something not possible with off-the-shelf Bluetooth terminal apps.
Added control buttons, customized their color, text, and position for easy use.
Integrated Bluetooth with a timer for continuous signals and a ListPicker to select devices.
Handles Bluetooth connection. When a device is selected from the ListPicker, the app attempts to connect. If successful, it sends the message “Connected”; otherwise, it sends “Failed”.
Initializes the timer settings (condition and interval) and defines the global variable CurrentCommand.
Controls the Forward button. While pressed, it continuously sends “F”. When released, it sends “S” to stop the motors until another command is given.
This block is used to keep sending the signal while the button is held down.
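On the Arduino side, this streaming protocol is simple to consume: reading the most recent byte is enough to track the button state. A minimal sketch, using the ‘F’ and ‘S’ letters described above:

```cpp
char currentCommand = 'S';   // start stopped, matching the app's release signal

void readBluetooth() {
  // The app repeats the held button's letter and sends 'S' on release,
  // so the latest byte always reflects the current button state.
  while (Serial.available() > 0) {
    currentCommand = Serial.read();
  }
}
```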
Arduino Programming
My source of inspiration is Wall-E. I love how Wall-E has such a cute and fun personality, even without speaking words. I try to give my robot a similar vibe—playful, expressive, and full of character—so it feels alive and enjoyable to interact with.
This part sets up all the “parts” of your robot. It tells the Arduino which pins control the motors, the buzzer, and the ultrasonic sensor. It also sets an initial command ('S') so the robot knows to stay still when it first powers on.
Think of this as your robot “waking up.” It sets all the pins as inputs or outputs, starts the Serial connection for Bluetooth, prints a starting message so you know it’s alive, and makes sure the motors are stopped at the very beginning (this guards against a problem I faced where the robot would take off at power-on, which could damage it).
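A minimal sketch of that wake-up routine, using the pin names from the circuit section. The baud rate is the HC-05 default, and soundPowerUp() is the startup tune described in the sound section below.

```cpp
void setup() {
  pinMode(ENA, OUTPUT); pinMode(ENB, OUTPUT);
  pinMode(IN1, OUTPUT); pinMode(IN2, OUTPUT);
  pinMode(IN3, OUTPUT); pinMode(IN4, OUTPUT);
  pinMode(LED_GREEN, OUTPUT); pinMode(LED_YELLOW, OUTPUT);
  pinMode(LED_RED, OUTPUT);   pinMode(LED_WHITE, OUTPUT);
  pinMode(BUZZER, OUTPUT);
  pinMode(TRIG, OUTPUT);
  pinMode(ECHO, INPUT);

  Serial.begin(9600);              // HC-05 default baud rate
  Serial.println("Robot ready!");  // the "I'm alive" message

  stopMotors();                    // guard against the take-off-at-power-on issue
  soundPowerUp();                  // startup tune
}
```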
On every cycle, the main loop:
Checks if a Bluetooth command has arrived.
Measures the distance to anything in front using the ultrasonic sensor.
If something is too close, it calls the obstacle handler.
Otherwise, it looks at the command and decides what to do—move forward, backward, turn, curve, or even show emotions like happy, sad, or angry.
If the command is unknown, it stops by default to be safe.
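Putting those steps together with the readBluetooth() helper above, here is a condensed sketch of the loop. The obstacle threshold and the command letters other than ‘F’ and ‘S’ are illustrative assumptions.

```cpp
const int OBSTACLE_CM = 20;        // "too close" threshold, assumed value

void loop() {
  readBluetooth();                 // 1) check for a Bluetooth command

  long d = getDistance();          // 2) measure the distance ahead
  if (d > 0 && d < OBSTACLE_CM) {  // 3) something is too close (0 = no echo)
    handleObstacle();
    return;
  }

  switch (currentCommand) {        // 4) act on the command
    case 'F': forward();   break;
    case 'B': backward();  break;
    case 'L': left();      break;
    case 'R': right();     break;
    case 'H': happy();     break;  // emotion commands (letters assumed)
    default:  stopMotors();        // 5) unknown or 'S': stop to be safe
  }
}
```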
The movement functions tell the motors exactly how to move:
stopMotors() – freeze in place
forward() / backward() – move straight ahead or in reverse
right() / left() – rotate in place
forwardLeft() / forwardRight() – move in a curved path
Each function sets the motor directions and speed to match the movement.
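A sketch of three of these helpers, assuming the L298N pin names from the circuit section. Which IN polarity means “forward” depends on how the motor leads are wired, so treat the HIGH/LOW pairs as placeholders.

```cpp
const int SPEED = 200;   // PWM duty (0-255), a placeholder value

void stopMotors() {
  analogWrite(ENA, 0);   // dropping the enable PWM freezes both motors
  analogWrite(ENB, 0);
}

void forward() {
  digitalWrite(IN1, HIGH); digitalWrite(IN2, LOW);   // motor A forward
  digitalWrite(IN3, HIGH); digitalWrite(IN4, LOW);   // motor B forward
  analogWrite(ENA, SPEED); analogWrite(ENB, SPEED);
}

void right() {
  digitalWrite(IN1, HIGH); digitalWrite(IN2, LOW);   // the two sides run in
  digitalWrite(IN3, LOW);  digitalWrite(IN4, HIGH);  // opposite directions
  analogWrite(ENA, SPEED); analogWrite(ENB, SPEED);  // to rotate in place
}
```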
getDistance() – measures how far an object is in front.
handleObstacle() – stops the robot, plays a scanning sound, backs up a little, and keeps checking until it’s safe to move again.
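A typical HC-SR04 reading routine matching the Trig/Echo pins above; the timeout is my own addition so that a missing echo cannot stall the loop.

```cpp
long getDistance() {
  // Fire a 10 us trigger pulse, then time the echo. Sound travels at roughly
  // 0.034 cm/us, and the pulse covers the distance twice (out and back).
  digitalWrite(TRIG, LOW);  delayMicroseconds(2);
  digitalWrite(TRIG, HIGH); delayMicroseconds(10);
  digitalWrite(TRIG, LOW);
  long duration = pulseIn(ECHO, HIGH, 30000);  // 30 ms timeout, 0 if no echo
  return duration * 0.034 / 2;                 // distance in cm (0 = no echo)
}
```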
The functions happy(), sad(), angry() – stop the motors, pause briefly, and then play the corresponding sound to express an emotion.
soundPowerUp() – startup tune to let you know the robot is ready.
soundObstacle() – scanning beep when something is too close.
happySound(), sadSound(), angrySound() – unique sound patterns to match each emotion.
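As an example of what one of these patterns can look like, here is a sketch of a rising “happy” chirp. The actual frequencies and timings were tuned by ear on the buzzer, so these values are placeholders.

```cpp
void happySound() {
  int notes[] = {523, 659, 784, 1047};   // C5, E5, G5, C6 (placeholder notes)
  for (int i = 0; i < 4; i++) {
    tone(BUZZER, notes[i], 120);         // play each note for 120 ms
    delay(150);                          // short gap keeps the notes distinct
  }
  noTone(BUZZER);
}
```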
The robot’s sounds were generated with AI help. The AI suggested tone sequences for emotions like happy, sad, and angry. I tested and tweaked them on the buzzer to avoid harsh tones and make them expressive.
Lesson learned: AI gives creative ideas fast, but human adjustment is needed to make them sound good on real hardware. Now the robot has a playful, personality-driven “voice.”
Step 1: Mounted and prepared the electronic components for testing
Step 2: Prepared the motor driver and the two motors and then tested them using a simple movement code
Step 3: Connected the Bluetooth module to test the response of the motors
Step 4: Mounted all the electronic parts to complete the full enclosure
Step 5: Fully assembled the robot after testing all the electronic components
At first, it was difficult to achieve smooth, continuous control of the robot using standard Bluetooth apps, since they only sent single signals per click. This was solved by designing a custom mobile app in MIT App Inventor, which allowed continuous commands while a button is pressed.
The main shaft connected to the motor faced high friction when in contact with the wooden frame, causing inefficient rotation. The issue was resolved by mounting bearings, which reduced friction and allowed smoother, more reliable movement.
App Layout Upgrade – Improve the mobile app interface with clearer button icons (arrows, stop sign, etc.) and maybe add vibration feedback when a button is pressed for a more polished user experience.
Battery Level Indicator – Add a simple voltage sensor to monitor the battery pack and use the app or LEDs to alert when the power is low.
Obstacle Distance Display – Instead of only reacting to obstacles, show the measured distance from the ultrasonic sensor on the app screen in real time.
Speed Control – Add buttons in the app to adjust the robot’s movement speed (slow, medium, fast) for better handling.