Final Report

Jackie Burd & Peter Feghali

Introduction

What does it mean to build something? To prototype? To develop a plan and construct something new? We did not aim to answer these philosophical questions while completing this project, and we did not. Instead, we focused on finding where our interests lay and developing a challenging project in that space. We built Shadow, a following robot.

Hardware

Bill of Materials

153B B.o.M

Microcontrollers

We had used the STM32 microcontroller in our previous lab for the class, as it is one of the more affordable boards on the market that still offers high functionality. For the most part it drove the L298N motor controller based on the speed data arriving over Bluetooth from our laptop.

The Arduino Uno microcontroller was the better choice for handling the camera in this project, since it is more user friendly and saved us a lot of time. It forwarded the Arducam images to a Python script running on our laptop.


STM32L476G Microcontroller

Arduino Uno Rev3 Microcontroller

Peripherals

Our robot needed a way to take in its surroundings visually in order to calculate an appropriate path to follow. The Arducam Mini module is designed to be paired with an Arduino, and we first tried sending the images over WiFi to be processed, but with no luck there we ended up resorting to a direct serial (UART) connection to the laptop.

We still needed a way to send the post-image-processing data from the laptop to the STM32 that controlled the motors and wheels. Since the WiFi chip was a bust, we sent the respective speeds of the left and right wheels through the HC-05 Bluetooth module, which connects to the STM32 over UART.
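As a rough illustration, the laptop-side send might look something like the sketch below, assuming the paired HC-05 shows up as a serial port; the port name, baud rate, and message format are placeholders rather than our exact setup:

# Illustrative laptop-side send over the HC-05's paired serial port (pyserial).
# The port name, baud rate, and message framing are placeholders.
import serial

ser = serial.Serial("COM5", 9600, timeout=1)  # HC-05's default data-mode baud is 9600

def send_speeds(left_pwm: int, right_pwm: int) -> None:
    # Send the two wheel duty cycles (0-100) as a comma-separated, newline-terminated pair.
    ser.write(f"{left_pwm},{right_pwm}\n".encode("ascii"))

send_speeds(60, 80)  # e.g., curve left by driving the right side faster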

In order to handle speed control and spin direction of the 4 DC motors underneath our robot chassis, we used a single L298N motor controller. To ensure our robot moved fast enough, the motor controller was supplied with both 12V and 5V from external sources.

Arducam Mini Module Camera

HC-05 Bluetooth Module

L298N Motor Driver Controller

Software

STM32 Software

Image Processing:

Since we did not expect the STM32 to be able to efficiently process our images, we decided to build a Python script to manage this. We used an Arduino to interface with the camera, as this would not interrupt the operation of the STM32. We then transferred the images from the Arduino to our laptop over a serial UART connection. By doing so, we received images at approximately 5-6 FPS and could process them in real time. We leveraged 2D fiducial markers for spatial positioning: we used AprilTags, which have known dimensions, so that from the scale and skew of a detected tag we could calculate our position relative to it. Then, using basic trigonometry, we could calculate how much we would need to turn to align ourselves with our target. Originally, we expected to receive images slowly, which would have necessitated some sort of proactive estimation algorithm with encoders for live speed control and feedback-based path planning, but given the relatively fast pace at which we could process images and the inaccuracy of the motors, we realized a more basic model would be beneficial. We processed the images on the laptop, generated the desired PWM outputs, and sent them to the STM32 board over Bluetooth using the HC-05.
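To give a flavor of the per-frame step, the sketch below shows roughly how a frame could be turned into a turn angle using an off-the-shelf AprilTag detector (pupil-apriltags) and basic trigonometry; the library choice, camera parameters, and tag size are illustrative stand-ins rather than a record of our exact script:

# Rough per-frame sketch: detect an AprilTag and compute the angle we would need
# to turn to face it. Camera parameters and tag size below are made-up values.
import math
import cv2
from pupil_apriltags import Detector  # one of several AprilTag detector libraries

detector = Detector(families="tag36h11")

FX, FY, CX, CY = 600.0, 600.0, 160.0, 120.0   # assumed pinhole camera intrinsics
TAG_SIZE = 0.10                                # assumed tag edge length in meters

def frame_to_turn_angle(frame_bgr):
    # Returns the angle (radians) to turn toward the tag, or None if no tag is seen.
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    tags = detector.detect(gray, estimate_tag_pose=True,
                           camera_params=(FX, FY, CX, CY), tag_size=TAG_SIZE)
    if not tags:
        return None
    t = tags[0].pose_t                 # tag position in the camera frame (x right, z forward)
    x, z = float(t[0]), float(t[2])
    return math.atan2(x, z)            # positive means the tag is to our right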

Motor Control:

Other than the basic reference voltages, the L298N motor controller requires 4 digital signals that set the wheel rotation direction, and 2 enable pins that take in PWM signals for setting the speed of each side. Of the 4 digital signals (IN1, IN2, IN3, and IN4), the first two control the direction of the left-side motors and wheels, while the latter two control the direction of the right-side motors and wheels. Since our robot has only 1 motor controller but 4 motors, both left-side motors and both right-side motors are tied to the same command signals coming from the L298N, with the help of a small breadboard on the robot's underbelly. To turn left, for example, we made the motors on the right side of the bot spin faster than the motors on the left. The speeds were all determined from the image taken by the Arducam, as described above. But since the enable pins expect a PWM signal rather than a constant digital level, a program was written for the STM32 that altered the capture/compare register value of an internal timer so that the desired duty cycle was produced.
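As a concrete illustration of the two ideas above, the sketch below shows one way a base speed and a turn command could be split into left/right duty cycles, and how a duty cycle maps onto a timer capture/compare value given its auto-reload (period) value; the numbers are made up, not our actual register settings:

# Illustrative only: splitting a base speed and a turn command into per-side duty
# cycles, and converting a duty cycle into the compare value the timer would load.

def split_speeds(base_duty, turn):
    # turn in [-1, 1]; negative slows the left side (turn left),
    # positive slows the right side (turn right).
    left = base_duty * (1.0 + min(turn, 0.0))
    right = base_duty * (1.0 - max(turn, 0.0))
    left = max(0.0, min(100.0, left))
    right = max(0.0, min(100.0, right))
    return left, right

def duty_to_ccr(duty_percent, arr=999):
    # Capture/compare value for a duty cycle, assuming the timer counts from 0 to arr.
    return int(round(duty_percent / 100.0 * (arr + 1)))

left, right = split_speeds(80.0, 0.3)              # e.g., a gentle right turn
print(left, right, duty_to_ccr(left), duty_to_ccr(right))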

Challenges

Wifi Module

We attempted to use a WiFi module to transfer data from the robot to our laptop for control and back. We interfaced with the module over UART, and we did manage to send basic commands to the device and create an access point. We then tried to open a TCP connection to the device to send commands and data, but found that it failed. We next tried changing the device's setup so that it connected to a WiFi hotspot, from which we could program the module to send data to our laptop over TCP. This also failed. We tried a few other debugging approaches, and after most of them failed miserably, we realized it made the most sense to stick with UART.

Chassis Size Constraint

Originally we had bought a small red chassis robot kit that comes with 2 DC motors and wheels, but as we added more and more elements to our robot we began to realize that we did not have enough room to fit everything. So we ordered a bigger chassis off Amazon that came with 4 DC motors and wheels, which (with some creativity) could fit the Arducam, Bluetooth module, Arduino, STM32, motor controller, and 12V battery pack, as well as additional wiring and mini breadboards. We actually ended up needing a way to raise the Arducam above the rest of our robot so it was taking pictures at a reasonable level, so the old red metal chassis came in handy as a way to mount our camera.

Differing Motor Speeds

After identifying the usable pins on the STM32 that could produce a PWM signal and programming their settings as necessary, we found that, once connected to the 4 DC motors through the motor controller, all 4 wheels never spun at exactly the same speed. Even when the same 100% duty cycle was sent to both sets of wheels, the same side was always slower than the other. During our initial tests of having the robot move without taking in images, it never went in a straight line. After spending too much time trying to understand why this was the case, we decided to program around the problem, assuming it to be something inherent to the different internal timers used for each enable signal. Thus we always made sure to take the slower side into account when determining how fast the robot should go and by what ratio the power should be split between left and right.
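To picture the workaround: if, say, the left side were measured to run at roughly 90% of the right side's speed at the same duty cycle, the faster side can be derated before the speeds are sent; the ratio below is invented for illustration, since we tuned ours by trial and error:

# Hypothetical compensation for mismatched sides: derate the faster side so that
# an equal "go straight" command actually drives straight. The 0.9 ratio is made up.
LEFT_TO_RIGHT_RATIO = 0.9   # left side observed to run ~90% as fast as the right

def compensate(left_duty, right_duty):
    return left_duty, right_duty * LEFT_TO_RIGHT_RATIO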

Post-Project Thoughts

Peter's Take

All in all, this was a successfully built project. While obviously not perfect, we managed to accomplish our goals on a reasonable timeline. A disproportionate amount of time was spent trying to interface the camera with the STM32, and that board's continued failure to interface with our camera was quite annoying. Nevertheless, by leveraging the Arduino and separate interfaces, we managed to build a solid project.

Jackie's Take

Clearly, this project did not go as planned. While it would have been nice for the bot to be entirely wireless and self-operating, I'm still extremely happy with how Shadow turned out. As a concept, the project sounds very feasible, and with enough time and resources I think Peter and I would have been able to follow through with our original vision. But the problems we ran into oftentimes had to do with our limitations. We only had so much practice with the STM32 microcontroller, and a lot of the code written for it was modeled very closely after the programs we wrote for lab. Without the lab handout guidelines, it soon became very time consuming and tiresome to learn how to program new things on the STM32. Another limitation was the tools we had. A specific example of this was when the DC motors came in and we realized they had small pads for soldering the positive and negative inputs rather than convenient wires. This meant I spent a couple of hours carefully using my own soldering iron, which I luckily brought from home, to attach the necessary wires. Just as I had finished soldering on all the necessary connections, I accidentally broke my soldering iron while trying to untangle its cord. I wasn't too sad given it was $5 at Fry's, but it still meant I couldn't go back if anything disconnected, so I was overly cautious with the electrical tape to hold everything in place.

These things, among many other struggles with this project, taught me not only how the topics we learned in class are applicable, but also so much about how to debug hardware and come up with creative solutions. It can get pretty exciting when an issue you've been working on finally gives you the desired output, one that maybe a couple of days ago was a total mystery. Despite Shadow being nowhere close to the same scale as hardware projects in actual industry, I see it as maybe one step closer to being able to take on such feats.