Orientation of the robot occurs at the beginning of the sequence. After being placed in a random orientation within the start box of the course, the robot must be able to sense its surroundings and align itself to the field. We decided to accomplish this by using distance sensors, which we would place on certain faces of the robot.
Of the two proposed orientation strategies, we settled on the second because it would likely require fewer rotations (and therefore less time) to orient the robot. Next, we had to decide whether to use VL53L0X Time of Flight (ToF) sensors or HC-SR04 ultrasonic sensors to measure the distance from the robot to the walls. The former, which measures the time for a light pulse to travel from the sensor to the wall and back, was more compact but potentially more expensive if we couldn't source it for free. The latter, which operates on the same principle with an ultrasonic pulse, was cheaper and had a wider range, but required more pins on the Arduino. After testing both sensors, we found that the readings from the Time of Flight sensor were more accurate, especially when rotating about corners.
Early testing of the orientation code used two Time of Flight sensors paired with an ultrasonic sensor until we were able to obtain a third ToF sensor. The sensors were taped onto a foamcore box that approximated the size of the final robot (we named it Frybox). This was an easy way to prototype the subsystem without having to wait for the drivetrain to be built first. Note that when using multiple ToF sensors, an I2C multiplexer, which allows multiple sensors with the same fixed I2C address to be hooked up to one microcontroller, was necessary. We used the TCA9548A. This ended up being quite beneficial for pin conservation because it only required the Arduino's SCL and SDA pins.
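For context, the TCA9548A is driven by writing a single byte to it over I2C: the byte is a bitmask in which bit N enables downstream channel N. A minimal sketch of that channel-select logic is below; the mask computation is plain C++, and the Wire calls (shown as comments) are the standard Arduino I2C API, assuming the multiplexer is at its default address 0x70.

```cpp
#include <cstdint>

// The TCA9548A's control register is a one-byte channel bitmask:
// writing (1 << N) routes the shared SDA/SCL bus to channel N only.
uint8_t tcaChannelMask(uint8_t channel) {
    return static_cast<uint8_t>(1u << channel);
}

// On the Arduino itself, selecting a channel would look roughly like:
//   Wire.beginTransmission(0x70);       // TCA9548A default address
//   Wire.write(tcaChannelMask(channel));
//   Wire.endTransmission();
// After this, any traffic to the VL53L0X's fixed address reaches only
// the sensor wired to the selected channel.
```

Selecting the channel before each read is what lets three sensors with identical fixed addresses share one bus.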
The code was quite simple. Upon initialization, the robot rotated about itself until the sensor readings fulfilled these conditions:
· The front sensor readings were roughly equal
· The side sensor read a small distance
· The front sensors read a large distance
The alignment during this process did not have to be 100% accurate because orientation was followed by backing up into the corner walls of the course, which guaranteed the robot was fully aligned for navigation. This allowed us to be generous with the quantitative thresholds, which were widened to account for sensor offsets, noise, the slow reading frequency, etc.
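The three-part check above can be sketched as a single predicate evaluated each loop while the robot rotates. The threshold values here are hypothetical stand-ins, not the tuned numbers from the actual course:

```cpp
#include <cstdint>
#include <cstdlib>

// Hypothetical thresholds -- the real values were tuned generously on
// the course to tolerate offsets, noise, and the slow reading rate.
const uint16_t FRONT_MATCH_MM = 40;   // max difference between the two front readings
const uint16_t SIDE_NEAR_MM   = 150;  // side sensor must be at most this far from a wall
const uint16_t FRONT_FAR_MM   = 600;  // both front sensors must see at least this much open space

// True once all three alignment conditions hold (distances in mm).
bool isAligned(uint16_t frontLeft, uint16_t frontRight, uint16_t side) {
    bool frontsMatch = std::abs(static_cast<int>(frontLeft) -
                                static_cast<int>(frontRight)) <= FRONT_MATCH_MM;
    bool sideNear    = side <= SIDE_NEAR_MM;
    bool frontsFar   = frontLeft >= FRONT_FAR_MM && frontRight >= FRONT_FAR_MM;
    return frontsMatch && sideNear && frontsFar;
}
```

In the main loop, the robot would keep rotating until `isAligned(...)` returns true, then stop and back into the corner.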
Code testing was a frustrating endeavor. I was often unable to initialize the sensors and the reading printouts were unreliable. After much hair-pulling, I asked Jingyi to upload the code from her computer and…it worked perfectly. Sometimes the problem is your computer.
One of the biggest hurdles to overcome with the orientation code was memory. After connecting three ToF sensors to the Arduino, the code used 59% of program storage space and global variables used 104% of dynamic memory, more dynamic memory than the board actually has, so the sketch could not run. We discovered that the Adafruit ToF library we were using was unnecessarily large. To fix this, we switched to the more space-efficient VL53L0X_mod library.
Lastly, we needed to figure out how to mount the sensors onto the sides of the robot.
The 3D-printed mounts were designed in two parts so that the sensor could snap in and out without needing to tighten and loosen screws. We had to create two different-sized enclosures because our third sensor was smaller than the others. The mounts required a couple of rounds of iteration, but we were eventually able to achieve a perfectly snug fit! The final mounts were printed, assembled, and attached to the underside of the robot's top plate.