Bottom: Passive current sense circuitry
The Raspberry Pi shield used Hall-effect encoders to detect the speed and direction of both wheels without requiring a mechanical connection to the drivetrain. However, we found that the encoders were very sensitive to placement, and it was difficult to get the readings out of the AVR motor controller. Now that we have switched to a BeagleBone Black, we are not using the encoders unless open-loop control proves insufficient. The two slopes are the only areas of the course where the motors may need closed-loop control, so we will use the current delivered to the motors as an indicator of whether we are on a ramp. The motor driver has a low-side current sense resistor, but the bandwidth and swing of its signal are outside the spec of the BeagleBone's ADC. Currently, I am using an RC lowpass filter with a cutoff frequency of 160 Hz to both limit the bandwidth and time-average the signal, removing the effect of the PWM. To protect the ADC, I use two 1N4002 diodes to clamp the output at about 1 V, with resistors on both sides of the clamp so that the BeagleBone can safely use the pin as an output on startup. I am also looking into an active amplifier solution.
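As a sanity check on the passive filter, a first-order RC low-pass has cutoff f_c = 1/(2πRC). A minimal sketch (the 10 kΩ / 100 nF values are illustrative placeholders, not the parts on the actual board) shows one combination that lands near the 160 Hz target:

```c
/* First-order RC low-pass cutoff: f_c = 1 / (2*pi*R*C).
 * R = 10 kOhm, C = 100 nF are placeholder values chosen to land
 * near the 160 Hz target, not the parts on the actual board. */
#define RC_PI 3.14159265358979323846

static double rc_cutoff_hz(double r_ohms, double c_farads) {
    return 1.0 / (2.0 * RC_PI * r_ohms * c_farads);
}
/* rc_cutoff_hz(10e3, 100e-9) is roughly 159 Hz, which is low enough
 * to average a multi-kHz motor PWM down to its mean current. */
```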
Power supply spikes: Servos can create some nasty spikes on the supply rail. The steering servo was originally on the 5V rail shared with the MCUs, which caused the MCUs to reset periodically. Moving the servo to a separate 6V line solved the resets, but smaller spikes still appeared on the 5V rail. We faced the reset issue again when working with the drive motor: this time, the Amazon NiMH batteries limited the rate of change of current enough that the supply spiked when the motor was switched on, resetting some MCUs. This is probably a consequence of those batteries being manufactured for a higher cell voltage, to serve as better drop-in replacements for alkaline cells. Switching back to the stock batteries and increasing the PWM frequency solved the issue.
Some of the problems we encountered and solved:
19th Annual Mobot Race (2013)
In Spring 2013, we competed as a team of 6 undergraduate students from the Carnegie Institute of Technology (5 ECE, 1 MechE). The robot took 1st place in the Undergraduate category, passing through 12 gates in 1:48 on the first try. Wind and rain became an increasing problem over the next two official trials, but the robot successfully completed the race on an unofficial run five minutes after our last trial! It just goes to show the complexity of operating in an outdoor environment.
The School of Computer Science at CMU holds an annual competition in which robots must follow a painted 255' white line on the concrete sidewalk outside Wean Hall. Robots must navigate several curves, two steep inclines, and branching gates at the end of the course, racing for the fastest time. Holding the competition outdoors poses several issues; most notably, traditional line-sensing components that operate in the infrared spectrum can be blinded by sunlight.
I compete along with a team of 4-6 sophomores annually. We won first place in both the 19th and 20th annual races.
20th Annual Mobot Race (2014)
In Spring 2014, we competed again as a team of 4 undergraduate students (2 ECE, 1 CS, 1 MechE). The robot took first place again in the Undergraduate category, passing through Gate 9 in 1:46. We originally planned to run our new robot, but since it was still under development, we raced the old one.
21st Annual Mobot Race (2015)
We will be competing again this Spring, hopefully with our new RC-car-based Mobot. Two of us are also running a Mobot tutorial through the Carnegie Mellon Robotics Club to spur additional interest in the competition.
Our first Mobot was intended as a prototype for a future robot, so we used off-the-shelf boards. The robot uses an Arduino Mega as its "brain", tracks the line with a CMUcam4, and drives two gearmotors through a PID controller board. The CMUcam4 has an onboard processor (a Parallax Propeller) preprogrammed to do some of the image processing, including noise filtering: it reports the center of the line for each row in the image, and the Arduino uses those centers to approximate a trajectory and compute a speed for each motor. The speed values are passed to the PID controller board, which was tuned at the factory for our specific gearmotors. Power comes from a 12V battery on the underside of the robot, split into 5V and 12V rails for logic and motors, respectively.
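The trajectory step can be sketched roughly as proportional differential steering: the line's offset from the image center becomes a correction that speeds up one wheel and slows the other. This is a minimal illustration, not our actual firmware; the function names, gains, and the 0-255 speed range are all assumptions.

```c
/* Hypothetical sketch of turning a line offset into wheel speeds.
 * line_offset_px: averaged line-centroid offset from image center,
 * negative = line is to the left. BASE_SPEED, KP, and the 0-255
 * range are illustrative values, not the tuned constants we used. */
#define BASE_SPEED 180   /* nominal forward speed, 0-255 scale */
#define KP 2             /* proportional steering gain */

static int clamp_u8(int v) { return v < 0 ? 0 : (v > 255 ? 255 : v); }

static void line_to_speeds(int line_offset_px, int *left, int *right) {
    int correction = KP * line_offset_px;
    /* line to the right -> speed up the left wheel to turn toward it */
    *left  = clamp_u8(BASE_SPEED + correction);
    *right = clamp_u8(BASE_SPEED - correction);
}
```

A pure proportional term like this oscillates on sharp curves, which is one reason the downstream PID board's factory tuning mattered.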
In addition to developing the algorithm that turns a snapshot of the line into speed values, we faced problems with wheel slippage on the course's two steep slopes, and with image noise. The robot would often turn to track the line on a slope, causing one wheel to lose traction and the robot to slip and fall. We spent a few weeks testing various treads, from rubber bands to sandpaper and finally rubber tread tape, to maximize traction on the ramp while still driving at a reasonably fast speed.
During the spring before our first competition, we found that the sun often gave rocks reflections very close to the line's signature, and the line itself had degraded from the salt spread over the winter. A neat solution was to cover the camera with a plastic bag, which not only acted as a constant-time blurring filter but also protected the camera from rain. I had brought samples of metallic plastic specifically designed to reflect sunlight, but the plastic bag I carried them in turned out to be the best filter.
Mobot v1.0 was a very reliable robot even though it was intended as a prototype. However, we wanted more control over the image processing and motor control, and Mobot v1.0 was already running close to its maximum speed for much of the track. So we began work on a new robot, based on an RC-car platform with a Raspberry Pi "brain". The drive electronics sat on a shield for the Raspberry Pi and included three AVR microcontrollers handling the encoders and motors, the servo, and auxiliary sensors, respectively. Although one AVR could have handled all of these tasks, using three small AVRs was an experiment in distributed system design and a test of whether this topology would help debugging. By the end of my sophomore year we had a driving robot that could be controlled from a computer; however, we faced numerous power- and encoder-related issues.
In my junior year, we decided to switch platforms to the BeagleBone Black because it has more I/O, including microcontroller-style peripherals, specifically PWM and an ADC. The cost of switching was also very small because we already had almost all the components. Currently, I am working on building the drive and current-sensing circuitry.
Scope capture: blue = PWM to servo, green = 5V rail to MCUs, yellow = 6V rail to servo. Both rails are fed from the 7.2V battery.
Communication in a distributed system: The I2C protocol was used between the drive electronics and the Raspberry Pi. This made it easy to isolate the two power supplies, since only one chip was needed for isolation. However, I2C is not ideal when one slave is addressed far more often than the others. The Raspberry Pi most often sent commands to the steering controller, an ATtiny24. That MCU also performed auxiliary debugging functions, so an 8-bit turn value would have needed an additional command byte to specify whether the MCU should adjust the turn value or perform another function. The ATtiny24's barebones USI module handles the I2C bit shifting but leaves address checking to software, which allowed us to give the MCU two addresses: one for controlling the steering servo, and the other for the auxiliary functions.
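The two-address trick boils down to a small software check in the USI interrupt path: the hardware shifts in the address byte but does not compare it, so the firmware can claim more than one address and route the rest of the transaction accordingly. A minimal sketch of that check, with hypothetical addresses (0x20/0x21 are placeholders, not our real register map):

```c
#include <stdint.h>

/* Software address matching, as the ATtiny24's USI allows: the USI
 * shifts in the first byte on the bus but leaves the compare to us,
 * so one MCU can answer at two addresses. Addresses are placeholders. */
#define ADDR_STEER 0x20  /* hypothetical: steering servo commands */
#define ADDR_AUX   0x21  /* hypothetical: auxiliary/debug functions */

typedef enum { ROLE_NONE, ROLE_STEER, ROLE_AUX } i2c_role_t;

/* addr_byte is the raw first byte: 7-bit address plus the R/W bit */
static i2c_role_t match_address(uint8_t addr_byte) {
    uint8_t addr = addr_byte >> 1;   /* drop the R/W bit */
    if (addr == ADDR_STEER) return ROLE_STEER;
    if (addr == ADDR_AUX)   return ROLE_AUX;
    return ROLE_NONE;                /* not ours: do not ACK */
}
```

With the role decided at address time, a bare 8-bit turn value can go straight to the steering address with no command byte, while the auxiliary address keeps its own command set.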
The steering servo was a significant load on both its power and signal inputs, causing the power-rail spikes described above and loading down the PWM input signal. I used multiple filtering capacitors on the battery voltage feeding the servo, and an LDO with 0.5V dropout to supply the BeagleBone Black. To prevent signal loading from damaging the BeagleBone Black, I used a two-inverter buffer, shown in the schematic below.