This week I focused on assisting with the mechanical and electrical construction of the throwing launcher, as well as performing the initial testing of the ball detection mechanism. The main roadblock this week has been generating a workable sphere detection model.
Ball Detection
The initial idea tested for ball detection was to look for circles in the image produced by the depth camera. This way, printed pictures of a ball or odd lighting effects could not affect the detection; only genuinely round objects would be picked up as circles. I used a first-generation Xbox Kinect with ROS via the OpenNI package, which publishes the depth camera's image representation on the topic /camera/depth/image. This image was fed into the cv2 HoughCircles() function, which was supposed to find the circle of the ball within the image. The results are shown below, and they are not promising.
Moving forward from this point, I have been recommended using RANSAC to fit a hemisphere directly on the point cloud generated by the Kinect. My goal is to try out this method over the weekend, to see how viable it is.
Since the last lab report, I have made considerable progress designing the launching mechanism of our catch-playing robot. I began by using AutoCAD to model simple versions of our motors and spaced them to allow a 3.75” gap between the two spinning disks of our pitching-style launcher. The ball measures 4” in diameter, so I expected the ¼” compression of the ball to be enough to grip and launch it. From there I designed a structure made of ¼” laser-cut wood. Using a laser allows us high precision and minimizes fabrication time. The drawback of laser-cut wood is that the structure would be quite weak, but that was acceptable for a prototype. All pieces were dogboned to reduce time in fabrication.
There has been a litany of roadblocks in the last week.
Since last week I have made a lot of progress on the design of the robot’s electronics. Basic control code for the launch actuator has been made and tested with the servo and laser cut launch mechanism. The servo pushes the ball into the flywheels that will launch it. Currently the code runs the actuator automatically, with a 1-2 second delay between movements.
I found a PWM-based DC motor controller on Amazon for $8. Most of the DC motor controllers I found had a potentiometer for speed control and a switch for direction (H-bridge types), but we are only interested in being able to modulate the speed of the motors. This simple controller instead takes a PWM signal from an Arduino to control its output.
Week 3: Ball detection with HoughCircles(). Comparison of the depth image (upper left) with the ball highlighted in red, all detected circles (lower middle), and the highest-confidence circle (upper right). The ball is clearly not detected as a circle, nor as the highest-confidence circle.
This week I implemented RANSAC for locating a sphere within the point cloud produced by the Kinect. This runs faster than the Hough circle detection, which is a nice improvement, and does not detect false positives in the background, but it still does not reliably detect the ball. As seen in the point cloud image below, the number of points on the ball itself is very small; fewer than 50 to 100 points land on the ball. My guess is that this sparse coverage is what keeps RANSAC from fitting the sphere.
I will attempt to improve the detection by fiddling around with the settings for RANSAC, as well as gathering clean data from the test play environment in the lab room. With these in hand, RANSAC should hopefully find the ball.
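For reference, the core of sphere-fitting RANSAC is simple enough to sketch in plain Python. This is a minimal stand-in for whatever library implementation ends up in the node; the tolerances and iteration count below are placeholder values. One useful trick it includes: since we know the ball's radius, candidate fits with the wrong radius can be rejected outright.

```python
import random

def sphere_from_points(p1, p2, p3, p4):
    # Solve for the sphere center (a, b, c) through four points.
    # Subtracting the sphere equation for p1 from the others gives a
    # 3x3 linear system, solved here with Cramer's rule.
    rows, rhs = [], []
    x1, y1, z1 = p1
    for (x, y, z) in (p2, p3, p4):
        rows.append((2 * (x - x1), 2 * (y - y1), 2 * (z - z1)))
        rhs.append(x*x + y*y + z*z - (x1*x1 + y1*y1 + z1*z1))
    def det3(m):
        return (m[0][0] * (m[1][1]*m[2][2] - m[1][2]*m[2][1])
              - m[0][1] * (m[1][0]*m[2][2] - m[1][2]*m[2][0])
              + m[0][2] * (m[1][0]*m[2][1] - m[1][1]*m[2][0]))
    d = det3(rows)
    if abs(d) < 1e-9:
        return None  # degenerate (near-coplanar) sample
    center = []
    for col in range(3):
        m = [list(r) for r in rows]
        for i in range(3):
            m[i][col] = rhs[i]
        center.append(det3(m) / d)
    a, b, c = center
    r = ((x1 - a)**2 + (y1 - b)**2 + (z1 - c)**2) ** 0.5
    return (a, b, c), r

def ransac_sphere(points, target_radius, radius_tol=0.02,
                  inlier_tol=0.01, iters=500, seed=0):
    # Fit spheres to random 4-point samples; keep the fit with the
    # most inliers whose radius actually matches the ball.
    rng = random.Random(seed)
    best, best_inliers = None, 0
    for _ in range(iters):
        fit = sphere_from_points(*rng.sample(points, 4))
        if fit is None:
            continue
        (a, b, c), r = fit
        if abs(r - target_radius) > radius_tol:
            continue
        inliers = sum(
            1 for (x, y, z) in points
            if abs(((x-a)**2 + (y-b)**2 + (z-c)**2) ** 0.5 - r) < inlier_tol
        )
        if inliers > best_inliers:
            best_inliers, best = inliers, ((a, b, c), r)
    return best, best_inliers
```

The radius filter is what should let this survive a cluttered background: walls and bodies can still generate 4-point spheres, but rarely ones with the ball's radius.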
The next step is to attempt general automatic point cloud segmentation, which should divide the person and the ball into two distinct segments, and then take the (x, y, z) location of the closest segment as the ball location, since nothing else should be between the robot and its opponent.
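The closest-segment heuristic itself is tiny; a sketch, assuming the segmentation step hands back each segment as a list of (x, y, z) points measured from the sensor origin:

```python
def closest_segment_centroid(segments):
    # segments: list of point lists from the segmentation step.
    # Returns the centroid of the segment nearest the robot (the
    # sensor origin), which we take as the ball's (x, y, z) location.
    def centroid(pts):
        n = len(pts)
        return tuple(sum(p[i] for p in pts) / n for i in range(3))
    def dist(p):
        return (p[0]**2 + p[1]**2 + p[2]**2) ** 0.5
    return min((centroid(s) for s in segments), key=dist)
```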
If this does not work, then I will look at only detecting the opponent player (which is easier, via Kinect skeleton tracking or AR tags) and use this as a rough guide for estimating where the ball will land.
I started this week by redesigning our prototype to allow a variable distance between our flywheels. I mounted the rubber flywheels and correct motors to our wooden prototype, and with the help of my teammates we got general control of the speed of the wheels. We were able to launch our beach ball approximately 20–25 ft on a flat trajectory. This effectively served as our proof of concept that our launcher could accomplish our task. (see video below)
We then strapped our two prototypes together and added a proximity sensor to demonstrate how our hopper and loading mechanism will work together.
Finally, I spent some time today redesigning our launcher, incorporating everything we've learned from the prototype (image below).
This week I have assembled an Arduino to control the robot and launch the ball. The positions of two potentiometers are read as analog voltages and mapped to PWM duty cycles by the Arduino. The PWM signals are output to two DC motor PWM controllers to control the speed of the motors. Interestingly, the default PWM frequency of the Arduino caused a loud whine (drawing a few complaints from nearby offices). To fix this, I changed the Arduino PWM frequency by modifying its timer control registers. Moving to a higher frequency fixed the whine but reduced the range over which the potentiometers could control the motors. To compensate, I shifted the PWM mapping so that 0–1023 on the potentiometers maps to 150–255 for the PWM duty cycle. The potentiometer inputs can be seen in a graph below.
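The remapping is just a linear rescale; here is a Python sketch of the arithmetic (on the Arduino, the equivalent is a call to map(pot, 0, 1023, 150, 255)). The 150 floor reflects the reduced usable range at the higher PWM frequency:

```python
def pot_to_pwm(pot, in_max=1023, out_min=150, out_max=255):
    # Linearly map a 10-bit potentiometer reading onto the usable
    # PWM duty-cycle range. At the higher PWM frequency, duty cycles
    # below ~150 no longer drive the motors usefully, so the bottom
    # of the output range is shifted up.
    pot = max(0, min(in_max, pot))
    return out_min + pot * (out_max - out_min) // in_max
```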
The ball as seen by the Kinect, visualized in rviz. It is not recognized by the current RANSAC implementation, which is a roadblock for our fully autonomous catching system.
This week I implemented a quick and dirty human-tracking ROS node using AR tags and the ar_track_alvar ROS package. This lets us track the position of the playing human and aim the ball at them. The implementation can be improved: because it uses straightforward positional aiming, it overshoots the person's position and settles into oscillation, endlessly trying to center on them. To improve this I will scale down the magnitude of the velocity command as the error shrinks, and add a dead zone of acceptable "close enough" angle between the human and the robot.
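The planned fix amounts to a proportional controller with a dead zone; a sketch in Python, with made-up gains and limits:

```python
def aim_velocity(angle_error, k_p=0.8, dead_zone=0.05, max_rate=0.6):
    # angle_error: signed bearing to the AR tag, in radians.
    # Inside the dead zone we stop turning entirely, which breaks the
    # endless oscillation around the target; outside it, the command
    # shrinks proportionally with the error instead of overshooting.
    if abs(angle_error) < dead_zone:
        return 0.0
    cmd = k_p * angle_error
    return max(-max_rate, min(max_rate, cmd))
```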
Further work will also touch on tracking the ball mid-air, although this is on pause since previous attempts failed to apply any sophisticated algorithm to the problem. Binary segmentation to detect the ball will be tried next.
Lastly, I will continue to develop a simple POMDP model of how quickly, and how hard, the opponent throws the ball back after a robot throw during the competitive stage. By estimating these two aspects of the opponent's throws (timing and force), the robot will judge when and how far back it should move in response. This model will need to be small and rapidly built, as there are only a handful of throws with which to gauge an opponent before a new opponent appears and a new model must be generated.
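As a starting point, even a running-average estimator captures the two quantities of interest. This sketch falls far short of a POMDP, and the retreat gain is invented, but it shows the shape of the model:

```python
class OpponentModel:
    # A deliberately tiny stand-in for the planned POMDP: keep running
    # estimates of the opponent's reaction delay and throw force from
    # the handful of observed throws, and map them to a response.
    def __init__(self):
        self.delays = []   # seconds between our throw and theirs
        self.forces = []   # rough throw-force proxy (e.g. ball speed)

    def observe(self, delay, force):
        self.delays.append(delay)
        self.forces.append(force)

    def expected_delay(self):
        # When to expect the return throw.
        return sum(self.delays) / len(self.delays) if self.delays else None

    def retreat_distance(self, k=0.5):
        # Harder average throws -> back up further (k is a made-up gain).
        if not self.forces:
            return 0.0
        return k * sum(self.forces) / len(self.forces)
```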
This week I reviewed all the mechanical design lessons from our prototype, noting what did and didn't work. We kept the laser-cut housing for the chamber but redesigned the loading mechanism, motor mounts, flywheel hubs, and framework.
I cut the wood, machined the frame, and assembled the launcher early this week. We found that a more massive, sturdier base resolved much of the vibration problem we had been dealing with early on.
I also built a frame for the netting out of Apollo PEX tubing, which is more flexible than PVC and can easily be bent and connected using cheap fittings. We joined the launcher and the frame with bird netting, chosen for its light weight. With the system all together, we were able to play our first successful game of catch (see video below):
This week was pretty tame on the electrical side of things. Early on I ordered some IR sensors that will be used as encoders for the flywheels, though I haven't had the time to use them yet since they just arrived.
The main work I did was laying out the wiring in the new version of the launcher. My goal is for the wiring to be simple but easy to modify, since we are still in active development. I could have cleaned up the breadboard wiring, but given that it works fine and will be dramatically simplified once ROS is in control, a full re-wire was not worth the time. The main change was to the motor wiring: it now has a much more solid connection than before and is taped out of the way so it is less likely to be damaged while moving.
I had to re-write some of the servo control logic since the servo direction is reversed on the new assembly and the range of motion is different with the new arm.
This week I have played with the tolerances on the positional commands sent to the Pioneer base. They are not set in stone yet, but once I have free time next week I will sit down and figure out good values for the outer, middle, and inner tolerance levels. I am also considering adding a rotate-in-place command to execute when the robot cannot find the AR tag on the person.
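The tolerance logic reduces to picking a behaviour per band; the band values and action names below are placeholders until the real tuning happens:

```python
def follow_action(error_m, outer=1.5, middle=0.75, inner=0.25):
    # error_m: positional error (meters) between the base and its goal.
    # Band thresholds are made-up values pending tuning; the tiered
    # bands let the base slow progressively instead of hunting around
    # a single tight tolerance.
    if error_m > outer:
        return "drive_fast"
    if error_m > middle:
        return "drive_slow"
    if error_m > inner:
        return "creep"
    return "hold"
```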
Upcoming work will focus on the ROS–Arduino bridge with Brian and Devon, to get the entire software stack working together. After that, we will put the launcher on top of the Pioneer and should be good to go.
This week I mounted and tested infrared reflective sensors on our launcher. I started by writing code to output what values the analog sensor reads when facing the matte black of the wheel versus a bright piece of blue duct tape. The difference was considerable (~900 vs. ~50), so I set a threshold of 500 to determine whether the sensor was seeing tape. I used that in a small logic loop that timed how long the wheel took to rotate once and calculated RPM. The video below shows a demonstration of the Arduino serial monitor reading out one wheel's rotational velocity.
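The tachometer logic is easy to prototype off-line; a Python sketch of the edge-detect-and-time idea (the Arduino version is the same loop in C, run on live readings):

```python
THRESHOLD = 500  # analog reading: ~50 over the tape, ~900 over matte wheel

def rpm_from_samples(samples):
    # samples: list of (time_s, analog_value) pairs. Record each moment
    # the sensor first sees the tape (value drops below THRESHOLD), and
    # convert the time between consecutive sightings -- one revolution,
    # since there is a single tape marker -- into RPM.
    tape_times = []
    was_on_tape = False
    for t, v in samples:
        on_tape = v < THRESHOLD
        if on_tape and not was_on_tape:
            tape_times.append(t)
        was_on_tape = on_tape
    if len(tape_times) < 2:
        return None  # haven't seen a full revolution yet
    period = tape_times[-1] - tape_times[-2]
    return 60.0 / period
```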
The next steps are to add a second Arduino (one to manage commands from ROS, one to act solely as the tachometer) and to implement a P or PI velocity controller that manages wheel speed commanded from a higher-level controller. From here we can start to develop throwing ranges and modes (i.e., cooperative or adversarial).
Out of town this week at the GFR competition. Good luck!
Video demonstrating the measured RPM based on the light sensor and a single reference point (blue tape) attached to the wheel.
Nothing this week, as I was busy with two papers submitted to conferences.
I started this week with the last bit of machining for our project. I turned 4 motor hubs that clamp the rubber disks rigidly in place while spinning. Due to a slight error in my drawings, I drilled the screw holes too large, and there were no features locating the motor hub to the wheel hub. This caused our wheels to spin eccentrically with a ton of vibration. I found a feature on the wheel hub to locate off of and re-machined the wheel hubs. I then assembled each wheel, clamped it to a dowel rod, chucked the whole assembly into a lathe, and machined it round. This, along with some neoprene damping pads, has effectively eliminated vibration in our system.
Also this week I've taken a lot of steps toward improving the motor controller. I integrated the tachometer I made last week into a negative feedback loop that controls the left and right flywheels independently. We ran into a handful of challenges, such as noisy sensor readings destabilizing the controller. After a few hours of troubleshooting, we finally got a PI control system regulating the speed of the motors. Our current code is now poised to accept speed commands from ROS.
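For the record, the controller and the input smoothing are each only a few lines. A Python sketch of the behaviour with illustrative gains (the real gains were tuned by hand on the hardware):

```python
class PIController:
    # PI speed controller, sketched in Python (the firmware version is
    # Arduino C). Output is clamped to the 8-bit PWM range.
    def __init__(self, kp=0.2, ki=0.05, out_min=0, out_max=255):
        self.kp, self.ki = kp, ki
        self.out_min, self.out_max = out_min, out_max
        self.integral = 0.0

    def update(self, setpoint_rpm, measured_rpm, dt):
        error = setpoint_rpm - measured_rpm
        self.integral += error * dt
        out = self.kp * error + self.ki * self.integral
        return max(self.out_min, min(self.out_max, out))

def low_pass(prev, raw, alpha=0.3):
    # Smooth the noisy tachometer readings before they reach the
    # controller -- unfiltered spikes were destabilizing it.
    return prev + alpha * (raw - prev)
```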
Next, I made a final push to finish the mechanical design of our robot. Using wood I found in the lab, I made a platform for the launcher to sit on. I also used PVC to raise the platform about 2 inches above the Pioneer, allowing our laptop to slide in between. I designed and printed two angle brackets that fasten our launcher to the platform. Our robot is now mobile and ready to start testing cooperative and adversarial play modes.
This week I have been assisting Brian with tuning the motor speed controllers and separating control to two separate Arduinos. One Arduino is responsible for looking for a caught ball and running the launcher servo. The second Arduino is responsible for motor speed control. We decided using separate Arduinos would be better since it simplifies the code and wiring. Before there were a lot of unrelated functions taking CPU cycles. For example, running the servo would effectively stop motor speed control and we would lose some speed in the motors prior to launching. Having two also makes code changes easier. The two are connected to each other over a single wire acting as a digital bridge. This bridge is used to tell the launch control Arduino when the motors are at the desired speed. We do not want to launch before the motors are spun up.
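The logic carried by the bridge wire is deliberately simple; a sketch, with a made-up RPM tolerance:

```python
def flywheels_ready(measured_rpms, target_rpms, tol=50):
    # Runs on the speed-control Arduino: raise the bridge wire once
    # both wheels are within tolerance of their commanded speed.
    # The 50 RPM tolerance is an assumed value.
    return all(abs(m - t) <= tol for m, t in zip(measured_rpms, target_rpms))

def may_launch(ball_caught, bridge_high):
    # Runs on the launch Arduino: fire the servo only when a ball is
    # waiting AND the bridge wire reports the motors are spun up.
    return ball_caught and bridge_high
```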
The next step is to integrate the speed-control Arduino with ROS so that ROS can command it. We should be able to manage with just one serial bridge from ROS, since we don't need very sophisticated timing control for the launcher. Compared to the previous version of the launcher, we have removed the speed-control potentiometers and implemented serial control of the speed: entering a desired RPM into the serial monitor tells the Arduino to change the motor speed. ROS will be able to do the same for two independent motors.
Wiring layout with two Arduinos.
I focused on getting the ROS–Arduino connection running. As of writing, two-way communication is set up between the motor controller Arduino and the laptop. However, because (I believe) the PWM control functions take a variable amount of time, the serial communication breaks down rapidly, and generally only one or two messages get sent reliably. To address this, I will look at either regularizing the time the motor control loop takes, or having the servo controller handle communication with the computer and relay the desired speeds over I2C to the motor controller board.
The numbers being communicated are quite simple, so I hope we can run them over I2C and not interfere with the motor controller board.
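The first option, regularizing the loop time, would look something like this absolute-deadline pattern (sketched in Python; the Arduino version would do the same with millis()):

```python
import time

def run_fixed_rate(step, period_s, n_iters):
    # Run step() at a fixed period by sleeping until an absolute
    # deadline, so a slow PWM update in one iteration doesn't delay
    # the serial handling in the next: the schedule never drifts,
    # because deadlines advance by period_s regardless of step() cost.
    deadline = time.monotonic()
    for _ in range(n_iters):
        step()
        deadline += period_s
        remaining = deadline - time.monotonic()
        if remaining > 0:
            time.sleep(remaining)
```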
This week I hand-tuned the PI controller, so it now approaches desired speeds quickly and is game-ready faster. I also did some distance testing by increasing the wheel speed in 100 RPM increments and found the speed-to-distance function to be almost linear (see chart below). We will build the trajectory planner on this assumption. I also wired the motors directly into the Pioneer motor power board so we can operate without being tethered to the wall.
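Fitting that line is a small least-squares problem. The RPM/distance pairs in the test are made-up stand-ins, not the measured data:

```python
def fit_line(xs, ys):
    # Ordinary least squares for distance ~= m * rpm + b, so the
    # trajectory planner can invert it: rpm = (distance - b) / m.
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    m = num / den
    return m, my - m * mx
```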
I have also been putting effort into designing user-facing indicators. I intend to have a drag-race-style countdown using the recently installed LED.
At this point, all hardware is complete and the group can focus on software.
My focus this week was on improving the wiring of the robot and making some refinements to the control wiring and code. Pin D5 on the motor control Arduino died so I moved the controllers over to pins D9 and D10. They only have half the PWM frequency but this is sufficient to eliminate motor whine. I also moved the "flywheel ready" wires over to pin D13 on each Arduino since they are connected to built-in LEDs on the Arduinos. This provides some additional feedback for troubleshooting.
Another change was the addition of RGB LED control on the launcher Arduino. This can provide the user with basic interaction during the launch sequence and is particularly useful in friendly rounds of competition. With some changes to the code or even adding ROS control we can further develop this human interaction in the future.
The last change is the addition of the ROS serial bridge to the motor-speed Arduino. This will allow us to send the Arduino independent motor speeds from ROS rather than through the serial monitor.
Below I have included an image of the most up-to-date wiring diagram.
Current system wiring.
Week 9 was the first of two competition weeks. Our robot won the seeding competition. We identified several improvements to make before the final competition.
The final competition is this week. Videos to come!