Project Overview

Primary Objectives:
1) Develop a mobile platform that a parrot can use for self-transportation.
2) Develop a platform that can autonomously park.

Secondary Objectives:
1)  Prevent the parrot from "escaping" from the platform.
2)  Provide feedback to the parrot in order to improve its driving skills.

    Returning from sea can often be a precarious evolution for ships. Shoal water, navigational hazards, and outgoing traffic can cause a headache for any sailor. To assist the safe passage of ships in inland waters, range lights were created to guide ships safely through channels. The navigational aid consists of two lights facing in the direction of a maritime channel, with the forward light mounted lower than the rear. Range lights indicate a ship’s position within a channel: when a ship is positioned in the center of the channel, the range lights appear vertically aligned, and if the ship is left or right of center, the rear light appears left or right of the front light. While mostly used for nautical applications, range lights can guide ground vehicles as well.

    Computer vision is processor-intensive and demands a great amount of computing resources. A faster computer would shorten image processing time, but fast computers also draw large amounts of power. On a small robot, where power consumption is important, a fast computer is usually not an option, so a compromise between processor speed and energy consumption must be made. The BeagleBoard-xM, with its ARM Cortex-A8 processor, can handle image processing while only consuming around two amps. To detect the docking station, the robot will require a camera. The Creative Labs Creative Live! Chat HD web camera was chosen because of its image quality and low price; it interfaces with the BeagleBoard over USB.

    For simplicity, the robot will use two wheels for propulsion and two casters for platform support. The web camera will be mounted to the robot through several servos, allowing several degrees of movement for searching. Inspired by navigational range lights, the docking station will consist of two spherical lights painted neon green and neon orange: the green light will sit in front at a lower height, while the orange light will sit directly behind it, in line but positioned higher.
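The searching motion of the servo-mounted camera could take many forms; one simple possibility is a boustrophedon (back-and-forth) sweep over the mount's pan and tilt range. The sketch below is purely illustrative: the angle limits, step size, and the absence of any real servo interface are all assumptions, since the design above does not specify them.

```python
def sweep_angles(pan_range=(-90, 90), tilt_range=(-20, 40), step=15):
    """Yield (pan, tilt) angle pairs covering the camera's field of regard.

    Ranges and step size are illustrative assumptions; real limits depend
    on the servos and linkage used to mount the web camera.
    """
    pan_lo, pan_hi = pan_range
    tilt_lo, tilt_hi = tilt_range
    direction = 1
    for tilt in range(tilt_lo, tilt_hi + 1, step):
        pans = range(pan_lo, pan_hi + 1, step)
        # Reverse the pan direction on each tilt row so the camera
        # never has to snap back across its full travel.
        for pan in (pans if direction > 0 else reversed(pans)):
            yield pan, tilt
        direction = -direction
```

In a search loop, the robot would step through these angles, grabbing a frame at each pose until the green sphere is detected.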

    Developed at Intel beginning in 1999, OpenCV has become very popular computer vision software, with over 2.5 million downloads.  It is a robust, cross-platform library that supports multiple languages.  OpenCV has been used to track objects, recognize faces and gestures, and simulate depth perception using multiple cameras.  In addition to its many applications, the software is open source and free.  For this experiment, the Python bindings of OpenCV will be used.

    In order to identify the base station, images from the camera will be converted from red, green, blue (RGB) to hue, saturation, and value (HSV).  In HSV, identifying particular colors under different lighting environments is more consistent, because color information is separated from brightness.  Once an object with HSV values matching the green sphere has been identified, the robot will conduct pattern recognition, blob detection, or both to determine whether the object is the sphere.  Next, the same process will be utilized to determine the location of the orange sphere.  After both spheres have been identified, the horizontal position of the orange sphere relative to the green one will be determined.  Based on this offset, the robot will maintain a visual on the station and move in a way that vertically aligns the spheres.  Once the spheres have been aligned, the robot will turn toward the docking station and proceed forward.