We track the ball's position with a camera, and the frames are processed in Python to obtain its coordinates. These coordinates are fed to our PID controller running on the Arduino, which outputs the yaw and pitch angles of the platform. The angles are then converted to servo angles via inverse kinematics. The motors drive the Stewart platform, moving the ball to a new location, which the camera captures and the next cycle processes.
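The per-axis PID update at the heart of this loop can be sketched as follows. This is a minimal illustration in Python (the real controller lives in the Arduino firmware); the gain values, output clamp, and class structure are assumptions for illustration, not our tuned implementation:

```python
# Minimal sketch of a discrete PID update for one platform axis.
# Gains and the output clamp are illustrative assumptions, not the
# tuned values from the actual firmware.

class PID:
    def __init__(self, kp, ki, kd, out_limit=15.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_limit = out_limit   # assumed max platform tilt, degrees
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error, dt):
        """error: reference minus measured ball position; dt: loop period (s)."""
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        out = (self.kp * error
               + self.ki * self.integral
               + self.kd * derivative)
        # Clamp to the platform's mechanical tilt range.
        return max(-self.out_limit, min(self.out_limit, out))

# One controller per axis: e.g. pitch reacts to the x error,
# yaw to the y error (gains below are placeholders).
pitch_pid = PID(kp=0.5, ki=0.05, kd=0.2)
pitch_angle = pitch_pid.update(error=10.0, dt=0.02)
```

One such controller runs per platform axis, and its clamped output becomes the target rotation angle passed to the inverse kinematics.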
The detailed flow of operations is described in the software flowchart below. We begin by processing each image frame with the HoughCircles function from the OpenCV API to locate the ball. The returned location is in image-frame coordinates, so an additional step transforms it into the platform frame. Once we have the coordinates in the platform frame, we take their difference from the reference position and run our PID controller to obtain the target platform rotation angles. The last step maps the platform rotation angles to the corresponding motor angles; we implemented a similar-triangle approximation algorithm to compute these angles, which then drive the motors.
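The image-frame-to-platform-frame step above can be sketched as a single homogeneous 2D transform in NumPy. The calibration numbers here (pixel scale, image center, y-axis flip) are hypothetical placeholders standing in for our measured calibration:

```python
import numpy as np

# Hypothetical calibration: cm-per-pixel scale and the platform
# origin in image pixels (placeholders, not measured values).
SCALE = 0.05
CENTER = (320, 240)

# 3x3 homogeneous transform: shift to the platform origin, flip the
# y axis (image y grows downward), and scale pixels to cm.
T = np.array([
    [SCALE,      0.0, -SCALE * CENTER[0]],
    [0.0,     -SCALE,  SCALE * CENTER[1]],
    [0.0,        0.0,                1.0],
])

def image_to_platform(x_px, y_px):
    """Map a detected circle center (pixels) to platform-frame cm."""
    x, y, _ = T @ np.array([x_px, y_px, 1.0])
    return x, y

# A ball detected at the assumed image center maps to the origin.
x, y = image_to_platform(320, 240)
```

In practice the pixel center would come from the `(x, y, r)` triples that `cv2.HoughCircles` returns, and the resulting platform-frame error feeds the PID controller.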
We designed our software in two parts, ball tracking and control, running on two separate systems: a laptop and an Arduino. Ball_Track.py interfaces with the camera and performs the matrix multiplications for the coordinate transforms using NumPy. bbc2.ino is in charge of real-time control: running the PID controller, computing motor angles from the kinematic equations, and actuating the motors with the Servo.h library. The two systems communicate over a USB serial port.
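As a sketch of the laptop-to-Arduino link, the coordinates could be framed as simple ASCII lines. The `"x,y\n"` format below is an illustrative assumption, not necessarily the exact byte format our two programs use:

```python
# Sketch of a simple ASCII framing for the USB serial link.
# The "x,y\n" layout is an assumption for illustration; the actual
# format used by Ball_Track.py and bbc2.ino may differ.

def pack_coords(x: float, y: float) -> bytes:
    """Encode one platform-frame coordinate pair as a newline-terminated line."""
    return f"{x:.2f},{y:.2f}\n".encode("ascii")

def parse_coords(line: bytes):
    """Inverse of pack_coords, mirroring what the Arduino side would do."""
    x_str, y_str = line.decode("ascii").strip().split(",")
    return float(x_str), float(y_str)

msg = pack_coords(3.5, -1.25)
assert parse_coords(msg) == (3.5, -1.25)
```

On the laptop side such bytes would be sent with pySerial's `Serial.write`, and a newline-terminated frame like this can be read on the Arduino with `Serial.readStringUntil('\n')`.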
Electrically, our system consists of a laptop, a camera, an Arduino, three motors, and a power supply. The camera streams image frames to the laptop, the laptop and the Arduino communicate via USB, and the motors are powered by a 6 V supply and controlled from the Arduino's PWM outputs.