(a) Discuss your results. How well did your finished solution meet your design criteria?
We achieved both of our desired goals: Baxter successfully strums the uke after the chord is ready, and the audio analyzer detects the strum to shift to the next chord. Our solution met our design criteria, and by our final submission we were able to mount the uke on Baxter so that he strums it as a human would. We missed our dream goal of detecting the uke and then carrying out a strum. Although we could successfully detect the uke with AR tags and Baxter's cameras, MoveIt was unable to plan an accurate strum motion due to the tight spatial and orientation constraints.
(b) Did you encounter any particular difficulties?
We faced difficulties with Baxter's cameras and had to reset them, as well as the robot itself, multiple times. After hours of debugging, we found that booting up the cameras at the beginning of each work session made them reliable. Another challenge was path planning with MoveIt under tight spatial constraints: the body of the uke sits right next to the goal state of a strum, so planning was unreliable, often failing or timing out. Finally, while connecting Baxter to the chord-playing robot, we could not get ROS to communicate directly with the ESP32 Dev Module. We worked around this by using an Arduino UNO as a relay between the two systems.
(c) Does your solution have any flaws or hacks? What improvements would you make if you had additional time?
Our solution is at times unreliable in detecting a strum because of vibrations transmitted between the uke and Baxter. Given additional time, we would add a dampener (e.g., foam) below the uke or tune the audio analyzer to the frequencies of the individual chords.
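The detection problem can be sketched as an energy threshold: a strum appears as a sudden burst of signal energy, while coupled vibration noise stays lower. The minimal sketch below is not our analyzer's actual implementation; the `threshold` and `window` parameters are hypothetical tuning values of the kind we would adjust.

```python
import math

def detect_strum(samples, threshold=0.2, window=256):
    """Return True if any window's RMS energy exceeds the threshold.

    samples: audio samples in [-1, 1]. A strum produces a burst whose
    RMS rises well above the vibration noise floor; tuning threshold
    (and, per chord, the analyzed frequency band) rejects that noise.
    """
    for start in range(0, len(samples) - window + 1, window):
        chunk = samples[start:start + window]
        rms = math.sqrt(sum(s * s for s in chunk) / window)
        if rms > threshold:
            return True
    return False
```

Tuning per chord would amount to band-pass filtering around each chord's fundamental frequencies before computing the energy, so vibration outside those bands never trips the detector.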