Nelson: I thought that the most fun part of this project was seeing the robot move as you command it to, since the implicit meaning is that you've actually conquered understanding some crucial parts of the project. Besides that, I enjoyed learning about the ROS2 framework (despite its relatively poor documentation, in my opinion), and only wish we could have worked a bit more explicitly on the framework. I liked the labs and thought that in the end they worked OK, although having the labs due a day after they start is a bit unreasonable given that some of them are quite time-consuming. I also think that some of the Thursday discussions could have been reframed as out-of-class readings to give more time to work on the project. In addition, given the poor documentation of several parts of the project, I feel it would be very nice to have a bit of starter code to begin with; that would save many hours of pain. Our team didn't really use the Robot queue on the main page of the website; mostly we just negotiated short periods of time with other teams on the spot. It would have been nice to have two robots to use, however, as that would have both made demand easier to deal with and ensured that one mechanical fault didn't stall everyone's projects at the same time. I don't know if this is possible, but if the CSE department could provide quality video editing software for us to use, that would also have been very nice.
Raghav:
It was fun to work as a cohesive team: from crowding around monitors in class to spending back-to-back late nights together in the lab (well after midnight, at that). Close to demo day, all 5 of us were in the same room, laser-focused on robotics/computer vision and having a good time. I learned a lot from my peers given their respective strengths/areas of focus. I was exposed to different ML frameworks and workflows (through Kevin and Aung). I picked up useful terminal commands from Alex, a Linux maestro.
To be honest, while implementing custom arm actions, IK, and robot movement, I didn't have a great experience with the ROS2 API. The documentation was lacking a lot of example code. Most of my time was honestly spent browsing the online GitHub ROS repos or the source code of different ROS2-exposed modules to figure out how to use ROS2 code to do things. Often, what we wanted to do felt simple and could be accomplished in a few lines of code, but the hard part was finding what to call, subscribe to, etc. At the start, it took a while to even figure out how to read joint/limb positions: the documentation's example HelloNode class exposes joint state info to read (e.g. limb positions), but it doesn't update that info over time as the limbs move. It took a while to find the ros2 subscribe call, buried in some random folder of the ROS/Stretch GitHub repo, that makes the magic happen.
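For future readers, here is a minimal sketch of the kind of subscription that solved this for us: a plain rclpy node that keeps a joint-position cache fresh by listening to the JointState topic instead of relying on a one-time snapshot. The topic name '/stretch/joint_states' is an assumption about the Stretch driver's setup; on some configurations it is just '/joint_states'.

```python
# Minimal sketch, not our exact project code: keep limb positions current by
# subscribing to JointState messages and updating a cache in the callback.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import JointState


class JointStateWatcher(Node):
    def __init__(self):
        super().__init__('joint_state_watcher')
        self.latest = {}  # joint name -> position, refreshed on every message
        # Topic name is an assumption; check `ros2 topic list` on your robot.
        self.create_subscription(JointState, '/stretch/joint_states',
                                 self.joint_state_cb, 10)

    def joint_state_cb(self, msg: JointState):
        # Update the cache so positions stay accurate as the limbs move.
        self.latest = dict(zip(msg.name, msg.position))


def main():
    rclpy.init()
    node = JointStateWatcher()
    rclpy.spin(node)  # callbacks keep node.latest up to date
    node.destroy_node()
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```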
Towards the end of the project, we relied more on writing custom logic built off the base ROS2 API rather than using "extra" ROS2 API features. That decision saved us a lot of time and gave us fine-grained control to do exactly what we wanted.
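As a rough illustration of what "built off the base ROS2 API" meant in practice, the sketch below sends a FollowJointTrajectory goal straight to the driver with an action client. The action name '/stretch_controller/follow_joint_trajectory' and the joint name 'joint_lift' are assumptions based on the stretch_ros2 driver, not something from our write-up; verify them against your own setup.

```python
# Hedged sketch: command one joint by sending a FollowJointTrajectory goal
# directly to the Stretch driver's action server (names below are assumptions).
import rclpy
from rclpy.action import ActionClient
from rclpy.node import Node
from builtin_interfaces.msg import Duration
from control_msgs.action import FollowJointTrajectory
from trajectory_msgs.msg import JointTrajectoryPoint


def move_lift(node: Node, height_m: float):
    client = ActionClient(node, FollowJointTrajectory,
                          '/stretch_controller/follow_joint_trajectory')
    client.wait_for_server()

    point = JointTrajectoryPoint()
    point.positions = [height_m]
    point.time_from_start = Duration(sec=2)  # reach the target in ~2 seconds

    goal = FollowJointTrajectory.Goal()
    goal.trajectory.joint_names = ['joint_lift']
    goal.trajectory.points = [point]

    future = client.send_goal_async(goal)
    rclpy.spin_until_future_complete(node, future)


if __name__ == '__main__':
    rclpy.init()
    node = rclpy.create_node('lift_mover')
    move_lift(node, 0.6)  # raise the lift to roughly 0.6 m
    node.destroy_node()
    rclpy.shutdown()
```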
If I could go back in time and give myself a simple 2-page document about the ROS2 and Stretch APIs (key pieces of ROS2 code plus their use cases), I reckon beginner me would have learned how to implement custom arm actions, IK, and base movement something like 4-5x faster. I think future quarters would massively benefit from more example code, and from encouraging people during class to really share (across teams) what they've been discovering/learning about the ROS2 API over time.
Aung: The most fun thing about this Robotics capstone was the research phase, where we got to see what we could potentially do with our robot. I personally enjoyed the "openness" of it and the ability for us to choose our own "path" in this capstone pertaining to robotics without really being hamstrung. Making the video was also really fun, and I think it might be even better next time to have creativity guidelines for it to bring out more creativity from the teams. The most useful thing about this capstone was all the labs we could keep as a reference in case anything went wrong. The course staff were also incredibly helpful in providing support for us. What was not as useful for me personally in this capstone was the amount of time we spent on labs. It felt a little rushed at the end to have to come up with the entire implementation of the robot in the last 3 weeks, especially with other classes. Having more time to work on the actual implementation would have been nice, but it would've meant more crammed weeks 1-5. I think what would've been useful but was missing is to have one robot per team. I know these robots are pretty expensive and it's a big ask, but there was a lot of idle time sometimes in the last 3 weeks because another team was using it. Also, I think some sort of staff-made lab that has students complete boilerplate code spanning vision, navigation, and manipulation would've been better than running pre-made code.
Alexander: The most fun part of this quarter was seeing bits and pieces of the project come together and slowly get closer to what we were imagining. The most useful part of this quarter was when we were done with labs and could get into the nitty-gritty parts of building up our project. Something that was not so useful was that it took quite a while before we were completely done with labs and could fully focus on the project itself. Something that would have been useful but was missing was more detailed documentation about ways to use/interface with the Stretch robot's existing code, since it was at times difficult to figure out what certain topics/services/etc. were for and which could help us do what we wanted.
Kevin: The most fun part of this quarter was the late-night memories of us working until 6 AM trying to figure out how to integrate the vision, pathfinding, and manipulation aspects that we had worked on separately during the quarter. I also really enjoyed that this class allowed us to implement basically whatever we wanted for our robot rather than being hamstrung to a standard project selection. It was also a fun time recording the videos (especially the final project video). The most useful thing in this class was that we could use the labs as a guided reference/tutorial on aspects of the robot that we needed to implement. However, what was not useful was that these labs took up way too much of the quarter, leaving less time to start actually writing practical project code for our robot. This forced us to rush at the end of the quarter to get a working final product, which was difficult while balancing other classes' final project commitments. Lastly, what would have been useful but was missing was better documentation of the Stretch robot's existing code. We found the documentation very lacking, which made it more difficult to figure out what each topic/service did and whether things were already pre-written and could be integrated into what we needed.