This video demonstrates the robot navigating to the user who needs ointment applied, using the map localization we created in the week 6 labs. Since navigation is not the main focus of our project, we will, per our proposal feedback, focus on the other parts of the project first. Then, if time permits, we will reuse the week 6 lab functionality to have the robot navigate to the user autonomously.
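As a rough illustration of how that autonomous navigation could be commanded, the sketch below builds the JSON message that rosbridge would forward as a navigation goal. The topic name `/move_base_simple/goal`, the `map` frame name, and the use of move_base are assumptions about our stack, not a committed design.

```python
# Hypothetical sketch: a rosbridge "publish" message carrying a PoseStamped
# navigation goal. Topic and frame names are assumptions for illustration.
import json
import math

def make_nav_goal(x, y, yaw):
    """Build a rosbridge 'publish' message sending the robot to (x, y, yaw)."""
    return {
        "op": "publish",
        "topic": "/move_base_simple/goal",  # assumed move_base topic
        "msg": {
            "header": {"frame_id": "map"},  # frame from the week 6 map
            "pose": {
                "position": {"x": x, "y": y, "z": 0.0},
                # Planar yaw encoded as a quaternion about the z axis.
                "orientation": {
                    "x": 0.0,
                    "y": 0.0,
                    "z": math.sin(yaw / 2.0),
                    "w": math.cos(yaw / 2.0),
                },
            },
        },
    }

if __name__ == "__main__":
    goal = make_nav_goal(1.5, 0.75, math.pi / 2)
    print(json.dumps(goal, indent=2))
```

In a real run, this dictionary would be serialized and sent over the rosbridge websocket rather than printed.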
The ointment application node will be implemented by Andrew, as he is familiar with the hardware we are using for it.
The arm movement node will be implemented by Afifah, as she originally worked on saving the arm poses in our week 5 programming-by-demonstration lab.
The start and stop node will be implemented by Heer, as she is interested in general robot navigation.
The GUI will be implemented by Simran, as she is our User Interface lead.
As always, we will collaborate heavily to ensure that our nodes support service request/response communication over rosbridge for a seamless GUI experience.
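The request/response pattern above can be sketched in terms of rosbridge's JSON protocol, where the GUI sends a `call_service` message and matches the `service_response` reply by `id`. The service name `/apply_ointment` and its arguments are placeholders, not our final interface.

```python
# Hypothetical sketch of GUI <-> robot service communication over rosbridge.
# The "call_service" and "service_response" ops follow the rosbridge v2
# protocol; the service name and fields are illustrative assumptions.
import json

def make_service_request(service, args, request_id):
    """Build the rosbridge 'call_service' message the GUI would send."""
    return {
        "op": "call_service",
        "id": request_id,   # lets the GUI match the reply to this call
        "service": service,
        "args": args,
    }

def parse_service_response(raw):
    """Extract the success flag and returned values from a rosbridge reply."""
    reply = json.loads(raw)
    if reply.get("op") != "service_response":
        raise ValueError("not a service response: %r" % reply.get("op"))
    return reply["result"], reply.get("values", {})

if __name__ == "__main__":
    req = make_service_request("/apply_ointment", {"duration_sec": 5}, "req-1")
    print(json.dumps(req))
    # A reply from the robot side would look like this:
    raw = '{"op": "service_response", "id": "req-1", "result": true, "values": {}}'
    ok, values = parse_service_response(raw)
    print(ok)
```

Keeping each node behind a service like this means the GUI only needs one websocket connection to rosbridge rather than a custom channel per node.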
For the start and stop node, navigation to and from the user, particularly when it involves moving past or around obstacles, is a stretch goal. Otherwise, all of the nodes cover essential functionality and are therefore part of our minimum viable product.