ME 326 - Collaborative Robotics
Final Group Project
Group 2: Alex Qiu, Jack Ren, Jiaqi Shao, Naixiang Gao, Rohan Punamiya
Collaborative Robotics | Stanford Graduate Mechanical Engineering
The goal of the project was to make a mobile robot an effective collaborator and assistant for a human counterpart, using the Trossen Robotics Locobot platform (a mobile base, a 6-degree-of-freedom arm, and a head-mounted camera). By using audio to capture natural language commands from a human, and computer vision for robot perception of the environment, we designed an autonomous robot that carries out various requested pick-and-place tasks.
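The first stage of such a pipeline is turning a spoken request into a structured goal. A minimal sketch of that step, assuming a simple keyword-based parser (the real system would sit behind a speech-to-text model, and the object and destination vocabularies here are illustrative assumptions):

```python
# Hypothetical vocabularies for illustration only.
KNOWN_OBJECTS = {"apple", "sock", "shirt"}
KNOWN_PLACES = {"basket"}

def parse_command(text):
    """Extract a (target object, destination) pair from a spoken command.

    Returns None for the destination when the command only asks for
    retrieval (e.g. "locate the apple in the scene and retrieve it").
    """
    words = {w.strip('.,"') for w in text.lower().split()}
    target = next((o for o in KNOWN_OBJECTS if o in words), None)
    dest = next((p for p in KNOWN_PLACES if p in words), None)
    return target, dest
```

For example, `parse_command("find the red apple and place it in the basket")` yields `("apple", "basket")`, which downstream perception and navigation modules can act on.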
Teams use natural language to request that the robot bring a desired object back to a particular location (e.g., “locate the apple in the scene and retrieve it”). The robot must interpret this verbal command, scan the scene, locate the apple in a cluttered environment, navigate to it, pick it up with the manipulator, and return to the starting position.
Teams use natural language to request a series of actions in the environment. For instance, for cleaning, the user may ask that the robot locate an object (e.g., a red apple) and place it in a brown basket located in the scene: “find the red apple and place it in the basket.”
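The two task variants above (retrieve-and-return vs. place-in-basket) share the same ordered step sequence. A sketch of that flow, assuming hypothetical primitives (`scan_scene`, `navigate_to`, `pick`, `place`, `home_pose`) that would wrap the Locobot base, arm, and camera drivers; the `SimRobot` stand-in just records actions instead of moving hardware:

```python
class SimRobot:
    """Stand-in robot that logs actions instead of driving hardware."""
    def __init__(self, detections):
        self.detections = detections  # name -> pose, as if from perception
        self.home_pose = "home"
        self.log = []
    def scan_scene(self):
        self.log.append("scan")
        return self.detections
    def navigate_to(self, pose):
        self.log.append(f"goto:{pose}")
    def pick(self, obj):
        self.log.append(f"pick:{obj}")
    def place(self, obj):
        self.log.append(f"place:{obj}")

def run_task(target, destination, robot):
    """Execute a retrieve (destination=None) or pick-and-place task."""
    detections = robot.scan_scene()         # sweep the head camera
    pose = detections[target]               # locate the target in clutter
    robot.navigate_to(pose)                 # drive the base to the object
    robot.pick(target)                      # grasp with the 6-DOF arm
    if destination is not None:
        robot.navigate_to(detections[destination])
        robot.place(target)                 # e.g. drop into the basket
    else:
        robot.navigate_to(robot.home_pose)  # return to the start position
```

Running `run_task("apple", "basket", SimRobot({"apple": "table", "basket": "corner"}))` records the scan → goto → pick → goto → place sequence; passing `destination=None` instead ends with a return to the home pose.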
To help users separate and organize laundry, our team built a system capable of autonomously sorting clothes by type (e.g., socks and shirts). This task leverages visual recognition algorithms to identify clothing items by their size and shape, then picks and places each item according to its category.
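The size-and-shape cue can be sketched with two simple features on a segmented item: pixel area and bounding-box elongation. This is an illustrative assumption about the classifier, not the team's actual algorithm, and it presumes binary segmentation masks are already available from the camera pipeline; the thresholds are placeholders:

```python
import numpy as np

def classify_clothing(mask, area_thresh=2000, elong_thresh=2.0):
    """Label a segmented laundry item as 'sock' or 'shirt'.

    Uses pixel area plus bounding-box elongation: socks tend to be
    small and long, shirts large and roughly square. Thresholds are
    made-up placeholders that would need tuning on real camera data.
    """
    ys, xs = np.nonzero(mask)
    h = ys.max() - ys.min() + 1
    w = xs.max() - xs.min() + 1
    area = int(mask.sum())
    elongation = max(h, w) / min(h, w)
    if area < area_thresh and elongation > elong_thresh:
        return "sock"
    return "shirt"
```

Each classified item's label would then select the drop-off pile for the pick-and-place routine.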