S.E.A.L. Team 6
Speech Enabled Autonomous Liquid-handler
Project Overview
Meet S.E.A.L., a voice-commanded mobile robot that listens, looks, and acts. Built on the bimanual TidyBot++ platform, S.E.A.L. responds to natural-language commands to find objects in a cluttered environment, navigate to them, and manipulate them: retrieving items, placing them in a container, and even pouring you a drink.
What S.E.A.L. Can Do
Operation Banana Depot (Tasks 1 & 2)
S.E.A.L. listens for a spoken command, scans the room using a YOLO vision + Gemini pipeline, navigates to the target object using a visual-servo state machine, and either carries it back to the start position or places it precisely in a bowl.
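The listen-scan-navigate-act flow above can be sketched as a small state machine. This is a hypothetical illustration, not the project's actual code: the state names, event strings, and transition table are all assumptions standing in for the real perception and control callbacks.

```python
from enum import Enum, auto

class State(Enum):
    """Hypothetical task states mirroring the pipeline described above."""
    LISTEN = auto()      # wait for a spoken command
    SCAN = auto()        # YOLO + Gemini perception pass over the room
    NAVIGATE = auto()    # visual-servo approach to the target
    MANIPULATE = auto()  # grasp, then retrieve or place in the bowl
    DONE = auto()

# (state, event) -> next state; unknown events leave the state unchanged
TRANSITIONS = {
    (State.LISTEN, "command_heard"): State.SCAN,
    (State.SCAN, "target_found"): State.NAVIGATE,
    (State.NAVIGATE, "target_reached"): State.MANIPULATE,
    (State.MANIPULATE, "object_placed"): State.DONE,
}

def step(state: State, event: str) -> State:
    """Advance the state machine on one event."""
    return TRANSITIONS.get((state, event), state)

if __name__ == "__main__":
    s = State.LISTEN
    for ev in ["command_heard", "target_found", "target_reached", "object_placed"]:
        s = step(s, ev)
    print(s)  # State.DONE
```

An explicit transition table like this keeps the task logic inspectable: a failed perception event simply leaves the robot in its current state to retry.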
Operation Bottoms Up (Task 3)
S.E.A.L.'s group-designed challenge: identify a bottle in the scene, grasp it, and pour its contents into a cup, combining perception, bimanual coordination, and liquid-handling control.
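One way to make a pour gentle is to ramp the wrist tilt smoothly up and back down. The sketch below is an assumption about how such a trajectory could be generated (the function name, angles, and raised-cosine profile are illustrative, not S.E.A.L.'s actual controller).

```python
import math

def pour_tilt_profile(duration_s: float, dt: float = 0.1,
                      max_tilt_deg: float = 110.0) -> list[float]:
    """Generate a smooth wrist-tilt trajectory for a pour.

    Eases the bottle from upright (0 deg) up to max_tilt_deg and back,
    using a raised-cosine profile so liquid flow starts and stops gently.
    Hypothetical sketch; real liquid handling would also close the loop
    on cup fill level.
    """
    steps = int(duration_s / dt)
    profile = []
    for i in range(steps + 1):
        t = i / steps  # normalized time in [0, 1]
        # raised cosine: 0 at t=0 and t=1, peak max_tilt_deg at t=0.5
        tilt = max_tilt_deg * 0.5 * (1 - math.cos(2 * math.pi * t))
        profile.append(tilt)
    return profile

if __name__ == "__main__":
    traj = pour_tilt_profile(2.0)
    print(round(traj[0], 3), round(max(traj), 3), round(traj[-1], 3))
```

Because the profile starts and ends at zero tilt, the same arm pose can hand the bottle back to the second arm after the pour, which is where the bimanual coordination comes in.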