System Design
The goal of the ShipBot project is to create a robot capable of performing various tasks aboard older boats and ships. It needs to be portable and must not interfere with the existing crew. Its total size may not exceed 1.5 feet (width) x 1.5 feet (depth) x 2.5 feet (height). Prototyping can be done with wall power; however, the final product should have a dedicated power supply. Power tethers are permitted. The total project has a $1000 budget.
The robot will operate within a 3’ x 5’ testbed that consists of eight 1’-wide work stations. Our team will have 1 minute to place our robot and orient it to the testbed. From that point on, The Drunken Sailor must operate without any human intervention. Though the locations of the work stations are known, the tasks available will change depending on the mission file. There are four types of tasks the robot must perform: turning spigot valves, wheel valves, and shuttlecocks, and flipping breakers. Each task will be set to an arbitrary state, and the robot must determine whether or not to interact with it based on the desired end state provided in the mission file. The mission file will also specify which workstations to visit and a target completion time. All tasks will be at a height of roughly 18’’±2’’. A guide rail will surround the devices, leaving them at a depth of roughly 6’’±2’’ from the operating area. The robot may use the guide rail for mobility guidance or physically attach to it during the setup phase.
The gate and spigot valves will be required to be turned to a target angle with an acceptable error of ±15°. Shuttlecocks will need to be rotated 90° to the open or closed position. Breakers are categorized as A or B types, indicating the on direction. The robot may optimize its path for time if our team specifies its route before the testing phase; otherwise, the expectation is that the robot follows the order given in the mission file. The operating area may have an obtrusive section of pipe that the robot must navigate around. Additionally, the entire testbed will feature a rolling deck to simulate the rocking of a ship at sea.
Safety, as always, is paramount. Neither the manufacturing process nor the autonomous robot should pose a threat to property or life. The robot should be well constructed, being both aesthetically pleasing and structurally sound. “Rat’s nest wiring” and duct tape will both negatively affect reviews at the Design Expo and reduce the overall robustness of the system. The expectation is that the final prototype will resemble a completed portable system, so a laptop should not be the driving computer of the final system. Aside from these explicit requirements, the teaching team reserves the right to expand the list of requirements at any time.
While there are many clear criteria the ShipBot must achieve, there are also requirements inherent to the nature of the project. Maintaining academic integrity is as important as always. With multiple teams all working on the same project, similar mechanisms will likely appear across multiple designs; however, it is critical that we cite our sources and credit where our inspiration comes from. We must also properly document our process and implementation.
Apart from moral expectations, there are additional logistics we must consider. The robot must be structurally stable as it performs its actions so as not to collapse under strain. The robot must be able to successfully manipulate objects around it in order to accomplish its goals. Humans should be able to intervene in case of error or emergency. We also have to complete the ShipBot within the span of the semester. With the semester being shorter than usual and with the first two weeks being virtual, we will have to be incredibly proactive in our design and implementation. Ordering and prototyping will have to be completed early so we have enough time to iterate on our design.
Speed - We tried to perform our tasks as quickly as possible so that we could complete the trials faster than the given target times.
Battery Power - We successfully integrated Zeee 8.4V 3000mAh NiMH batteries into our electronics system so our robot would not need a power tether.
Aesthetics - We tried to give our robot a unique character. Given that our team’s theme is pirates, we named our robot The Drunken Sailor and engraved pirate imagery into the front and back of the robot. We also adorned it with pirate duckies. An additional speaker allows The Drunken Sailor to hum sea shanties as it goes about its work, including its namesake.
Mobile platform without electronics and gantry
The mobile platform is the main housing for all our systems. To prevent tipping and to keep the center of gravity close to the ground, we implemented a design that keeps the majority of our weight near the bottom of the robot. While there are many ways to locomote, we decided mecanum wheels will serve best for our needs. With the arrangement of the testbed, it would be very convenient to have the ability to translate without having to rotate the entire robot. Omni wheels may be a bit cheaper, but the extra weight of the mecanum wheels and the angle of the rollers will mitigate unwanted sliding, especially in a dynamic environment such as a rocking ship.
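The translate-without-rotating property of the mecanum drive can be illustrated with the standard inverse-kinematics equations for a four-wheel mecanum base. This is a sketch, not our firmware: the wheel radius, geometry constants, and wheel ordering below are illustrative placeholders.

```python
def mecanum_wheel_speeds(vx, vy, omega, lx=0.15, ly=0.15, r=0.04):
    """Standard inverse kinematics for a 4-mecanum-wheel base.

    vx: forward velocity (m/s), vy: leftward strafe velocity (m/s),
    omega: yaw rate (rad/s). lx/ly are half the wheelbase/track and
    r the wheel radius, all in meters; these values are illustrative.
    Returns wheel angular velocities (rad/s) in the order
    (front-left, front-right, rear-left, rear-right).
    """
    k = lx + ly
    fl = (vx - vy - k * omega) / r
    fr = (vx + vy + k * omega) / r
    rl = (vx + vy - k * omega) / r
    rr = (vx - vy + k * omega) / r
    return fl, fr, rl, rr

# A pure sideways strafe (vy only) yields the alternating wheel-speed
# pattern characteristic of mecanum drives, with zero net rotation.
print(mecanum_wheel_speeds(0.0, 0.2, 0.0))
```

Setting vy alone spins diagonal wheel pairs in opposite directions, so the roller forces cancel forward motion and the robot slides laterally, which is exactly the behavior that lets it shift between 1’-wide workstations without turning.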
Gantry system mounted onto mobile platform
The gantry system provides a way for the robot to position the end effector in the appropriate location quickly and accurately. The system has two dimensions: X and Y, where X is across the width of the robot and Y is along the height. Each axis contains a stepper motor mated to a 300 mm lead screw that translates the end effector mounting plate via linear guide rails and bearings. Each gantry direction contains two guide rails and four bearings, two per rail, to ensure the stability of the end effector during all movements. The robot starts with the gantry at a home position of (0,0), which is set by the SKR board. When the stepper motor drivers receive a command, the motors spin the lead screws and send the end effector mounting plate to the desired coordinates.
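Because the SKR board accepts G-code, commanding the gantry reduces to formatting a move string for the desired coordinates. The helper below is a minimal sketch: the feed rate and the clamping to 300 mm of travel are assumptions for illustration, not values from our actual configuration.

```python
def goto_gantry(x_mm, y_mm, feed_mm_per_min=1200):
    """Build an absolute G-code move for the XY gantry.

    Coordinates are clamped to the 300 mm lead-screw travel; the
    feed rate here is an illustrative placeholder. The SKR board's
    stepper drivers turn this command into step pulses.
    """
    x = max(0.0, min(300.0, x_mm))
    y = max(0.0, min(300.0, y_mm))
    return f"G1 X{x:.2f} Y{y:.2f} F{feed_mm_per_min}"

# Typical sequence: home both axes (G28 sets the (0,0) position),
# then move the end effector plate to the middle of the work area.
commands = ["G28", goto_gantry(150, 150)]
print(commands)
```

Clamping at the host side guards against sending a target beyond the lead screws' physical travel, so a bad coordinate from the planner degrades gracefully instead of stalling a stepper at the end stop.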
End effector assembly.
The end effector for our robot is broken down into two parts: the part-interfacing aspect and the wrist-joint aspect. The part-interfacing aspect is how our end effector engages with the various ship components on the testbed. There are a few different types of valves and circuit breaker switches, and our end effector was designed to be compatible with all of these components. The other part of the end effector is the wrist joint. Some valves on the testbed are oriented upwards and some are oriented toward the robot. The end effector can therefore rotate 90 degrees to interact with both valve orientations; this is why this part of the end effector is referred to as the “wrist joint.”
In the horizontal orientation, the gantry X and Y directions can maneuver the end effector to rotate valves and flip circuit breakers. However, in the vertical orientation, the end effector loses a degree of freedom that contributes to the rotating motion. To solve this issue, the end effector also has a built-in stepper motor that moves the support arm in and out in the Z direction. This movement, in addition to the X movement of the gantry system, is how the end effector can rotate valves mounted in the vertical orientation on the ShipBot testbed. Due to this simple but effective design for the end effector, our robot can interface with all the testbed components. To change the state of these switches and valves the robot simply maneuvers the two-dimensional gantry system and the Z stepper motor once the end effector's finger is properly engaged. A picture of the end effector CAD model is shown below in Figure 7.
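Rotating a vertically mounted valve means the finger tip must trace an arc in the X-Z plane using the gantry X axis and the end effector's Z stepper together. The sketch below shows one way to generate that coordinated path; the valve center, radius, and step count are illustrative placeholders, not measurements from our testbed.

```python
import math

def valve_rotation_waypoints(cx, cz, radius_mm, start_deg, end_deg, steps=8):
    """Waypoints for the finger tip while turning a vertically
    mounted valve: the gantry X axis and the end effector's Z
    stepper together trace an arc about the valve center (cx, cz).
    All dimensions in mm; values used here are illustrative.
    """
    pts = []
    for i in range(steps + 1):
        a = math.radians(start_deg + (end_deg - start_deg) * i / steps)
        pts.append((cx + radius_mm * math.cos(a), cz + radius_mm * math.sin(a)))
    return pts

# A 90-degree turn, e.g. moving a valve from closed to open:
path = valve_rotation_waypoints(cx=150, cz=60, radius_mm=30, start_deg=0, end_deg=90)
print(path[0], path[-1])
```

Stepping through the waypoints in order approximates the circular motion with short straight segments, which is sufficient given the ±15° angular tolerance on valve targets.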
The electronics and firmware consist of all electrical components and code running within our system. This subsystem requires a thorough knowledge of programming with microcontrollers, signal processing of sensors, and motor control. These are all skills that our team members developed in previous coursework and refined in the labs during the first few weeks of the course. The processors run C++ and Python code that performs many tasks related to sensing and actuation.
Our electronics and firmware can be broken into three main subsystems. The sensor processing subsystem, which runs on the central hub, interfaces with the sensors (namely the RealSense camera). Based on the sensor data, it sends commands to the two actuation subsystems. The first actuation subsystem handles locomotion: it receives commands from the central hub and drives motors to move the robot. The second handles manipulation: it receives commands from the central hub and directs the robot’s arm to a specific location in space.
Our robot has four different processors, each used to handle a specific purpose:
A Jetson Nano functions as the central hub. It reads and processes sensor data and sends commands to the other three processors. The Jetson Nano uses PyTorch, an ML framework, to handle image processing from the Intel RealSense camera.
An Arduino Mega is used for motor control to move the robot around the testbed. It receives commands from the Jetson Nano and drives the motors, guiding the robot to a specific location. The Mega runs at 16 MHz and has a large number of PWM and interrupt pins, which are necessary to interface with our four DC motors, each of which has encoder feedback.
An Arduino Uno drives the linear actuators, extending and retracting the end effector as desired. The Uno is a simple processor that runs at 16 MHz, but little processing power is needed here: we simply tell the linear actuators to extend or retract fully. The Uno also carries a compact motor shield that we use to drive the actuators. It receives its commands from the Jetson.
An SKR 2.0 board allows us to use G-code, a computer numerical control programming language. This board has stepper motor drivers on it; its design allows us to easily use G-code to control the stepper motors as desired. Commands come from the Jetson Nano to drive the motors as needed.
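With the Jetson Nano fanning commands out to three boards, the hub-side routing can be kept small. The sketch below shows one way to structure that dispatch; the board names, command strings, and injected write functions are placeholders (in practice each write function would wrap a pyserial `Serial.write` on that board's port).

```python
def make_dispatcher(write_fns):
    """Route high-level commands from the central hub to one of the
    attached boards. write_fns maps a board name to a callable that
    transmits one line; names and command formats here are
    illustrative, not our actual protocol.
    """
    def dispatch(board, command):
        if board not in write_fns:
            raise ValueError(f"unknown board: {board}")
        write_fns[board](command + "\n")
    return dispatch

# For this example we log commands instead of opening serial ports.
log = []
dispatch = make_dispatcher({
    "mega": lambda s: log.append(("mega", s)),  # drive motor commands
    "uno": lambda s: log.append(("uno", s)),    # linear actuator commands
    "skr": lambda s: log.append(("skr", s)),    # gantry G-code
})
dispatch("skr", "G1 X100.00 Y50.00 F1200")
dispatch("uno", "EXTEND")
print(log)
```

Injecting the write functions keeps the routing logic testable on a laptop without any of the three boards attached, which is useful when the robot hardware is shared among teammates.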
Electronics wiring diagram.
Neural network detecting valve.
This subsystem includes everything from the high-level interpretation of the mission files down to the drive kinematics sent to the motors. It involves working with data from the stereo camera, and possibly other sensors, to localize the robot and identify the valves/breakers in front of it, and coordinating all the different subsystems: the central hub, manipulator, and drive subsystems. Especially at the low level, this overlaps with the electronics/firmware subsystem, since the kinematics/feedback control loop must be implemented in firmware on the microcontrollers themselves.
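The first step of the pipeline, turning a mission file into an ordered list of tasks, can be sketched as below. The line-based format here (station letter, device id, target state, with a final time-budget line) is purely an assumption for illustration; the real mission file format is defined by the course spec, not by this sketch.

```python
def parse_mission(text):
    """Parse a mission file into (station, device, target_state)
    steps plus a time budget in seconds.

    Assumed illustrative format, one task per line, e.g.:
        A V1 90
        B B3 U
        60
    The final line is the target completion time in seconds.
    """
    lines = [ln.strip() for ln in text.strip().splitlines() if ln.strip()]
    steps = [tuple(ln.split()) for ln in lines[:-1]]
    return steps, int(lines[-1])

steps, budget = parse_mission("A V1 90\nB B3 U\n60")
print(steps, budget)
```

Parsing the whole file up front lets the planner check each device's current state against its target state and skip tasks that are already satisfied, as the requirements allow.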