Brand new for the 2025-2026 school year is the ASV Superior, named after Lake Superior, the Great Lake that also gives Lake Superior State University its name. With an all-new design that includes a custom hull, a new PCB, and advanced vision training, the ASV Superior is ready to tackle whatever the waves throw at it.
The Mechanical Team has been hard at work creating a brand-new vessel, this time from the ground up.
The ASV Superior was designed, fabricated, and kitted out by Team AMORE with the help of our community and colleagues we met at our competitions.
Specifications:
Overall length: 1.75 meters
4 thrusters total
2 Hawk Hobby main propulsion thrusters
2 Blue Robotics T200 bow/stern control thrusters
CNC-machined polystyrene internal structure
Fiberglass shell
Comparison:
Sarda-ine II:
Off-the-shelf kayak hull
External mounting only
Shallow freeboard (the distance between the waterline and the deck)
Underactuated; no independent forward and turning thrust
Superior:
Custom design and fabrication
Large in-hull mounting volume
Sufficient freeboard to protect deck-mounted equipment
Overactuated; fully independent forward and turning thrust
The electrical system used on our new ASV Superior is split into three separate systems: power distribution, GNC (Guidance, Navigation, and Control), and the vision subsystem.
The power distribution system consists of 24V batteries that power the thrusters, 12V batteries that power both the GNC/vision processors and a custom-made BMS (Battery Monitoring System), and a 24V-to-5V converter that powers some lower-power GNC devices. This system also includes the emergency stop button on the vessel's hull and a relay system that only allows power to flow from the batteries while the GNC system detects a connection with the vessel's remote control on land.
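The relay interlock behaves like a simple heartbeat watchdog. The sketch below is purely illustrative and assumes a hypothetical relay driver and timeout value; it is not the firmware actually running on the vessel.

```python
import time

# Assumed timeout; the actual value used on the vessel is not given in this report.
HEARTBEAT_TIMEOUT_S = 1.0

class PowerRelay:
    """Hypothetical stand-in for the output that drives the thruster-battery relay."""
    def __init__(self) -> None:
        self.enabled = False

    def set(self, enable: bool) -> None:
        self.enabled = enable  # real hardware would toggle a GPIO pin here

def update_relay(relay: PowerRelay, last_rc_heartbeat_s: float) -> None:
    """Close the relay only while the shore remote-control link is considered alive."""
    link_alive = (time.monotonic() - last_rc_heartbeat_s) < HEARTBEAT_TIMEOUT_S
    relay.set(link_alive)
```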
The GNC system contains three different processors, each with its own purpose and importance, as well as an array of sensors that give the vessel a clear view of its surroundings and allow it to complete tasks autonomously. The most essential of the three, and the least powerful, is the Teensy microcontroller, which is wired to the thrusters' speed controllers and allows the vessel to move. The Teensy takes its input either from the remote control on land in manual mode or, in autonomous mode, from the second computer in the GNC system, an NVIDIA Jetson AGX Orin. The AGX uses our software to plan the vessel's path after receiving data from the vision subsystem. The third processor, a Pixhawk 6C flight controller, provides GPS data to the AGX using a GPS sensor mounted on the hull.
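A minimal sketch of the manual/autonomous arbitration described above, assuming a normalized four-channel thrust command and a simple mode flag; the channel names and data types are illustrative, not the team's actual interfaces.

```python
from dataclasses import dataclass
from enum import Enum

class ControlMode(Enum):
    MANUAL = 0       # thrust set directly by the shore remote control
    AUTONOMOUS = 1   # thrust set by the path-following output of the Jetson AGX

@dataclass
class ThrustCommand:
    # Normalized [-1, 1] commands; this channel mapping is assumed for illustration.
    port_main: float = 0.0
    starboard_main: float = 0.0
    bow: float = 0.0
    stern: float = 0.0

def select_thrust(mode: ControlMode,
                  rc_cmd: ThrustCommand,
                  autonomy_cmd: ThrustCommand) -> ThrustCommand:
    """Pick which command the microcontroller forwards to the speed controllers."""
    return rc_cmd if mode is ControlMode.MANUAL else autonomy_cmd
```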
Our vision subsystem consists of two sensors: a USB camera for basic vision and a LiDAR sensor for distance data. These sensors feed the vision system's own dedicated processor, a second Jetson AGX Orin, which detects objects in the raw sensor data and communicates their locations to the GNC system.
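As a rough illustration of how a camera detection and a LiDAR scan could be combined into an object position, the sketch below pairs a detection's pixel column with the nearest-bearing LiDAR return; the pinhole-style bearing model and data layout are assumptions rather than the team's actual calibration.

```python
import math

def bearing_from_pixel(px: float, image_width: int, horizontal_fov_deg: float) -> float:
    """Approximate bearing (radians, boat frame) of a detection from its pixel column.
    Assumes a simple pinhole-style mapping; the real camera calibration is not described here."""
    offset = (px - image_width / 2) / (image_width / 2)   # -1 (left) .. +1 (right)
    return math.radians(offset * horizontal_fov_deg / 2)

def locate_object(px: float, image_width: int, fov_deg: float,
                  lidar_ranges: list, lidar_angles: list) -> tuple:
    """Return an (x, y) estimate by taking the LiDAR return closest to the camera bearing.
    LiDAR angles are assumed to be in radians in the boat frame."""
    bearing = bearing_from_pixel(px, image_width, fov_deg)
    i = min(range(len(lidar_angles)), key=lambda k: abs(lidar_angles[k] - bearing))
    r = lidar_ranges[i]
    return (r * math.cos(bearing), r * math.sin(bearing))
```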
ASV Software Architecture Overview
For the 2026 season, AMORE's ASV software architecture was rebuilt to reduce future low-level development and allow the team to dedicate its effort to high-level autonomy, perception, and path planning. This transition grew out of AMORE's collaboration with Aquairyx ROSBOAT.
ROSBOAT now serves as the foundation for all low-level control and hardware management within a ROS 2-based system, allowing the team to focus on higher-level autonomy such as mission logic, perception, and navigation rather than hardware abstraction.
ROSBOAT Low-Level Control Package
ROSBOAT manages all core low-level software responsibilities within the ROS 2 ecosystem. It handles system initialization, communication with embedded controllers, execution of control modes, and generation of thrust commands using built-in auto-tuned PID controllers. It also provides a real-time graphical interface for monitoring system state and incoming data. By offloading these responsibilities, the autonomy stack remains cleanly separated from hardware concerns, improving reliability and maintainability.
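For reference, the thrust commands ROSBOAT generates come from PID controllers; a textbook PID update of that general form is sketched below, with placeholder gains and an assumed anti-windup clamp rather than ROSBOAT's auto-tuned values.

```python
from dataclasses import dataclass, field

@dataclass
class PID:
    kp: float
    ki: float
    kd: float
    integral_limit: float = 1.0     # placeholder anti-windup clamp, not a tuned value
    _integral: float = field(default=0.0, repr=False)
    _prev_error: float = field(default=0.0, repr=False)

    def update(self, setpoint: float, measurement: float, dt: float) -> float:
        """Return a control output (e.g. a thrust command) from the tracking error."""
        error = setpoint - measurement
        self._integral = max(-self.integral_limit,
                             min(self.integral_limit, self._integral + error * dt))
        derivative = (error - self._prev_error) / dt if dt > 0 else 0.0
        self._prev_error = error
        return self.kp * error + self.ki * self._integral + self.kd * derivative
```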
Perception Subsystem
Sensor data is fused to produce a two-dimensional, grid-based map of the surrounding environment. Objects detected by LiDAR and AI vision are encoded as integer identifiers based on class (e.g., buoy color or marker type), allowing downstream systems to reason efficiently about both object type and location as seen in image 1.
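A minimal sketch of such a class-coded grid, assuming an illustrative resolution and class numbering; the actual encoding used by the perception stack may differ.

```python
import numpy as np

# Illustrative class codes; the real identifiers used by the perception stack are not listed here.
FREE, RED_BUOY, GREEN_BUOY, DOCK = 0, 1, 2, 3

class ClassGrid:
    """Two-dimensional grid whose cells store an integer object class."""
    def __init__(self, size_m: float = 60.0, resolution_m: float = 0.5):
        self.resolution = resolution_m
        n = int(size_m / resolution_m)
        self.cells = np.full((n, n), FREE, dtype=np.int8)
        self.origin = n // 2  # boat sits at the center of the grid

    def mark(self, x_m: float, y_m: float, class_id: int) -> None:
        """Write a detected object's class into the cell containing (x, y) in the boat frame."""
        i = self.origin + int(round(y_m / self.resolution))
        j = self.origin + int(round(x_m / self.resolution))
        if 0 <= i < self.cells.shape[0] and 0 <= j < self.cells.shape[1]:
            self.cells[i, j] = class_id
```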
AI Vision and Object Classification
Earlier implementations using YOLO-based models were limited by hardware constraints and inconsistent perception performance. Recent research conducted by AMORE identified RF-DETR as the most suitable object classification model for the ASV, outperforming newer YOLO variants in accuracy and robustness.
Fused AI vision and filtered LiDAR data allow the ASV to reliably classify and localize objects such as buoys, docks, and navigational markers as seen in image 3. This information is passed directly to the path planning and mission control subsystems, enabling informed decision-making during autonomous operation.
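Putting the earlier sketches together, classified detections might be localized with LiDAR and written into the perception grid before planning runs. The Detection structure below is a hypothetical placeholder, not the RF-DETR output format, and the code reuses locate_object and ClassGrid from the sketches above.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    class_id: int      # e.g. RED_BUOY from the class-grid sketch above
    pixel_x: float     # center column of the detection's bounding box in the camera image

def ingest_detections(grid, detections, image_width, fov_deg, lidar_ranges, lidar_angles):
    """Localize each classified detection with LiDAR and write it into the class grid."""
    for det in detections:
        x, y = locate_object(det.pixel_x, image_width, fov_deg, lidar_ranges, lidar_angles)
        grid.mark(x, y, det.class_id)
```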
Mission Controller and Path Planning
The mission controller acts as the top-level decision-making module for autonomous operation. Before autonomy is engaged, the operator selects the desired mission. Once the system confirms a safe operational state, the mission controller coordinates perception, planning, and control to achieve the selected objective.
Path planning is performed on the grid-based map generated by the perception subsystem. To support navigation in dynamic maritime environments, AMORE selected a graph-based D* path planning algorithm as seen in image 2. Unlike Dijkstra’s or A*, D* efficiently replans routes as the environment changes, making it well-suited for scenarios where buoys, vessels, and obstacles may shift due to wind, waves, or current.
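As a simplified stand-in for D*, the sketch below runs a plain A* search over a grid in which 0 marks a free cell and any nonzero value an obstacle; a true D* or D* Lite planner would additionally reuse its previous search so that only the affected portion of the route is recomputed when the map changes.

```python
import heapq

def astar(grid, start, goal):
    """Plain A* over a 2D integer grid (0 = free, nonzero = obstacle).
    start and goal are (row, col) tuples; returns a list of cells or None if no path exists."""
    rows, cols = len(grid), len(grid[0])
    heuristic = lambda c: abs(c[0] - goal[0]) + abs(c[1] - goal[1])  # Manhattan distance
    open_set = [(heuristic(start), 0, start)]
    parent = {start: None}
    g_best = {start: 0}
    while open_set:
        f, g, cell = heapq.heappop(open_set)
        if cell == goal:
            path = []
            while cell is not None:
                path.append(cell)
                cell = parent[cell]
            return path[::-1]
        if g > g_best.get(cell, float("inf")):
            continue  # stale heap entry
        r, c = cell
        for nbr in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nbr
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < g_best.get(nbr, float("inf")):
                    g_best[nbr] = ng
                    parent[nbr] = cell
                    heapq.heappush(open_set, (ng + heuristic(nbr), ng, nbr))
    return None
```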
The mission controller determines which objects must be navigated toward, avoided, or passed through based on task-specific rules. It then provides goal points to the path planner, which computes a safe and efficient route and passes it to the ROSBOAT-controlled low-level system for execution.
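A highly simplified sketch of that hand-off, with hypothetical task names and rules; the real mission set and interfaces are not detailed in this report.

```python
# Illustrative class codes matching the earlier grid sketch; real IDs are not specified here.
RED_BUOY, GREEN_BUOY, YELLOW_BUOY = 1, 2, 4

def gate_midpoint(red_cell, green_cell):
    """A gate-style goal: pass through the midpoint between a red and a green buoy."""
    return ((red_cell[0] + green_cell[0]) // 2, (red_cell[1] + green_cell[1]) // 2)

def next_goal(mission: str, detections: dict):
    """Map a selected mission and the latest classified detections (class id -> grid cell)
    to a goal cell for the path planner. Task names and rules here are hypothetical."""
    if mission == "navigation_channel" and RED_BUOY in detections and GREEN_BUOY in detections:
        return gate_midpoint(detections[RED_BUOY], detections[GREEN_BUOY])
    if mission == "avoid_hazard" and YELLOW_BUOY in detections:
        return None  # hypothetical: hold position and let the planner treat the buoy as an obstacle
    return None
```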