Bird, Eric & Boyd, Josh & Payne, Tye & Hernandez, Luis & Golightly, Kylie. (2025). Development of an Extended Reality Simulator for Small UAS Full Mission Training. 10.2514/6.2025-3410.
Small Unmanned Aircraft Systems (sUAS) have become ubiquitous across numerous industries, creating a fast-growing demand for pilots qualified to operate these aircraft. Flight simulation is sometimes used to help train sUAS pilots, but current sUAS simulators are woefully inadequate. This paper addresses the development of a robust sUAS flight simulator that employs Extended Reality to provide an immersive environment for full mission training regimens. The initial implementation, already in production, allows pilots to train on both multicopter and fixed-wing sUAS. The simulator also enables pilots to perform all functions of a flight mission, from flying the aircraft to communicating with the mission team. Multiple crew stations allow multiple aircraft to be flown simultaneously by different pilots. Future work will focus on improving the fidelity and capabilities of the simulator.
Phadke, Abhishek & Boyd, Josh & Medrano, F. & Starek, Michael. (2023). Navigating the skies: examining the FAA's remote identification rule for unmanned aircraft systems. Drone Systems and Applications. 11. 1-4. 10.1139/dsa-2023-0029.
As technology and innovations in unmanned aerial vehicles progress, so does the need for regulations that create safe and controlled flying scenarios. The Federal Aviation Administration (FAA) is a governing body under the United States Department of Transportation that is responsible for a wide range of regulatory activities related to the United States airspace. In a recently published final rule, the FAA addresses several concerns, such as the need for a system to identify all aircraft flying in national airspace, as well as the implementation of a system separate from the prevalent Automatic Dependent Surveillance–Broadcast system to prevent interference with manned aircraft. Their solution to these concerns is the deployment of remote identification (RID) on all unmanned aircraft systems (UAS) flying under its implied jurisdiction. While US governing agencies retain the use of the term UAS for now, the International Civil Aviation Organization terminology is remotely piloted aircraft systems. The FAA describes the RID implementation as a “digital license plate” for all UAS flying in the United States airspace. They outline additional policies, including several options for compliance, operating rules, and design and production guidelines for manufacturers. As the September 2023 deadline for compliance draws near, this article highlights possible deployment applications and challenges.
Josh Boyd
December 2021
at Texas A&M University-Corpus Christi
Automated Radar Heading Calibration with Collaborating Participants and Multi-Sensor Fusion
As unmanned aerial systems (UAS) become more prolific, so will the use of radar systems for tracking UAS in the national airspace system (NAS). The future of Urban Air Mobility (UAM) involves large amounts of UAS operating autonomously and simultaneously in urban environments for the purpose of passenger or cargo transportation. Radar detection of UAS in an urban environment can be hindered by line of sight (LOS) blockage by large buildings, thus necessitating many surveillance devices to gain full coverage. Currently, UAM is still in development by the Federal Aviation Administration (FAA) and other airspace partners, and many cities do not have the need or resources for full radar coverage. Due to the high cost of individual radar systems and the quantity needed to cover urban areas, it is currently not practical to have full radar coverage of an area at all times. Permanent stationary radar systems are generally calibrated once with occasional adjustments and low time constraints. Temporary radar systems must be calibrated and aligned before each mission deployment, often under short time constraints. Temporarily stationed mobile radar platforms will be utilized for specific targeted mission objectives until a more permanent solution is developed and implemented. In the case of disaster response or search and rescue, a temporary radar system needs to be quickly deployed. The key abilities required by a temporary radar system are accurate track position reporting and quick setup and breakdown. One of the bottlenecks to quick setup is heading calibration. Radar antenna alignment is crucial to the performance of the system and its ability to accurately determine the position of a tracked object.
In this paper, we implement and compare multiple methods of radar heading calibration for accuracy and speed: manually with a handheld compass, manually with a web-based heading helper tool, manually with a custom dual Real-Time Kinematic (RTK) GPS alignment tool, and automated with a collaborating Radar Cross Section (RCS) device. For RCS devices we use a marine radar reflector with attached RTK GPS when unable to fly, and an unmanned aerial vehicle (UAV), also with RTK GPS, when able to fly. By leveraging our experience working with UAVs and radars, we show a method to auto-calibrate the positioning sensors using multi-sensor fusion and collaborating participants, thus reducing setup time and increasing the accuracy of the system.
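The dual-RTK alignment tool described above can be sketched numerically: with two survey-grade antennas mounted along the radar boresight, the array heading follows from the forward azimuth between the two reported WGS-84 positions. A minimal sketch (function name and parameters are illustrative, not from the paper):

```python
import math

def initial_bearing(lat1, lon1, lat2, lon2):
    """Forward azimuth in degrees (0-360, clockwise from true north)
    from point 1 to point 2, both in decimal degrees (WGS-84)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return (math.degrees(math.atan2(x, y)) + 360.0) % 360.0
```

With RTK fixes good to a few centimeters, an antenna baseline of a meter or more yields a heading estimate far tighter than a handheld compass reading.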
Symposium for Student Innovation, Research, and Creative Activities
Josh Boyd – Presenter
April 25, 2025
at Texas A&M University-Corpus Christi
Decoupled Architecture for Testing UAV Swarm Autonomy: A Simulation-Based Approach
Josh Boyd1,2,4, Dr. Jose Baca2, Dr. Michael Starek3, & Dr. Tianxing Chu3
1 Department of Computing Sciences, College of Engineering, Texas A&M University – Corpus Christi; 2 Department of Engineering, College of Engineering, Texas A&M University – Corpus Christi; 3 Department of Computing Sciences, College of Engineering, and Conrad Blucher Institute for Surveying & Science, 4 Research and Innovation, Autonomy Research Institute, Texas A&M University – Corpus Christi
Autonomous swarm robotics research requires robust and flexible testing frameworks that integrate both simulated and real-world Unmanned Aerial Vehicles (UAVs). This work presents a Decoupled Architecture for Testing Swarm Autonomy, a modular framework designed to support interoperability, scalability, and flexibility in autonomy testing. The architecture consists of interchangeable software components, including a controller thread that operates drones (e.g., Crazyflie 2.1) via a custom Application Programming Interface (API), which can be swapped with a MAVLink-based controller without altering higher-level logic.
Communication modules facilitate interactions through sockets or Artemis message queues, while command input modules support manual control via keyboard, autonomous path planning scripts, and future integration with motion capture-based gesture inputs. The system supports multiple data protocols, including JSON, Protobuf, and MAVLink, enabling seamless interaction between live and simulated agents. A key advantage of this architecture is its ability to provide a common operating environment for distributed UAV control, allowing for Live, Virtual, and Constructive (LVC) testing by mixing live drones with virtual autonomous agents and human-piloted simulations.
The proposed approach enhances scalability and adaptability for swarm autonomy testing, enabling researchers to evaluate heterogeneous UAV systems under diverse operational conditions. Future work will focus on integrating real-time gesture-based control using camera vision and motion capture, and on refining autonomous behaviors for swarms in complex, dynamic environments. This framework serves as a foundation for distributed UAV autonomy research, accelerating the development and validation of multi-agent coordination strategies in both physical and virtual settings.
Keywords: UAV, swarm autonomy, simulation, decoupled architecture, LVC, MAVLink
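The "swap the controller without altering higher-level logic" claim above boils down to programming swarm missions against an abstract controller interface. A minimal sketch, assuming hypothetical class and method names (the actual Crazyflie/MAVLink wrappers are not shown in the abstract):

```python
from abc import ABC, abstractmethod

class DroneController(ABC):
    """Abstract controller: mission logic depends only on this interface,
    so a Crazyflie backend and a MAVLink backend are interchangeable."""
    @abstractmethod
    def takeoff(self, altitude_m: float) -> None: ...
    @abstractmethod
    def goto(self, x: float, y: float, z: float) -> None: ...
    @abstractmethod
    def land(self) -> None: ...

class LoggingController(DroneController):
    """Stand-in backend that records commands (a real backend would
    translate each call into Crazyflie API or MAVLink messages)."""
    def __init__(self):
        self.log = []
    def takeoff(self, altitude_m):
        self.log.append(("takeoff", altitude_m))
    def goto(self, x, y, z):
        self.log.append(("goto", x, y, z))
    def land(self):
        self.log.append(("land",))

def fly_square(ctrl: DroneController, side: float, alt: float) -> None:
    """Mission logic written once; runs unchanged on any backend."""
    ctrl.takeoff(alt)
    for x, y in [(side, 0.0), (side, side), (0.0, side), (0.0, 0.0)]:
        ctrl.goto(x, y, alt)
    ctrl.land()
```

Because `fly_square` only sees `DroneController`, the same script can drive a live drone, a simulated agent, or a logger, which is what makes the Live, Virtual, and Constructive mixing possible.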
Symposium for Student Innovation, Research, and Creative Activities
Josh Boyd – Presenter
April 26, 2024
at Texas A&M University-Corpus Christi
Coordination and Autonomy in UAV Swarms for Array Imaging
Josh Boyd1,2,4, Dr. Jose Baca2, Dr. Michael Starek3, & Dr. Tianxing Chu3
1 Department of Computing Sciences, College of Engineering, Texas A&M University – Corpus Christi; 2 Department of Engineering, College of Engineering, Texas A&M University – Corpus Christi; 3 Department of Computing Sciences, College of Engineering, and Conrad Blucher Institute for Surveying & Science, 4 Research and Innovation, Lone Star UAS Center, Texas A&M University – Corpus Christi
Satellites offer invaluable imagery with a spatial resolution of 30 cm/pixel, aiding in the observation of vast geospatial features. However, for more intricate analyses such as measuring pollutants or the shape of smaller waves, higher resolution images are indispensable. Utilizing Unmanned Aerial Vehicles (UAVs) equipped with cameras can achieve this, capturing images with resolutions up to 1 cm/pixel. Merging such images presents challenges, particularly in dynamic coastal environments with continuous wave movement and gusts of wind. Addressing this, the research aims to devise a strategy for controlling a UAV swarm to navigate and capture synchronized imagery. Three primary objectives are proposed: 1) employing flocking techniques to enable the UAV swarm to maintain formation during flight, 2) developing and implementing a synchronized image capture technique using multiple UAVs to enhance image detail over a large area, and 3) incorporating image geolocation utilizing GPS, IMU, and Remote ID (RID) from each UAV. RID, mandated by the Federal Aviation Administration (FAA) effective March 16, 2024, ensures safe flight operations by broadcasting location and identification information. Integrating these considerations into the research is crucial for advancing UAV swarm capabilities in capturing high-resolution imagery for various applications.
Keywords: Aerial Photography, Formation Flight, Image Synchronization, Drone Swarm
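Objective 1 above relies on classic flocking rules (cohesion, separation, alignment). A minimal 2-D sketch of one such update step, assuming two or more agents; the rule weights and function name are illustrative, not from the presentation:

```python
def flock_step(positions, velocities, dt=0.1,
               coh=0.05, sep=0.2, ali=0.1, sep_radius=1.0):
    """One boids-style update for a flock of >= 2 agents.
    positions/velocities: lists of (x, y) tuples. Returns updated lists."""
    n = len(positions)
    new_vel = []
    for i in range(n):
        px, py = positions[i]
        vx, vy = velocities[i]
        # Cohesion: steer toward the centroid of the other agents.
        cx = sum(p[0] for j, p in enumerate(positions) if j != i) / (n - 1)
        cy = sum(p[1] for j, p in enumerate(positions) if j != i) / (n - 1)
        ax, ay = coh * (cx - px), coh * (cy - py)
        # Separation: push away from neighbours closer than sep_radius.
        for j, (qx, qy) in enumerate(positions):
            if j == i:
                continue
            dx, dy = px - qx, py - qy
            d2 = dx * dx + dy * dy
            if 0 < d2 < sep_radius ** 2:
                ax += sep * dx / d2
                ay += sep * dy / d2
        # Alignment: match the average velocity of the other agents.
        avx = sum(v[0] for j, v in enumerate(velocities) if j != i) / (n - 1)
        avy = sum(v[1] for j, v in enumerate(velocities) if j != i) / (n - 1)
        ax += ali * (avx - vx)
        ay += ali * (avy - vy)
        new_vel.append((vx + ax, vy + ay))
    new_pos = [(p[0] + v[0] * dt, p[1] + v[1] * dt)
               for p, v in zip(positions, new_vel)]
    return new_pos, new_vel
```

In a real swarm each term would be weighted, velocity-limited, and fed to the flight controller, but the three-rule structure is the core of formation keeping.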
Symposium for Student Innovation, Research, and Creative Activities
Josh Boyd – Presenter
April 21, 2023
at Texas A&M University-Corpus Christi
UAV Swarm-based Synchronous Array Imaging for Coastal Environments
Josh Boyd1,2,4, Dr. Jose Baca2 & Dr. Michael Starek3
1 Department of Computing Sciences, College of Engineering, Texas A&M University – Corpus Christi; 2 Department of Engineering, College of Engineering, Texas A&M University – Corpus Christi; 3 Department of Computing Sciences, College of Engineering, and Conrad Blucher Institute for Surveying & Science, 4 Research and Innovation, Lone Star UAS Center, Texas A&M University – Corpus Christi
Satellites can provide images with a spatial resolution of 30 cm/pixel that cover large remote areas, allowing us to observe large geospatial features. However, if a deeper analysis is required (e.g., measuring pollutants such as trash, chemicals, oil spills, or the shape of smaller waves), then higher resolution images are required. Nowadays, capturing high-resolution images can be achieved by getting closer to the area of interest with better equipment. One option is to use an Unmanned Aerial Vehicle (UAV) equipped with cameras to fly over the area and take a series of pictures with spatial resolutions up to 1 cm/pixel. Thereafter, images can be unified via software to display one single picture containing better information. The outcome is good if the area presents a static environment (i.e., no movement). Merging imagery from coastal environments is challenging due to the continuous movement of ocean waves and gusts of wind. A team of UAVs could capture a synchronized array of images and cover a larger area in a fraction of the time it would take a single UAV. This research aims to develop a strategy for controlling a UAV swarm to navigate a specific area and capture a synchronized array of images. Three overarching goals are proposed: 1) allow a UAV swarm to fly and keep a formation using flocking techniques, 2) increase image detail over a large area by developing and implementing a synchronized image capture technique with multiple UAVs, and 3) provide image geolocation by utilizing the GPS and IMU, along with the Remote ID (RID) from each UAV. RID enables a UAV to broadcast location and identification information for safe flight operations. The Federal Aviation Administration (FAA) requires RID capability for UAVs, effective September 16, 2023, making it necessary to incorporate these considerations into new research.
Keywords: Aerial Photography, Formation Flight, Image Synchronization, Drone Swarm
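The synchronized-capture goal above amounts to agreeing on a common trigger time: since every UAV's GPS receiver shares the same timebase, each vehicle can independently arm its shutter for the next shared epoch. A minimal sketch (function name and parameters are illustrative assumptions, not from the presentation):

```python
import math

def next_capture_epoch(now_s: float, period_s: float = 1.0, lead_s: float = 0.2) -> float:
    """Return the next shared capture time: the first multiple of
    `period_s` (in GPS seconds) at least `lead_s` ahead of `now_s`,
    leaving each camera time to arm before the trigger fires."""
    return math.ceil((now_s + lead_s) / period_s) * period_s
```

Because every UAV rounds the same GPS clock to the same boundary, no inter-vehicle message exchange is needed at capture time; frames line up to within the receivers' timing error.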
Symposium for Student Innovation, Research, and Creative Activities
Josh Boyd – Judge
April 25, 2024
at Texas A&M University-Corpus Christi
Symposium for Student Innovation, Research, and Creative Activities
Josh Boyd – Judge
April 26, 2024
at Texas A&M University-Corpus Christi
Coastal Bend Regional Science Fair (CBRSF)
Josh Boyd - Judge
Feb 9, 2024
at Texas A&M University-Corpus Christi