The specific contribution, featured on the cover of Science Robotics, concerns the mapping of the water column in this ecosystem using robotic platforms. The water column is important because it is where the basis of the marine (and human) food web is situated. This basis arises from the fact that phytoplankton is produced here in vast quantities, fuelled by nutrients and sunlight. The phytoplankton is eaten by zooplankton, which is eaten by fish, which in turn is the main food source for the birds. Mapping phytoplankton is therefore important because it tells us what we are starting with (it is, in a sense, where the energy input from the sun gets converted into edible food), and if anything changes here, it will eventually be felt by the birds.
To capture the right level of detail in this complex system, we needed to cover a range of scales. Our engineering and science team collected large-scale data from satellites and buoys and combined them with more detailed, local data from ships, cameras, and autonomous robotic platforms. Specifically, we used a type of robot called an AUV, which stands for autonomous underwater vehicle. The AUV looks like a torpedo, but it only carries a payload of different environmental sensors (you can see pictures further down on this page). The AUV moves fast and can be programmed to “think for itself” using software algorithms that we developed, so that it can make decisions based on what it observes during the mission. This way we utilize all the information available to us.

It is important to note that mapping phytoplankton is very difficult, as it is very heterogeneously distributed; we call this heterogeneity "patchiness". The paper presents a way to map this patchiness effectively in three dimensions using a statistical model of the plankton distribution that is updated during the mission, and on which the subsequent data collection is planned online inside the robot's own computer (usually referred to as adaptive sampling). The robot is not remotely controlled, but makes these choices online on its own (autonomy). Radio waves cannot propagate far under water, only a couple of metres depending on signal strength, so the vehicle has to be able to operate on its own (which is why these vehicles are called autonomous). We can, however, follow its position acoustically and send simple commands such as "stop, come to the surface" or queries such as "what is the distance from you to the boat?".
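To give a flavour of what adaptive sampling means in practice, here is a minimal sketch, not the actual algorithm from the paper. It assumes a toy 1-D transect of candidate waypoints, a made-up "true" patchy field, and a simple Gaussian update; the real system plans in 3-D with a full spatial statistical model. The idea it illustrates is the same: keep an estimate and an uncertainty for each location, sample where uncertainty is highest, and update the model with each new measurement.

```python
import random

random.seed(42)

# Hypothetical 1-D transect of candidate waypoints (the real system is 3-D).
N = 10
mean = [0.5] * N        # prior chlorophyll estimate at each waypoint
var = [1.0] * N         # prior uncertainty at each waypoint
noise_var = 0.1         # assumed sensor noise variance

def true_field(i):
    # Stand-in for the unknown patchy phytoplankton field: one "patch".
    return 1.0 if 3 <= i <= 5 else 0.2

def next_waypoint(var):
    # Adaptive choice: steer towards the most uncertain location.
    return max(range(len(var)), key=lambda i: var[i])

for _ in range(5):                      # five sampling steps of the "mission"
    i = next_waypoint(var)
    y = true_field(i) + random.gauss(0, noise_var ** 0.5)
    # Conjugate Gaussian update of the local estimate.
    k = var[i] / (var[i] + noise_var)   # gain: how much to trust the new data
    mean[i] += k * (y - mean[i])
    var[i] *= (1 - k)
```

After the loop, the variance has shrunk exactly at the visited waypoints, so the next waypoint choice automatically moves on to unvisited, still-uncertain locations; that feedback from model to plan is the essence of adaptive sampling.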
The robot's goal is to optimize its route so as to give the scientists the best possible map of the amount and distribution of phytoplankton, while they carry out parallel sampling with other instruments on a nearby research vessel. One of these instruments is actually a camera that detects and counts individual plankton particles, telling us something about the creatures that live inside this patchiness.
Link to the paper here: http://robotics.sciencemag.org/content/4/27/eaav3041