Lessons learnt from Beyond the Horizon: a remotely operated field campaign
– By Muriel Dunn and Martin Ludvigsen –
During the first week of June 2024, the field group (3 ocean-going robots and 3 engineers) set off to the beautiful Mausund field station (off the coast of Norway, Figure 1) for a week of field work. The rest of us, the control room group (students, researchers and vehicle operators), stayed behind in Trondheim and spent the week surrounded by screens in the control room of the Trondheim Biological Station (Figure 2).
The main project goal was to test methods to detect swarms of Calanus, a zooplankton species with an emerging lower trophic level fishery, as part of SFI Harvest, a Center for Research-based Innovation. Calanus swarms in dense aggregations that can be seen from space, presenting a large biomass of interest for harvesting. However, traditional fishing methods are not suited to locating and harvesting these patches of Calanus. As part of the SFI Harvest characterization research objective, we wanted to develop survey technologies for autonomous detection of zooplankton swarms using active acoustics, and for characterization of the species using optics (the SINTEF silhouette camera) to confirm whether it is Calanus.
Another important goal of the campaign was to push the limits of the autonomous and remotely operated ocean-going robots by making them collaborate with each other beyond the line of sight of the pilots. The purpose of collaborative robotics is to enable complementary data collection strategies. By connecting the vehicles to a common virtual communication network, they could be operated jointly from a remote control room, enhancing our ability to let the robots collaborate, while improved coordination between the operators increases the efficiency of the operations. By running a remote control room, discipline experts could be included in real time without the logistics of bringing them to the site.
The three robots used for the missions were Grethe, Thor and the Autonaut. Grethe is an uncrewed surface vehicle (USV) with an electric engine (Mariner by Maritime Robotics, https://www.maritimerobotics.com/mariner) equipped with echosounders (Kongsberg Discovery AS, www.kongsberg.com), AIS and cameras. Thor is a light autonomous underwater vehicle (L-AUV; by OceanScan, https://www.oceanscan-mst.com/) equipped with a CTD, an echosounder and an optical camera. The Autonaut is an autonomous surface vehicle (ASV) powered by wave and solar energy (Autonaut by MOST, https://autonaut.itk.ntnu.no/doku.php).
We started the field campaign with mobilisation (for the field group, the robots; for the control room group, the monitoring and communication systems) and modular testing of robot and sensor functionality and remote operations. During this period, we worked on getting used to piloting the robots “beyond the horizon”, i.e. beyond the line of sight of the pilots. The engineers were onsite to deploy, recover and monitor the robots and make sure they were operating as they should, while the piloting was done from the comfort of the control room in Trondheim.
The crux of the campaign was the collaborative missions between Grethe, the uncrewed boat, and Thor, the underwater vehicle (Figure 3). Grethe scanned the study region with the echosounder for areas of high zooplankton abundance. An artificial intelligence algorithm based on a clustering technique ran in real time, assessing the echosounder data for near-surface zooplankton aggregations. Once Grethe detected a zooplankton aggregation in the acoustic data, it sent the location and depth to Thor, which was waiting in the study region. Upon receiving the location and depth of interest, Thor dove and travelled to the GPS point without interference from the pilots. There, Thor collected images of the zooplankton within the aggregation to identify and size the individuals within the layers.
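To illustrate the kind of detect-then-dispatch logic described above, here is a minimal, hypothetical Python sketch. The data layout, thresholds and function names are ours for illustration only, not the project's actual clustering algorithm: it simply groups contiguous high-backscatter cells in a toy echogram and reports the target depth that would be handed to the waiting AUV.

```python
# Hypothetical sketch (not the campaign's actual algorithm): find a
# near-surface zooplankton aggregation by clustering contiguous
# high-backscatter cells in an echogram, then report its mean depth.

def detect_aggregation(sv, depths, threshold_db=-70.0, min_cells=4):
    """Return (mean_depth, ping_indices) for the largest cluster of
    4-connected cells whose volume backscattering strength Sv (dB)
    exceeds threshold_db, or None if no cluster has min_cells cells.
    sv: list of pings, each a list of Sv values per depth bin."""
    strong = {(p, d) for p, ping in enumerate(sv)
              for d, v in enumerate(ping) if v > threshold_db}
    clusters, seen = [], set()
    for cell in strong:  # flood-fill each unvisited strong cell
        if cell in seen:
            continue
        stack, cluster = [cell], []
        while stack:
            c = stack.pop()
            if c in seen or c not in strong:
                continue
            seen.add(c)
            cluster.append(c)
            p, d = c
            stack += [(p + 1, d), (p - 1, d), (p, d + 1), (p, d - 1)]
        if len(cluster) >= min_cells:
            clusters.append(cluster)
    if not clusters:
        return None
    best = max(clusters, key=len)
    mean_depth = sum(depths[d] for _, d in best) / len(best)
    return mean_depth, sorted({p for p, _ in best})

# Toy echogram: 6 pings x 5 depth bins, -90 dB background with a
# stronger (-60 dB) near-surface patch in pings 2-4.
depths = [2.0, 4.0, 6.0, 8.0, 10.0]
sv = [[-90.0] * 5 for _ in range(6)]
for p in (2, 3, 4):
    for d in (0, 1):
        sv[p][d] = -60.0

result = detect_aggregation(sv, depths)
if result is not None:
    depth, pings = result
    # In the real system the target would be sent to the AUV over an
    # acoustic or network link; here we just print it.
    print(f"target depth {depth:.1f} m at pings {pings}")
```

Run on the toy echogram, the sketch reports a target at 3.0 m depth spanning pings 2-4; in the campaign, an analogous message carried the location and depth from Grethe to Thor.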
Building on this innovative robotic collaboration, we can leverage the strengths of autonomous vehicles and platforms, as well as their sensors, to gain a better understanding of the oceans and their ecosystems. In this case, we hope to develop better methods for stock assessment of lower trophic level species and a more sustainable fishery.
Finally, here are 5 lessons learnt during the Beyond the Horizon field campaign:
- Check the sensors.
Make sure they are working, collecting and in good state. It is easy to focus on the robots and their deployment needs because of the complexity and level of detail required, but collecting high-quality data from the sensors is what the results and analysis depend on.
- Trust the robots and the field crew.
The robots are built to be robust and need to be allowed to be used to their full potential. We have to accept some level of risk to reach new levels of innovation.
- Communication really is key.
Communication happens on many levels in this type of campaign: between the robots, between the pilots and the robots, between the field group and the control room group, and within the groups. Maintaining clear lines of communication is aided by technologies such as Teams, Starlink, Neptus, acoustic modems and USBL, which are continually improving in speed and reliability. Still, communication technology is the innovation space that can make the most difference in reducing risk during remotely operated collaborative missions. Direct communication from the remote USV operators to boats nearby would further improve communication and reduce risk.
- Field work FOMO (fear of missing out) is outdated.
Though it would have been great to have a week together at the Mausund field station, there are many benefits to a remote campaign. Sitting in a room without distractions, with several monitors showing all the information you need (AIS, cameras, real-time data streaming, weather, control monitors with battery status, etc.), allows the vehicle operators and researchers to make informed decisions about the scientific missions based on the information they are receiving. It is also a more sustainable and flexible approach to field work.
- Situational awareness matters.
A remote control room has clear disadvantages for understanding the situation at sea: onboard the vessel you feel the wind and waves on your body, have a literal 360-degree view and hear the sounds around you. However, the cognitive capacity available in a control room is considerably higher than onboard a wavy boat, and information streams such as AIS, traffic information and meteorology are more easily taken in there. To ensure good situational awareness in the control room, you can hardly stream too many cameras, and radar should also be used to complement the optical imagery.
Funded by SFI Harvest, with contributions from Ahmed Abdelgayed, André Olaisen, Emily Venables, Halvar Gravråkmo, Karoline Barstein, Kay Arne Skarpnes, Leonard Günzel, Leif Gimsmo, Martin Bredesen, Pavel Skipenes, Rabea Rogge, Ralph Stevenson-Jones, Robert Staven and Torstein Nordgård.
Muriel Dunn is a Research Scientist at SINTEF Ocean in Trondheim, Norway, where she works in operational oceanography with the OceanLab observatory, focusing on acoustics and data fusion. She earned a B.Sc. in Physics and Physical Oceanography from the University of British Columbia, an M.Sc. in Physical Oceanography from Memorial University of Newfoundland (MUN) and a Ph.D. in Fisheries Sciences from the Fisheries and Marine Institute at MUN.
Martin Ludvigsen is a professor and head of the Applied Underwater Robotics Laboratory (AUR-Lab) at the Department of Marine Technology, NTNU. His key research interests include autonomy, machine vision and machine learning, navigation, control, and the design and operation of underwater vehicles and instrumentation. He is professor II in marine technology at both the University Centre in Svalbard (UNIS) in Longyearbyen and UiT in Tromsø, where he works with Arctic marine measurement techniques, operations and transport. He is also associated with the Centre for Autonomous Marine Operations and Systems (NTNU AMOS).
autonomous sampling, Martin Ludvigsen, Mausund, Muriel Dunn, ocean-going robots, optics, zooplankton