
Above, Below, and In Between: The Multi-Platform Future of Environmental Data

  • Writer: Dustin Wales
  • Jan 4
  • 7 min read

Updated: Jan 9



The most interesting remote sensing projects we work on don't use a single platform. They use several: drones overhead, autonomous surface vessels on the water, ADCP systems measuring flow, RTK rovers on the ground, each capturing data that the others can't reach. The magic happens not in any individual dataset but in how they combine.


This multi-platform approach isn't new in concept. Researchers have always combined data sources when studying complex environments. What's new is how seamlessly the integration works, how affordable the individual platforms have become, and how the resulting datasets can be delivered in formats that clients can actually use.


The principle is simple: every sensor platform has strengths and blind spots. Drones see the surface but not what's underneath. Sonar sees below water but not above. Ground-based sensors provide precision at points but miss the spatial context. When you combine them thoughtfully, the whole becomes genuinely greater than the sum of parts.


The River Cross-Section Problem

Consider a straightforward question: What does this river channel actually look like? Not just the water surface, not just the banks, but the complete cross-sectional geometry from floodplain to floodplain, including the submerged channel that you can't see from the air.


Traditional approaches attack this in pieces. Surveyors walk the floodplains with RTK rovers. Hydrographers take a boat across the channel with an echo sounder. Someone else wades the shallow margins. The datasets get combined afterward, often imperfectly, with gaps where the methods couldn't reach or couldn't agree.


The integrated approach looks different. An RTK drone with LiDAR captures the terrestrial portions: banks, floodplains, vegetation structure. A small autonomous surface vessel runs the channel with a single-beam or multibeam sonar, producing bathymetric profiles that extend from bank to bank underwater. Where the water is shallow enough for the green laser to penetrate, topo-bathymetric LiDAR can bridge the gap between aerial topography and sonar bathymetry, creating truly seamless surface models that transition from dry land through the waterline to the channel bottom.


A 2025 study on Sweden's Rönne River demonstrated this approach at scale, combining drone-based radar altimetry for water surface elevation, drone-mounted sonar and water-penetrating radar for bathymetry, and LiDAR for the surrounding terrain. The researchers achieved centimetre-level precision across the entire cross-section (air, water surface, water column, and riverbed) using platforms that could be deployed by a small team without specialized marine infrastructure.


The deliverable is a complete elevation model with no artificial boundaries between "land survey" and "hydrographic survey." For flood modeling, habitat assessment, or infrastructure design, this continuity matters enormously. The river doesn't distinguish between above and below waterline, and neither should the data that describes it.


Flow and Form: Adding the ADCP

Geometry is only part of understanding a river. The other part is how water moves through that geometry: velocity profiles, discharge calculations, sediment transport. This is where acoustic Doppler current profilers (ADCPs) enter the picture.


An ADCP measures water velocity at multiple depths simultaneously by bouncing sound waves off suspended particles and measuring the Doppler shift of the returns. Mount one on an autonomous surface vessel, and you can map the three-dimensional velocity structure of an entire reach: not just surface velocity, not just a single point, but the complete flow field.
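The underlying relationship can be sketched in a few lines, assuming a simplified single-beam geometry; the 600 kHz transmit frequency, 20-degree beam angle, and 1500 m/s sound speed below are illustrative values, not specifications of any particular instrument.

```python
# Radial water velocity from the Doppler shift of an ADCP return.
# Simplified single-beam sketch; real ADCPs combine 3-4 angled beams
# to resolve the full 3D velocity vector.

import math

def radial_velocity(f_transmit_hz, doppler_shift_hz, sound_speed_ms=1500.0):
    """Velocity of scatterers along the beam, from the two-way Doppler shift."""
    return sound_speed_ms * doppler_shift_hz / (2.0 * f_transmit_hz)

def horizontal_velocity(v_radial_ms, beam_angle_deg=20.0):
    """Project a slant-beam radial velocity onto the horizontal plane,
    assuming the beam is tilted from vertical and vertical flow is negligible."""
    return v_radial_ms / math.sin(math.radians(beam_angle_deg))

# A 600 kHz ADCP observing a 400 Hz shift on one beam:
v_r = radial_velocity(600e3, 400.0)   # 0.5 m/s along the beam
v_h = horizontal_velocity(v_r)        # horizontal component, ~1.5 m/s
```

The factor of two comes from the two-way travel: the moving scatterers shift the frequency once on receive and once on re-radiation back toward the transducer.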

Aeria team collecting ADCP and sonar data during a rhodamine study.

Now combine that with the geometric data from drone LiDAR and vessel-mounted sonar. You have channel shape and water velocity at every cross-section. You can calculate discharge directly. You can identify where flow accelerates through constrictions, where eddies form behind obstructions, where sediment is likely depositing or eroding. The static geometry becomes a dynamic system.
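Calculating discharge directly from the combined datasets follows the standard midsection idea: sum depth times width times mean velocity over verticals across the channel. A minimal sketch, with made-up station, depth, and velocity numbers for illustration:

```python
# Discharge from paired geometry and velocity data (midsection method).
# Stations come from the merged LiDAR/sonar cross-section; velocities
# from the ADCP. All numbers below are synthetic, for illustration.

def midsection_discharge(stations, depths, velocities):
    """Total discharge (m^3/s) from distances across the channel (m),
    depths (m), and depth-averaged velocities (m/s) at each vertical."""
    q = 0.0
    for i in range(len(stations)):
        left = stations[i - 1] if i > 0 else stations[0]
        right = stations[i + 1] if i < len(stations) - 1 else stations[-1]
        width = (right - left) / 2.0   # half-distance to each neighbouring vertical
        q += depths[i] * width * velocities[i]
    return q

stations = [0.0, 2.0, 4.0, 6.0, 8.0]    # m from left bank
depths = [0.0, 1.2, 1.8, 1.1, 0.0]      # m
velocities = [0.0, 0.6, 0.9, 0.5, 0.0]  # m/s, depth-averaged
Q = midsection_discharge(stations, depths, velocities)  # 5.78 m^3/s
```

With a full ADCP transect the same sum runs over hundreds of ensembles rather than a handful of verticals, but the arithmetic is identical.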


For salmon habitat assessment, which we do regularly, this integration is transformative. Spawning suitability depends on velocity as much as substrate. Juvenile rearing depends on hydraulic complexity, the pools and eddies where young fish can hold position without exhausting themselves. You can't assess habitat quality from topography alone, and you can't assess it from velocity measurements without knowing the geometry that creates those velocities.


The Coastal Zone Challenge

Coastal and nearshore environments present the integration challenge in its most extreme form. The waterline moves with every tide. The interface between "land" and "water" is a zone, not a line. And the most ecologically and economically important areas are often the shallowest, exactly where traditional survey methods struggle most.


A ship-mounted multibeam sonar can map the seafloor in detail, but it needs several metres of water under the keel. Aerial LiDAR can map the beach and upland perfectly, but the infrared laser reflects off the water surface without penetrating. Between the two lies a gap—the intertidal and shallow subtidal zones that are arguably the most dynamic and important parts of the system.


Multi-platform approaches fill this gap from both directions. Bathymetric LiDAR, using green wavelengths that penetrate clear water, can map from the shore down to depths of 2-3 Secchi depths, roughly 10-25 metres in favourable conditions. Small autonomous surface vessels can operate in water as shallow as 30 centimetres, extending sonar coverage far closer to shore than crewed vessels can safely work.


The overlap zone is crucial. Where bathymetric LiDAR and sonar data both exist, you can cross-validate accuracy and calibrate between systems. Where photogrammetric depth estimates (derived from water colour in aerial imagery) overlap with sonar ground truth, you can extend shallow-water coverage to areas where neither LiDAR penetration nor vessel access is practical.
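One common way to extend imagery-derived depths with sonar ground truth is a band-ratio model: in clear, shallow water, depth varies roughly linearly with the ratio of log blue to log green reflectance, and the overlap with sonar supplies the calibration points. A sketch of that calibration, with entirely synthetic reflectance and depth values:

```python
# Calibrating image-derived depth against sonar ground truth using a
# band-ratio model. Reflectances and sonar depths below are synthetic,
# for illustration only; real calibration uses many overlap points.

import math

def band_ratio(blue, green, n=1000.0):
    """Log-ratio of scaled blue and green reflectance."""
    return math.log(n * blue) / math.log(n * green)

def fit_depth_model(blues, greens, sonar_depths):
    """Least-squares fit of depth = m * ratio + b against sonar depths."""
    xs = [band_ratio(b, g) for b, g in zip(blues, greens)]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(sonar_depths) / n
    m = sum((x - mx) * (y - my) for x, y in zip(xs, sonar_depths)) / \
        sum((x - mx) ** 2 for x in xs)
    b = my - m * mx
    return m, b

# Synthetic calibration points where imagery and sonar overlap:
blues = [0.020, 0.032, 0.041, 0.055]
greens = [0.050, 0.048, 0.052, 0.049]
sonar = [1.1, 2.0, 2.6, 3.8]          # m, from the USV echo sounder
m, b = fit_depth_model(blues, greens, sonar)

# Predict depth at a pixel the sonar never reached:
depth_est = m * band_ratio(0.036, 0.050) + b
```

Once fitted, the model extrapolates depth across every pixel of imagery, with the sonar overlap zone continuing to serve as the accuracy check.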


Research teams have demonstrated this integration meeting International Hydrographic Organization Special Order standards (horizontal position error under 2 metres, vertical error under 25 centimetres) using combinations of drone imagery, USV sonar, and RTK ground control. The accuracy standards that govern nautical charting can now be met with deployable platforms that fit in a pickup truck.


The Autonomous Surface Vessel Revolution

Much of this integration depends on autonomous surface vessels (ASVs or USVs), and these platforms have matured remarkably in the past few years. Modern survey USVs are essentially floating sensor platforms: small, portable, battery-powered, capable of following pre-programmed survey lines with centimetre-level positioning.


The smallest units, like OceanAlpha's SL20 or Maritime Robotics' Otter, can be launched by hand from a riverbank or beach. They draw 10-30 centimetres, letting them survey waters where kayaks struggle. They carry single-beam or multibeam sonar, ADCP instruments, water quality sensors, or whatever else the mission requires. And critically, they share positioning infrastructure with the drones operating overhead, ensuring that data from both platforms align in the same coordinate system without post-processing gymnastics.


The operational model is remarkably efficient. While a drone surveys the terrestrial portions of a site, a USV runs parallel lines through the water. Both record data continuously. Both use RTK correction from the same base station or CORS network. When processing finishes, the datasets merge into a single surface model that crosses the waterline without discontinuity.


NOAA has recognized this convergence, incorporating USVs into their hydrographic survey fleet and testing dual-USV operations for increased coverage. The same thinking that makes drone surveys more efficient than crewed aircraft applies to surface vessels: smaller platforms, lower operating costs, deployable by smaller crews, able to access sites that larger vessels cannot reach.


Ground-Based Integration

Not everything can be measured from moving platforms. Some applications require direct contact with the surface: soil sampling, infiltration testing, groundwater monitoring. Others require persistent measurement over time: stream gauges, weather stations, water quality sondes. These ground-based systems form the third leg of the integration stool.


The key is that ground-based measurements now share the same positioning infrastructure and data frameworks as remote sensing platforms. An RTK rover collecting soil samples uses the same correction network as the drone overhead. A water quality sonde transmits data to the same cloud platform that processes drone imagery. The measurements aren't isolated points anymore; they're located precisely within the spatial framework created by the aerial and surface surveys.

GNSS data collection for photogrammetry data correction.

For environmental monitoring, this integration enables new kinds of analysis. You can correlate water quality measurements with upstream land use visible in drone imagery. You can relate soil chemistry to vegetation health mapped from multispectral cameras. You can track how conditions at a monitoring station relate to conditions across the broader landscape. The point measurements gain context, and the remote sensing gains ground truth.


The Data Integration Challenge

The technical challenge in multi-platform work isn't usually data collection; it's data integration. Each sensor produces data in its native format, at its native resolution, with its native coordinate system and timestamp convention. Making these datasets talk to each other requires careful attention to:


Coordinate reference. All platforms must work in the same geodetic datum, projection, and vertical reference. For projects spanning land and water, this can be surprisingly complex. Terrestrial surveys often use orthometric heights (related to mean sea level), while hydrographic surveys use chart datum (related to low water). Reconciling these requires explicit transformation.
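The transformation itself is simple once the datum separation is known; finding that separation is the hard part. A minimal sketch, assuming the geoid and local mean sea level coincide and using a hypothetical 1.9 m chart-datum offset (in practice this value comes from a separation model or local tide-gauge analysis):

```python
# Reconciling vertical references: terrestrial heights above the geoid
# (orthometric) vs hydrographic depths below chart datum. The offset
# below is hypothetical and site-specific.

CHART_DATUM_BELOW_MSL = 1.9   # m, hypothetical low-water offset at this site

def orthometric_to_chart(h_orthometric_m):
    """Height above chart datum for an orthometric (MSL-referenced)
    elevation, assuming the geoid and local MSL coincide."""
    return h_orthometric_m + CHART_DATUM_BELOW_MSL

def sounding_to_orthometric(depth_below_chart_m):
    """Orthometric elevation of a charted sounding (depth positive down)."""
    return -(CHART_DATUM_BELOW_MSL + depth_below_chart_m)

# A riverbed point charted at 3.0 m depth sits at -4.9 m orthometric:
bed = sounding_to_orthometric(3.0)
```

Applying this consistently is what lets a sonar sounding and a LiDAR return land on the same vertical axis in the merged surface model.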


Temporal alignment. In dynamic environments, conditions change between surveys. Water levels fluctuate. Vegetation grows. Sediment moves. Integrating datasets collected hours or days apart requires understanding what changed and accounting for it.


Accuracy reconciliation. Different sensors have different accuracy characteristics. Drone photogrammetry might achieve 2-3 cm vertical accuracy; bathymetric LiDAR might achieve 10-15 cm; photogrammetric depth estimation might achieve 30-50 cm. Where datasets overlap, which takes precedence? How do you blend surfaces with different error characteristics without creating artifacts?
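One defensible answer is to let neither dataset simply win: weight each overlapping estimate by the inverse of its error variance, so the more precise sensor dominates without the other being discarded. A sketch using the illustrative accuracy figures above:

```python
# Blending overlapping elevation estimates with different accuracies
# via inverse-variance weighting. Sigmas are illustrative 1-sigma
# accuracies, not specifications of any particular sensor.

def blend(estimates_and_sigmas):
    """Inverse-variance weighted mean of (value, 1-sigma error) pairs.
    Returns the blended value and its combined 1-sigma uncertainty."""
    weights = [1.0 / (s * s) for _, s in estimates_and_sigmas]
    total = sum(weights)
    value = sum(w * v for w, (v, _) in zip(weights, estimates_and_sigmas)) / total
    return value, (1.0 / total) ** 0.5

# Same point: photogrammetry says 12.40 m (3 cm sigma),
# bathymetric LiDAR says 12.55 m (12 cm sigma).
z, sigma = blend([(12.40, 0.03), (12.55, 0.12)])
# Blended estimate sits close to the tighter photogrammetric value,
# and the combined uncertainty is smaller than either input's alone.
```

Applied per grid cell across an overlap zone, with weights feathered toward each dataset's exclusive coverage, this avoids the visible seams that hard cutlines between surfaces tend to produce.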


Format compatibility. Point clouds, rasters, vectors, and time series each have appropriate uses and limitations. Choosing the right format for integrated deliverables, and providing data in formats that clients' software can actually use, is as important as the collection itself.


Getting this integration right requires understanding both the technology and the science. You need to know what each sensor actually measures, how its errors behave, and what assumptions are embedded in its processing. This is where expertise matters: not just in operating equipment, but in understanding what the data mean and how they can be combined responsibly.


What Integrated Data Enables

The point of all this integration isn't technical elegance; it's answering questions that couldn't be answered before.


How does this restoration project change habitat conditions from headwater to estuary? You need continuous data across the entire gradient (terrestrial, riverine, and marine) to answer that.


What's the actual flood risk to this infrastructure? You need topography and bathymetry in the same model, with flow data to drive simulations, to answer that.


How is sediment moving through this system? You need repeat surveys of both the channel and the floodplain, with velocity data to understand transport capacity, to answer that.


What's the baseline condition before this development project, and how will we know if impacts occur? You need comprehensive multi-sensor documentation that can serve as a defensible reference, the kind of dataset that stands up to scrutiny because it captures the system as a whole, not just the parts that were easy to measure.


The platforms are tools. The sensors are tools. The integration is where the value emerges: in creating an understanding of complex systems that don't respect the boundaries between air, land, and water.


---


Aeria Solutions operates across all three domains—aerial drones, surface vessels, and ground-based systems. Our work often integrates data from multiple platforms into unified deliverables for clients in resource development, environmental consulting, and Indigenous-led monitoring programs. The technology is impressive; the integration is where we add value.


