Sensor fusion transforms aerial mapping

How integrated sensors deliver comprehensive geospatial intelligence

As demand for comprehensive geospatial intelligence grows across sectors from agriculture and forestry to infrastructure management and urban planning, sensor fusion is emerging as the key enabler of more efficient, accurate and actionable aerial mapping missions. The technique represents a fundamental shift from single-sensor operations towards integrated sensing platforms that leverage the combined strengths of multiple technologies.

When a survey aircraft flies over Italy’s ancient olive groves, multiple sensors working in perfect harmony can detect a deadly bacterial infection weeks before any farmer notices the first signs of disease. This is just one example of how the convergence of multiple sensing technologies – known as sensor fusion – is transforming aerial surveying from single-purpose data collection into comprehensive environmental intelligence gathering.

The evolution of multi-sensor platforms

Over the past decade, the concept of sensor fusion in aerial surveying has evolved from experimental research to operational reality. While early missions typically deployed single sensors optimized for specific applications – Lidar for topographic mapping, RGB cameras for photogrammetry, thermal imagers for specialized inspections – modern platforms increasingly integrate multiple sensors to capture comprehensive datasets in single-flight operations.

This evolution reflects both technological advancement and market demand. Miniaturization of sensors, improved data processing capabilities and declining operational costs have made multi-sensor configurations technically feasible and economically viable. Simultaneously, end users across industries require richer, more contextual geospatial information to support complex decision-making processes.

Crewed aerial platforms offer distinct advantages for sensor fusion applications. Unlike uncrewed systems, crewed aircraft can accommodate multiple high-end sensors simultaneously, provide extended flight endurance for large-area coverage, and offer real-time operator oversight of data collection quality. These capabilities prove essential for missions requiring precise coordination between multiple sensing systems.

In September 2022, hyperspectral images were acquired in three bands (R: 695nm, G: 515nm and B: 410nm) in the Apulia region of Italy. A mask of ‘infected’ (yellow) and ‘not infected’ (red) tree crowns was overlaid on the RGB images to show the spatial distribution of the trees in each dataset: Gorgognolo (b) and Polignano (c). (Image courtesy: D’Addabbo et al.)

Core sensor technologies and their synergies

Modern sensor fusion configurations typically integrate several complementary technologies, each contributing unique capabilities to the overall dataset:

  • Lidar systems provide the geometric foundation for most fusion applications. These sensors emit laser pulses and measure return times to generate precise 3D point clouds of terrain and surface features. Lidar excels in areas where optical imagery faces limitations – such as dense vegetation canopies – and delivers consistent geometric accuracy regardless of lighting conditions. When combined with other sensors, Lidar data serves as the spatial framework for registering and interpreting complementary information.
  • RGB cameras capture high-resolution true-colour imagery, forming the visual foundation of most surveys. They provide the reference dataset for interpretation, mapping and visualization that end users readily understand.
  • Multispectral sensors expand on RGB by adding near-infrared (NIR), enabling vegetation health analysis, species differentiation and environmental monitoring. NIR reflectance reveals plant stress and biomass data essential for agricultural applications (a minimal NDVI sketch follows this list).
  • Hyperspectral sensors collect reflectance data across hundreds of narrow spectral bands, enabling precise material identification for mineral exploration, precision agriculture and water quality assessment. These complex datasets typically require georectification to spatial frameworks provided by Lidar or multispectral imagery.
  • Thermal cameras detect heat signatures emitted from surfaces, making them ideal for energy efficiency audits, infrastructure inspections and environmental monitoring. Thermal sensors operate effectively in low-light conditions and can reveal subsurface conditions invisible to optical sensors. When fused with RGB imagery and Lidar data, thermal information provides both spatial context and interpretive depth for anomaly detection.
  • Bathymetric Lidar uses green-wavelength laser pulses to penetrate water and measure depths, enabling accurate mapping of shallow coastal zones, riverbeds and inland water bodies. When integrated with topographic Lidar and imagery, bathymetric data supports seamless terrain modelling across land-water interfaces, benefiting flood modelling, habitat monitoring and infrastructure planning in aquatic environments.
  • Synthetic aperture radar (SAR) offers unique capabilities for fusion applications, particularly its ability to penetrate clouds and operate in all weather conditions. Though less common on crewed platforms due to payload constraints, SAR contributes valuable information about surface conditions and ground movement when integrated with optical and Lidar datasets.
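To make the multispectral point concrete, the following minimal Python sketch computes the widely used normalized difference vegetation index (NDVI) from co-registered red and NIR bands. The reflectance values are invented for illustration; operational workflows add radiometric calibration and cloud/shadow masking steps.

```python
import numpy as np

def ndvi(red: np.ndarray, nir: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index from co-registered bands.

    Values near +1 indicate dense, healthy vegetation; values near 0
    indicate bare soil; negative values typically indicate water.
    """
    red = red.astype(np.float64)
    nir = nir.astype(np.float64)
    denom = nir + red
    # Avoid division by zero over pixels with no reflectance at all.
    return np.divide(nir - red, denom, out=np.zeros_like(denom), where=denom != 0)

# Illustrative 2x2 scene: healthy canopy, stressed canopy, bare soil, water.
red = np.array([[0.05, 0.15], [0.30, 0.10]])
nir = np.array([[0.60, 0.30], [0.35, 0.05]])
print(ndvi(red, nir))
```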

How does the magic happen?

Successful sensor fusion requires precise coordination of multiple data streams with different temporal, spatial and spectral characteristics. At the core of any fusion process lies synchronization, ensuring that data from different sensors can be accurately aligned and integrated (a sketch of the temporal case follows this list):

  • Temporal synchronization uses precise timestamps to align data captured simultaneously during flight operations. Modern systems typically employ GPS time references to coordinate sensor triggering across platforms.
  • Spatial alignment requires careful calibration of sensor positions and orientations relative to the aircraft coordinate system. This boresight calibration process ensures that data from different sensors can be accurately co-registered in geographic space.
  • Georeferencing systems, particularly integrated GNSS/IMU platforms, provide the common spatial reference frame that enables fusion of datasets from multiple sensors. These systems deliver precise position and orientation information that transforms sensor measurements into georeferenced products.
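As a simplified illustration of temporal synchronization, the sketch below interpolates a GNSS/IMU trajectory to the mid-exposure timestamps of a camera on a shared GPS time base. All rates, coordinates and field layouts are assumptions for demonstration; real systems also interpolate orientation (typically via quaternions) and apply lever-arm offsets.

```python
import numpy as np

# GNSS/IMU trajectory sampled at 200 Hz (GPS seconds, metres) - synthetic.
traj_t = np.arange(0.0, 10.0, 1.0 / 200.0)
traj_xyz = np.column_stack([
    1000.0 + 60.0 * traj_t,          # easting: ~60 m/s ground speed
    5000.0 + 0.5 * np.sin(traj_t),   # northing: slight drift
    np.full_like(traj_t, 620.0),     # altitude held constant
])

# Camera mid-exposure timestamps at 2 Hz, on the same GPS time base.
cam_t = np.arange(0.25, 10.0, 0.5)

# Interpolate each trajectory component to the exposure instants,
# giving every image a platform position for later georeferencing.
cam_xyz = np.column_stack([
    np.interp(cam_t, traj_t, traj_xyz[:, i]) for i in range(3)
])

for t, (e, n, h) in zip(cam_t[:3], cam_xyz[:3]):
    print(f"t={t:6.2f}s  E={e:9.2f}  N={n:8.2f}  H={h:6.1f}")
```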

Processing workflows typically involve several stages of data integration. Point cloud-texture fusion overlays RGB or multispectral imagery onto 3D Lidar models, creating visually rich and geometrically accurate representations. Orthorectification processes use Lidar-derived elevation data to correct geometric distortions in imagery, ensuring accurate spatial registration across datasets.
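A minimal sketch of the point cloud-texture fusion step, under the simplifying assumption that the imagery is already orthorectified and georeferenced on the same datum as the Lidar data: each point is coloured by a direct world-to-pixel lookup. The arrays and geotransform values are synthetic.

```python
import numpy as np

def colorize_points(points_xyz, ortho_rgb, origin_en, pixel_size):
    """Attach RGB values from an orthophoto to Lidar points.

    points_xyz : (N, 3) array of easting, northing, height.
    ortho_rgb  : (rows, cols, 3) image, row 0 at the northern edge.
    origin_en  : (easting, northing) of the image's upper-left corner.
    pixel_size : ground sampling distance in metres.
    """
    col = ((points_xyz[:, 0] - origin_en[0]) / pixel_size).astype(int)
    row = ((origin_en[1] - points_xyz[:, 1]) / pixel_size).astype(int)
    # Keep only points that fall inside the image footprint.
    ok = (row >= 0) & (row < ortho_rgb.shape[0]) & \
         (col >= 0) & (col < ortho_rgb.shape[1])
    rgb = np.zeros((len(points_xyz), 3), dtype=ortho_rgb.dtype)
    rgb[ok] = ortho_rgb[row[ok], col[ok]]
    return np.hstack([points_xyz, rgb]), ok

# Synthetic 100 x 100 orthophoto at 2.5cm GSD and two sample points.
ortho = np.random.randint(0, 256, size=(100, 100, 3), dtype=np.uint8)
pts = np.array([[500.10, 4200.05, 12.3], [500.90, 4199.20, 12.1]])
colored, inside = colorize_points(pts, ortho, origin_en=(500.0, 4200.1),
                                  pixel_size=0.025)
print(colored)
```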

Advanced processing techniques increasingly employ artificial intelligence and machine learning algorithms to extract meaningful information from fused datasets. These tools can identify patterns and relationships across different sensor types that would be difficult to detect through manual analysis.
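As a hedged illustration of such cross-sensor learning, this sketch trains a random forest on per-pixel feature vectors stacking canopy height, NDVI and surface temperature. The data are synthetic and the feature set is an assumption, not any vendor’s workflow.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 2000

# Synthetic fused features per pixel: height (m), NDVI, temperature (deg C).
healthy = np.column_stack([rng.normal(6, 1, n),
                           rng.normal(0.75, 0.05, n),
                           rng.normal(24, 1, n)])
stressed = np.column_stack([rng.normal(5, 1, n),
                            rng.normal(0.45, 0.08, n),
                            rng.normal(28, 1.5, n)])
X = np.vstack([healthy, stressed])
y = np.array([0] * n + [1] * n)  # 0 = healthy, 1 = stressed

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```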

High-resolution nadir imagery from the Phase One GS120 camera (1.9cm GSD), detailed enough to reveal pavement cracks and surface defects (Image courtesy: Phase One)

Sensor fusion in action: real-world applications

Case 1: Airborne sensor fusion technology protects Italy’s olive heritage

The Mediterranean basin is one of the world’s most important regions for olive oil production. Countries like Italy, Spain and Greece are among the top producers of this precious commodity. Olive trees have been cultivated in this region for thousands of years, and the Mediterranean climate and soil conditions are ideal for the growth and development of these ancient trees. The volume and price of olive oil vary widely from one year to the next, and recent low production has led to a sharp increase in prices and in export volumes. Some of this variation in production is due to an invisible threat: Xylella fastidiosa, a bacterial pathogen that devastates trees before visual symptoms appear.

Via the REDoX project, the Italian Consiglio Nazionale delle Ricerche (CNR) – Istituto per il Rilevamento Elettromagnetico dell’Ambiente developed an innovative solution using airborne sensor fusion technology to detect infections in their earliest stages. CNR utilized airborne sensors from Itres Research Ltd and deployed a crewed aircraft equipped with complementary sensing technologies. The CASI-1500 hyperspectral sensor captures data across 288 spectral bands in visible and near-infrared wavelengths, while the MICRO TABI 640 thermal imager operates in the 3.7-4.8µm range for canopy temperature analysis. Flying at an optimal altitude, the integrated sensor system delivers 50 × 50cm ground resolution imagery, enabling individual tree analysis across hundreds of hectares per flight mission.

Processing teams extract 56 vegetation indices from the hyperspectral data, which together with thermal stress indicators yield 62 variables per tree. Advanced radiative transfer models translate the spectral measurements into comprehensive physiological profiles. The combined dataset feeds Support Vector Machine classifiers trained on ground-truth data from qPCR testing. Classification accuracies exceed 74%, successfully identifying asymptomatic infections in apparently healthy trees.
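The classifier stage of such a pipeline can be sketched as follows, with synthetic stand-ins for the 62-variable feature table and the qPCR labels; the actual REDoX features, models and tuning are of course far more elaborate.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
n_trees, n_features = 400, 62   # 62 variables per tree, as in the article

# Synthetic stand-in for the real per-tree feature table
# (vegetation indices plus thermal stress indicators).
X = rng.normal(size=(n_trees, n_features))
y = rng.integers(0, 2, size=n_trees)   # qPCR label: 1 = infected
X[y == 1, :10] += 0.8                  # infected trees shift some indices

# Feature scaling matters for SVMs; an RBF kernel is a common default.
svm = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(svm, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```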

This early detection capability transforms disease management from reactive to proactive intervention. The project incorporates diverse geographic locations and olive cultivars to develop robust detection models applicable across Mediterranean regions. This scalability proves essential for addressing threats that transcend boundaries, contributing to efficient agricultural management and preservation of the area’s irreplaceable olive heritage.

Case 2: Hybrid sensor fusion for corridor and pavement mapping in the USA

In early 2024, a corridor mapping project was conducted in Oro Valley, Arizona, to assess road surface conditions and pinpoint areas in need of maintenance. The survey was performed by consultancy McKim & Creed using a hybrid sensor setup housed in the Aispeco Helix LITE pod aboard a crewed aircraft. The survey combined several advanced sensors to create a unified dataset. At its core was the Phase One GS120 camera, capturing nadir imagery at a 1.9cm ground sampling distance (GSD), which was sharp enough to detect surface-level issues such as pavement cracks. This was complemented by a RIEGL VUX-240 Lidar scanner, delivering precise 3D point cloud data, and an Applanix AP+50 GNSS/IMU system ensuring accurate georeferencing of all sensor outputs.

This sensor fusion approach enabled the simultaneous capture of high-resolution imagery, 3D terrain data and structural features such as poles, wires and bridge clearances. Data was processed using Esri’s ArcGIS Reality Studio, producing an integrated digital environment that included 3D meshes, true orthophotos, DSMs and infrastructure classifications.

By merging multiple data streams into a single source of geospatial truth, the project delivered a detailed and operationally valuable overview of the corridor. It allowed transportation officials to assess road quality, detect early signs of degradation and make informed decisions about maintenance scheduling. This example highlights the increasing role of hybrid airborne systems in supporting public infrastructure management – delivering not only greater spatial detail but also faster, more comprehensive analysis from a single flight operation.

The data obtained was processed in Esri’s ArcGIS Reality Studio to generate 3D meshes, true orthophotos, DSMs and infrastructure classifications. (Image courtesy: Esri / Aispeco)

Case 3: Sensor fusion for benthic habitat mapping in the Great Lakes

To support the goals of the Great Lakes Water Quality Agreement, a comprehensive mapping project was launched to classify nearshore benthic habitats across the Laurentian Great Lakes. Led by a consortium using Teledyne Geospatial technology, the initiative combined multiple airborne and satellite sensors with in situ observations to create high-resolution habitat maps for ecosystem restoration and resource management.

The core of the sensor fusion approach was bathymetric Lidar (Teledyne Optech CZMIL, 532nm), which provided detailed digital elevation models of the submerged lakebed and water column. This was complemented by topographic Lidar (1,064nm) to model the coastal terrain and vegetation, ensuring seamless integration between terrestrial and aquatic zones. To characterize substrates and aquatic vegetation, hyperspectral imaging (Itres CASI-1500) and high-resolution imagery (Phase One 150MP) were deployed alongside satellite imagery to extend spatial and temporal coverage. Diver-collected ground truth data, acoustic surveys and irradiance measurements ensured reliable model calibration.
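To make the bathymetric measurement concrete, here is a small sketch of how depth might be derived from the time gap between the surface and bottom returns of a green-wavelength pulse, applying Snell’s law at the air-water interface. The timing values are invented, and production processing adds waveform fitting, water-level and turbidity corrections.

```python
import numpy as np

C = 299_792_458.0      # speed of light, m/s (approximately valid in air)
N_WATER = 1.33         # refractive index of water at ~532nm

def lidar_depth(dt_seconds: float, incidence_deg: float) -> float:
    """Depth from the time gap between surface and bottom returns.

    dt_seconds    : round-trip time difference, surface vs bottom return.
    incidence_deg : off-nadir angle of the beam in air.
    """
    # One-way path length in water; light slows by the factor N_WATER.
    path_in_water = (dt_seconds / 2.0) * (C / N_WATER)
    # Snell's law bends the beam toward nadir on entering the water.
    theta_air = np.radians(incidence_deg)
    theta_water = np.arcsin(np.sin(theta_air) / N_WATER)
    return path_in_water * np.cos(theta_water)

# A 40 ns gap at 15 degrees off-nadir: roughly 4.4 m of water.
print(f"{lidar_depth(40e-9, 15.0):.2f} m")
```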

The project adopted the NOAA Coastal and Marine Ecological Classification Standard (CMECS) and applied machine learning techniques to fuse data sources and deliver multi-tiered classification outputs ranging from general substrate types to species-level differentiation. The approach achieved over 90% classification accuracy for substrate and biotic features, providing actionable data for habitat restoration, conservation zoning and environmental impact assessments while demonstrating transferability to other shallow coastal ecosystems.
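A hedged sketch of what multi-tiered classification can look like in code: a first-tier model separates substrate classes, and a second-tier model refines only the samples assigned to the fine-substrate class. The labels echo the tiers shown in the images below, but the data and models are synthetic placeholders rather than the project’s actual workflow.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(3)
n = 1500

# Synthetic per-pixel fused features: depth (m), reflectance, texture.
X = np.column_stack([rng.uniform(1, 10, n),
                     rng.uniform(0, 1, n),
                     rng.uniform(0, 1, n)])

# Tier 1 labels: 0 = fine substrate, 1 = coarse, 2 = anthropogenic.
tier1 = rng.choice([0, 0, 0, 1, 2], size=n)
# Tier 2 labels exist only for fine substrate: 0 = flat sand, 1 = rippled.
tier2 = np.where(tier1 == 0, (X[:, 2] > 0.5).astype(int), -1)

m1 = RandomForestClassifier(random_state=0).fit(X, tier1)
fine = tier1 == 0
m2 = RandomForestClassifier(random_state=0).fit(X[fine], tier2[fine])

pred1 = m1.predict(X)
pred2 = np.full(n, -1)
pred2[pred1 == 0] = m2.predict(X[pred1 == 0])
print("tier-1 classes:", np.bincount(pred1))
print("tier-2 (fine only):", np.bincount(pred2[pred2 >= 0]))
```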

Image 1: First-tier classification of substrate. Most of the lake floor is fine, while some areas are coarse or feature anthropogenic elements, such as submerged man-made infrastructure. Image 2: Second-tier classification of substrate, identifying the substrate type as sand, with some areas showing ripples. Image 3: A binary map shows the simple presence (vegetation) or absence (no vegetation) of plant life. Image 4: A final classification identifies the specific species of vegetation where it is present. (Image courtesy: Teledyne Geospatial)

Case 4: Hybrid Lidar and imagery acquisition for rail corridor mapping in Styria

AVT’s project for ÖBB Infrastruktur AG demonstrates the value of integrated sensor platforms in infrastructure monitoring. Using the UltraCam Dragon hybrid aerial mapping system from Vexcel, AVT surveyed rail corridors in Austria’s Styria region, specifically along the Steirische Südbahn and Radkersburger Bahn lines. The mission’s aim was to support planning for the modernization and expansion of the region’s railway infrastructure. 

The UltraCam Dragon combines high-resolution nadir and oblique RGBI imagery with precise elevation data captured by an integrated 2.4MHz RIEGL Lidar sensor. Acquiring both datasets simultaneously within a single cost-effective system ensures perfect alignment between imagery and Lidar data, eliminating the need for separate flights or dual-sensor aircraft configurations. The Dragon offers a dedicated corridor mapping collection mode with sideward-looking obliques disabled.

Deliverables included a classified point cloud (ground/non-ground), digital terrain and surface models (DTM and DSM), and RGBI imagery and orthophotos at a ground sampling distance of 2.5cm. These datasets enable planners to perform accurate assessments in areas where conventional survey techniques reach their limits. They support a wide range of applications, from noise modelling and route alignment to precise terrain evaluation. 
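As a toy illustration of how these surface products differ, the sketch below grids a point cloud and keeps the highest return per cell for a DSM and the lowest per cell as a crude ground estimate for a DTM; production ground classification uses far more robust filters than a simple per-cell minimum.

```python
import numpy as np

def grid_surfaces(points, cell=1.0):
    """Naive DSM/DTM from a point cloud: per-cell max and min heights.

    points : (N, 3) array of easting, northing, height.
    cell   : grid cell size in metres.
    """
    ij = np.floor(points[:, :2] / cell).astype(int)
    ij -= ij.min(axis=0)                  # shift to non-negative indices
    shape = ij.max(axis=0) + 1
    dsm = np.full(shape, np.nan)          # highest return per cell
    dtm = np.full(shape, np.nan)          # lowest return per cell
    for (i, j), z in zip(ij, points[:, 2]):
        dsm[i, j] = z if np.isnan(dsm[i, j]) else max(dsm[i, j], z)
        dtm[i, j] = z if np.isnan(dtm[i, j]) else min(dtm[i, j], z)
    return dsm, dtm

pts = np.array([[0.2, 0.3, 412.0],   # ground return
                [0.6, 0.4, 427.5],   # treetop in the same cell
                [1.4, 0.2, 412.3]])  # ground in the next cell
dsm, dtm = grid_surfaces(pts)
print("DSM:\n", dsm)
print("DTM:\n", dtm)
```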

Photogrammetric measurements were conducted using measuree, AVT’s browser-based tool for 3D evaluation of oblique imagery. (Image courtesy: AVT Airborne Sensing)

Future directions and challenges

Several key trends are shaping the future of sensor fusion technology. Real-time processing capabilities are emerging for applications requiring immediate decision-making, such as emergency response and dynamic environmental monitoring. Artificial intelligence and machine learning are playing increasingly important roles, enabling the identification of complex patterns across different sensor types and more sophisticated analysis of integrated datasets. However, challenges remain in data volume management, processing requirements, standardization of workflows and the integration of multiple sensors on single platforms while considering payload constraints and operational complexity.

Sensor fusion represents a fundamental shift in aerial survey methodology towards integrated sensing platforms that leverage the combined strengths of multiple technologies. The real-world applications across diverse sectors demonstrate practical benefits in efficiency, accuracy and actionable geospatial intelligence. As technology continues evolving, sensor fusion is positioned to become the standard approach for comprehensive spatial data missions, with crewed platforms offering unique advantages in payload capacity, flight endurance and operational flexibility for next-generation geospatial applications.

Lidar point cloud from a rail corridor survey by AVT Airborne Sensing using the UltraCam Dragon. Data acquisition parameters: 2.5cm GSD, 620m AGL, 120 knots, 80%/30% overlap (forward/side), 2.4MHz PRR, 500 lines/sec, 45pts/m², sun angle ≥ 35°. (Image courtesy: AVT Airborne Sensing)

Further reading

International Olive Council: Olive sector statistics, April-May 2025. https://www.internationaloliveoil.org/olive-sector-statistics-april-may-2025

D’Addabbo, A., Matarrese, R., Lovergine, F., Refice, A., Belmonte, A., Bovenga, F., Gallo, A., Amoia, S.S., Abou Kubaa, R., Mita, G., et al. Toward an Operational System for Automatically Detecting Xylella fastidiosa in Olive Groves Based on Hyperspectral and Thermal Remote Sensing Data. Remote Sens. 2025, 17, 1372. https://doi.org/10.3390/rs17081372

Reif, M.K., Krumwiede, B.S., Brown, S.E., Theuerkauf, E.J. and Harwood, J.H. Nearshore Benthic Mapping in the Great Lakes: A Multi-Agency Data Integration Approach in Southwest Lake Michigan. Remote Sens. 2021, 13, 3026. https://www.mdpi.com/2072-4292/13/15/3026
