No Confusion Over Data Fusion



By Mary Jo Wagner

Digital airborne sensors. LiDAR (light detection and ranging) sensors. Hyperspectral sensors. Digital frame cameras. All are aerial survey instruments with a rightful place in the airborne mapping industry, each known for its particular application suitability. That dutiful categorization, however, can lead companies and clients alike to confine themselves to a separatist approach to collecting aerial surveys—applying a single sensor, or flying different sensors on different planes at different times—and make it difficult for them to answer seemingly simple questions such as “What is that feature?”



“The problem with flying sensors in isolation is that no one sensor will answer all the data analysis questions,” says Mark Romano, CTO of Orlando-based Earth Eye, a LiDAR aerial survey company. “With LiDAR, I may not necessarily be able to identify a ‘bump’ in the terrain as a car or earthworks. With hyperspectral, I can classify a particular tree type but I may not be able to tell you how tall it is. These are important differentiators in developing accurate data products.

“To resolve those uncertainties, companies typically fly project areas again with different imaging technologies to try to supplement the single-sensor dataset with more feature-rich information,” continues Romano. “Unfortunately, since they’re not flown at exactly the same time, the two datasets don’t fit together well spatially, and they are not time-coherent. That makes it very difficult to create a value-added data product for clients to integrate into their analysis applications.”

Earth Eye—a very recent entrant into the airborne survey world—aims to challenge those separatist tendencies with its emerging data fusion approach. Rather than create disparate datasets, Earth Eye’s vision is to connect multiple sensors to one payload, fly a project area once, acquiring the multi-sensor data simultaneously, and then automatically fuse the data sources into one cohesive, content-rich dataset. Indeed, if Romano’s objective succeeds, the once-neat application lines of aerial survey technologies will be blurred to create well-defined, almost life-like 3D datasets, allowing users to “walk the ground” from their desktop. That not only means familiar terrain can be visualized in a whole new light; it may also open flight paths to a host of new business development prospects for Earth Eye.

“Plug-n-collect”

To be sure, the data fusion concept is not a new one. Research and development scientists and remote sensing users have been studying and successfully fusing earth-observation data for more than a decade, and LiDAR has been part of the fusion fold since the early 2000s. The predominant method of integrating the diverse data layers, however, has been post-processing—merging disparate datasets after the fact to try to create seamless information products—which has often prompted users to point to the need for concurrent data capture to ease the fusion process and improve the qualitative results.



With significant advancements in hardware, data quality and computing power, Romano says the aerial mapping environment is now primed to bring aerial surveys to real-world levels of visualization and analysis. That readiness, along with enabling airborne technology from Leica Geosystems, was a key factor in launching Earth Eye in June 2009.

“We wanted to focus on the data fusion of airborne LiDAR, multispectral and hyperspectral data, and we wanted to create a niche doing simultaneous airborne capture and processing analytics,” he explains. “That meant we needed a modular platform. Leica’s ALS60 LiDAR sensor not only provides a modular airborne platform, we believe it is the most capable technology commercially available. It offers more dynamic range, so we can fly higher and acquire data at a higher pulse rate, providing more feature information more cost-effectively. The sensor has about a six-and-one-quarter-inch aperture, so it gathers more light, and in conjunction with its automatic gain control it offers a superior solution for capturing data in heavy or steep terrain. And we can integrate up to five additional sensors on one platform with a single inertial system. That gives us amazing range to deliver highly accurate, time-coherent information products.

“For example, with Leica’s RCD105 digital frame camera I can achieve quarter-foot resolution and sub-foot resolution with the ALS60,” he adds. “When we fuse those two together and rotate the model in 3D, it’s like you’re walking on the ground. That is tremendous knowledge for clients to have for their business.”
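In principle, that fusion step is simple to picture: because both sensors ride the same platform and share one inertial system, every LiDAR return can be mapped into the concurrently captured, orthorectified image and assigned that pixel’s color. The short Python sketch below illustrates the idea; the array shapes, grid origin and pixel size are assumptions for illustration, not Earth Eye’s production pipeline.

import numpy as np

def colorize_points(points, rgb, origin, pixel_size):
    # points: (N, 3) array of x, y, z in the image's map projection;
    # rgb: (rows, cols, 3) orthoimage with row 0 at the north edge;
    # origin: (x, y) map coordinate of the image's upper-left corner.
    cols = ((points[:, 0] - origin[0]) / pixel_size).astype(int)
    rows = ((origin[1] - points[:, 1]) / pixel_size).astype(int)
    # Keep only returns that fall inside the image footprint.
    ok = (rows >= 0) & (rows < rgb.shape[0]) & (cols >= 0) & (cols < rgb.shape[1])
    return np.hstack([points[ok], rgb[rows[ok], cols[ok]]])  # x, y, z, r, g, b

image = np.random.randint(0, 256, (100, 100, 3))  # toy ortho tile
pts = np.array([[500010.0, 4199995.0, 31.2],
                [500020.0, 4199980.0, 28.7]])
print(colorize_points(pts, image, origin=(500000.0, 4200000.0), pixel_size=0.25))

Rotated in a 3D viewer, those colored points are what produce the “walking on the ground” effect Romano describes.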

That improved detail and accuracy raises the information-intelligence quotient for users, helping them to eliminate the guesswork of feature-data analysis to create more precise and comprehensive feature classifications.

What would you like to see?

Having such new-found intelligence could be of particular benefit to users in the utility, forestry and environmental sectors where engineering-grade and survey-grade data is critical for modeling, mapping, inventorying and planning.

For hydro analysts to accurately run flood study models that determine when and where a waterbody will overflow, they need topographic maps that show “flattened” waterbodies, terrain detail and precisely connected streams or rivers flowing downhill. Without precise land/waterbody delineations or seamless river channels, analysts may not be able to distinguish berms or narrow ditches, and certain grasses may appear as level ground. Capitalizing on the strengths of airborne technology, collecting multi-sensor data simultaneously and fusing the imagery together provides the “hydro-enforced” dataset analysts need to run flood models with more confidence, says Romano.
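Two of those hydro-enforcement steps lend themselves to a minimal sketch: flattening a lake to a single water-surface elevation, and forcing elevations sampled along a stream centerline to run monotonically downhill so that bridge decks or vegetation returns don’t leave artificial dams. The toy inputs below are illustrative assumptions, not the production algorithm Romano describes.

import numpy as np

def flatten_waterbody(dem, water_mask):
    # Assign every cell inside the lake one shoreline elevation, here the
    # lowest water-surface value observed among the masked cells.
    flattened = dem.copy()
    flattened[water_mask] = dem[water_mask].min()
    return flattened

def enforce_downstream(profile):
    # Elevations sampled upstream-to-downstream must never rise; a running
    # minimum clips the spurious bumps.
    return np.minimum.accumulate(profile)

dem = np.array([[5.0, 4.8, 4.9],
                [4.7, 4.2, 4.4],
                [4.6, 4.3, 4.5]])
lake = np.array([[False, False, False],
                 [False, True,  True],
                 [False, True,  True]])
print(flatten_waterbody(dem, lake))  # every lake cell becomes 4.2
print(enforce_downstream(np.array([10.0, 9.6, 9.8, 9.1, 9.3, 8.7])))  # 9.8, 9.3 clipped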

“By flying our multi-sensor payload, water managers will see where the water on the bank of the river really is and what terrain features surround the water body,” he says. “Complementing LiDAR with the RCD105 camera is very useful because it provides that second take at the precise time the LiDAR sensor acquires the same point, which is particularly important with water-related mapping because stream or river elevations or tidal elevation can change dramatically, sometimes even within minutes. Our fusion process really aids us in making precise and accurate hydro-enforced terrain representations.”

Forestry professionals are avid classifiers of tree types and tree attributes such as maturity levels, fuel layers and water resources. Relying on one airborne technology, or on disparate, time-delayed datasets, can make it difficult to accurately monitor and manage forestry assets.

“Fusing hyperspectral data with LiDAR data allows analysts to identify both a tree type and how tall it is,” says Romano. “If an organization conducts cyclic aerial surveys, they could even monitor the growth rates of those specific tree types. Extending the information further, they could study watersheds to determine how wet an area is, determine projected growth rates and identify fuel layers and their associated risks. That amount of detail can help users achieve a 90-percent quantifiable classification or projection. Many sensors flown at different times will not give you as quantifiable an answer.”
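Once the two rasters share a grid, that joint query is straightforward: the hyperspectral classification supplies a species code per cell and the LiDAR canopy height model (CHM) supplies a height, so per-species height statistics fall out of a simple mask. A minimal sketch, with hypothetical class codes and toy arrays:

import numpy as np

# Hypothetical species codes from a hyperspectral classifier (0 = ground).
species = np.array([[1, 1, 2],
                    [1, 2, 2],
                    [0, 0, 2]])
# LiDAR canopy height model on the same grid, heights in meters.
chm = np.array([[18.2, 17.9, 24.1],
                [19.0, 23.5, 25.0],
                [ 0.1,  0.3, 22.8]])

for code, name in {1: "pine", 2: "fir"}.items():
    heights = chm[species == code]
    print(f"{name}: {heights.size} cells, mean height {heights.mean():.1f} m")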

Earth Eye may soon get its own quantifiable response to those predicted capabilities from the United States Forest Service (USFS) in Idaho. As part of a fire-fuels research study, the USFS commissioned Earth Eye to conduct a multi-sensor aerial survey over a swath of forested terrain last summer to determine whether the integrated datasets can be used to classify different tree types and model certain tree canopies and fuels.

Using the Leica ALS60 and the Leica RCD105 camera, Earth Eye flew more than 60 square miles of heavy terrain, acquiring LiDAR data and multispectral imagery concurrently, and fused the datasets together. The crew flew at an altitude of 14,500 feet above mean sea level and collected multi-pulse LiDAR data at 70 kHz, yielding ground point densities averaging approximately five points per square meter; the Leica RCD105 imagery was collected at one-foot resolution. Based on the integrated information, researchers will validate the LiDAR and study canopy closures, tree types, volumes and fuels.
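Those figures survive a back-of-envelope check: nominal single-pass density is roughly the pulse rate divided by the area swept per second (ground speed times swath width), and overlapping flight lines push the average higher. In the sketch below, only the 70 kHz pulse rate comes from the project; the ground speed, height above ground and scan angle are assumptions for illustration.

import math

pulse_rate = 70_000   # pulses per second (from the survey above)
ground_speed = 65.0   # m/s, assumed (~126 knots)
agl = 2_000.0         # meters above ground, assumed for this terrain
fov_deg = 12.0        # full scan angle, assumed

swath = 2 * agl * math.tan(math.radians(fov_deg / 2))  # ground swath width
density = pulse_rate / (ground_speed * swath)          # pts/m^2, single pass
print(f"swath {swath:.0f} m, about {density:.1f} pts/m^2 per pass")
# ~2.6 pts/m^2 per pass; roughly 50-percent sidelap between adjacent lines
# about doubles it, consistent with the five-point-per-square-meter average.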

“In addition to the data, we also provided them the software tools to process and visualize the data themselves—a capability they have not had in the CAD environment,” says Romano.

In between conducting commissioned surveys, Earth Eye has been carrying out its own multi-sensor surveys over areas such as landfills, road corridors, urban areas, wetlands and power lines to try to expand its technological approach to other markets. But Romano doesn’t envision many bumpy business-development flights.

“The technology and fusion approach sells itself,” he says. “Once clients see the capabilities and the information value that’s there, they immediately recognize that they can maximize the value of their data.”
Mary Jo Wagner is a Vancouver-based freelance writer with more than 15 years’ experience covering geospatial technology. She can be reached at: mj_wagner@shaw.ca.

