AI revolutionizes mapping updates and accuracy


An NGA Small Business Innovation Research (SBIR) partner is well on its way to creating an artificial intelligence-powered prototype to provide global mapping at near real-time in unprecedented resolution. AI-generated maps in 10-meter resolution are already available to NGA’s analysts and customers.

NGA SBIR partner Impact Observatory, along with Esri and Microsoft, made history in summer 2021 by launching the first publicly available land-cover map at 10-meter resolution. The map was the first to use machine learning and big data to update land-cover maps built from human-tagged satellite imagery. It was produced with imagery from Europe’s Sentinel satellite constellation, with funding from Esri and server space and compute donated by Microsoft.

In a current partnership with NGA that commenced in July 2022 and runs through January 2024, the company is developing an AI-powered prototype to provide global mapping and change monitoring even faster and with greater detail.

“A truly living map produced with deep learning is no longer a science fiction idea but is something you should come to expect,” said Steve Brumby, Ph.D., chief executive officer and chief technology officer of Impact Observatory, while sharing the status of the research with members of NGA’s workforce last month.

Traditionally, human-dependent maps are updated over the course of years. The most current global land-cover map available from the U.S. Geological Survey was last updated in 2019. These new AI-generated maps can be updated whenever new imagery is available — with the ultimate goal of continuous, automated updates at a global scale. Impact Observatory’s AI-powered maps are already able to show seasonal changes, as well as impacts from events such as the war in Ukraine.

“I think this is a radical leap forward in tipping,” said Brumby. “We believe this just opens the universe for what can be done.”

The key is the ability to use machine learning and automation to process the huge amount of data now available from a growing number of commercial and small satellites, and even drone imagery — and to overlay images from different sources to build detail. The 10-meter maps currently apply a deep learning model to 2.4 million satellite scenes, requiring over 1 million CPU core hours to process. Of note, the model slims down the data to extract only what’s needed to create the maps.
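The classify-then-refresh workflow described above can be sketched in a few lines. This is a hypothetical illustration, not Impact Observatory's actual pipeline: it assumes a deep learning model has already emitted per-pixel class scores for a scene, assigns each pixel its highest-scoring class, and updates an existing map tile only where the new scene is confident enough, so ambiguous pixels keep their previous label.

```python
import numpy as np

# Hypothetical class list, echoing the land-cover types named in the article.
CLASSES = ["water", "trees", "crops", "built", "bare", "snow_ice",
           "flooded_veg", "rangeland"]

def classify_scene(scores: np.ndarray) -> np.ndarray:
    """Assign each pixel the class with the highest model score.

    `scores` has shape (H, W, num_classes) -- e.g., per-class
    probabilities emitted by a deep learning model for one scene.
    Returns an (H, W) array of class indices: one map tile.
    """
    if scores.shape[-1] != len(CLASSES):
        raise ValueError("score depth must match number of classes")
    return scores.argmax(axis=-1)

def update_tile(current: np.ndarray, new_scores: np.ndarray,
                confidence: float = 0.6) -> np.ndarray:
    """Refresh a map tile wherever new imagery is confident enough.

    Pixels whose best new score falls below `confidence` keep their
    previous label, so cloudy or ambiguous pixels do not overwrite
    good data -- the tile can be re-run whenever imagery arrives.
    """
    new_labels = classify_scene(new_scores)
    confident = new_scores.max(axis=-1) >= confidence
    return np.where(confident, new_labels, current)
```

Run at scale, the same per-tile logic is simply mapped over millions of scenes, which is where the CPU-hour figures above come from.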

NGA R&D partner Impact Observatory is developing an AI-powered mapping capability to identify multiple classes of land cover in near real-time. Updates can be made whenever new imagery is available. Image provided by NGA R&D partner Impact Observatory.
A developing AI-powered land-cover mapping capability goes beyond indicating areas of tree cover to enable identification of actual tree types. Image provided by NGA R&D partner Impact Observatory.

At the current stage of research and development, the AI-generated maps are able to show crops, trees, rangeland, built areas, water, flooded vegetation, snow/ice and bare ground. They show areas of new construction, deforestation, fire and burn scars, changes in snow and ice, and other environment-related data points. The team is adding new classes of objects, such as roads, other impervious surfaces, and actual buildings, and team members are beginning to be able to distinguish precise types of vegetation, crop stages and more. They also plan to augment imagery with remote sensing data in certain instances and continue to add new data sources to enrich the detail available in the global map.
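Change monitoring of the kind described above amounts to comparing class labels for the same pixels across two map dates. A minimal sketch, with hypothetical class indices that are not the product's actual encoding:

```python
import numpy as np

# Hypothetical integer encodings for two of the article's classes.
TREES, BUILT = 1, 3

def detect_change(before: np.ndarray, after: np.ndarray,
                  from_class: int, to_class: int) -> np.ndarray:
    """Flag pixels that moved from one land-cover class to another
    between two map dates. For example, trees -> built area suggests
    clearing for new construction; trees -> bare ground could flag
    deforestation or a burn scar."""
    return (before == from_class) & (after == to_class)
```

A boolean mask like this is then summarized (area changed, change hotspots) or rendered as its own map layer.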

Amy Larson, Ph.D., Impact Observatory’s technical executive, stressed that outcomes have proven consistent during testing, even when applying training data from one area to a completely different area. Thus, artificial intelligence is helping to close gaps in understanding of landscapes where actual imagery is unavailable.

Sandy Brusiloff, who co-hosted the update talk with Aaron Reite, Ph.D., and Enrique “Tres” Montaño, Ph.D., all of NGA Research, said another benefit of the AI-powered maps is that layers can be added or removed. Classified data can be layered on top to provide greater fidelity for NGA and other mission analysts, while maps kept at the unclassified level can be shared on request with firefighters and other first responders.
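The add-or-remove layering idea can be illustrated with a simple compositing rule. This is a hypothetical sketch with a two-level scheme ("U" for unclassified, "C" for classified), not NGA's actual security handling: layers at or below the releasable level are stacked in order, with later layers overwriting earlier ones wherever they carry data.

```python
import numpy as np

NODATA = -1  # hypothetical sentinel for "this layer has no data here"

def composite(layers, max_level: str) -> np.ndarray:
    """Stack (level, array) map layers, keeping only those whose
    classification level is releasable. Later layers overwrite
    earlier ones wherever they have data, so classified detail can
    sit on top of a shareable unclassified base map."""
    allowed = {"U"} if max_level == "U" else {"U", "C"}
    result = None
    for level, layer in layers:
        if level not in allowed:
            continue
        result = layer.copy() if result is None else \
            np.where(layer != NODATA, layer, result)
    return result
```

Building the unclassified product first and treating classified detail as an optional overlay is what makes the same base map usable by both mission analysts and first responders.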


Article written by Jeanne Chircop, NGA Research Communications