
Wearable electronic devices augur change for NGA operations, show ‘immersive’ potential

By Jason Moll, Office of Corporate Communications

Developers at the National Geospatial-Intelligence Agency are creating applications for wearable electronic devices that place analysts and customers in virtual and augmented-reality environments to help them do their jobs better.

The prototype applications created for Google Glass and Oculus Rift could serve as gateways to the immersive intelligence experience being advanced by NGA leadership, said Matthew McNerney, a visualization engineering subject matter expert in InnoVision, the agency’s research and development arm.

Google Glass allows users to see their surroundings while interacting with the device, creating an augmented reality. The device projects digital information on translucent spectacles and obtains Internet connectivity from a nearby smartphone. Oculus Rift is a pair of opaque goggles that allows users to visualize virtual environments while connected to a computer or gaming device. Google Glass received its mass market release in May, while Oculus Rift is only available as a development kit.

“Immersion is touted as the next major phase of intelligence, while wearable (electronic) devices are seen as the next big thing in mobile computing,” said McNerney. “Putting them together seems like a natural fit.”

Wearable electronic devices are projected to eventually displace smartphones as the preferred mobile technology used by consumers, according to a January 2014 article in Wired magazine.

Besides eyewear, wearable electronic devices include rings that facilitate mobile payments, bracelets that serve as fitness activity trackers, and Internet-enabled watches with integrated cameras.

InnoVision is charged with executing the immersive intelligence vision outlined by NGA Director Letitia A. Long at the GEOINT symposium in April. Long characterized immersive intelligence as a world where practitioners will live, interact and experiment with data in a multimedia and multi-sensory experience. The immersive experience will help break down barriers between collectors, analysts, customers and decision makers, she said.

Immersing analysts and operators in data and information will help them generate new insights as they view and interact with source material from different perspectives, said McNerney.

“By giving analysts a different kind of perspective, we’re hoping they will be able to more rapidly derive information from it and obtain meaningful intelligence from the imagery and other products they’re analyzing,” said McNerney.

NGA’s Google Glass application is designed to enhance the situational awareness of field operators working special security events or sensitive law enforcement investigations, said Zachary Swain, who wrote the application while working in InnoVision. It allows the user to send information to and receive information from another NGA application that creates a common operational picture, or COP, over the Web. Working together, the applications help others understand what the operator is observing in the field. Wearing Google Glass also limits many of the common distractions users experience when holding smartphones or other handheld devices, said Ben Tuttle, Ph.D., who created the COP application as the mobile apps team lead for NGA’s Geospatial Intelligence Advancement Testbed in Denver.
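As a rough illustration of the kind of exchange involved (a hypothetical sketch, not NGA’s actual software), a wearable client could package a field observation and post it to a web-based COP service so others can see what the operator sees. The endpoint URL, field names and the post_observation helper below are illustrative assumptions.

    import json
    import urllib.request
    from datetime import datetime, timezone

    def build_observation(lat, lon, note):
        """Package a field observation as a JSON report (illustrative schema)."""
        return {
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "position": {"lat": lat, "lon": lon},
            "note": note,
        }

    def post_observation(report, url="https://cop.example.gov/api/observations"):
        """POST the report to a hypothetical COP endpoint for shared viewing."""
        request = urllib.request.Request(
            url,
            data=json.dumps(report).encode("utf-8"),
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        with urllib.request.urlopen(request) as response:
            return response.status

    if __name__ == "__main__":
        # Build and print a sample report; posting requires a reachable COP server.
        report = build_observation(38.889, -77.035, "Crowd forming at gate 3")
        print(json.dumps(report, indent=2))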

Originally conceived for gamers, Oculus Rift allows users to visualize and navigate virtual environments in three dimensions, or 3-D. An NGA prototype application created by Tim Hattenberger, a visualization subject matter expert in InnoVision, lets users navigate 3-D “point clouds” created by light detection and ranging, or LiDAR, scans. LiDAR works on a principle fundamentally similar to radar: each point in a point cloud is obtained by scanning an area with pulses of laser light that measure an object’s range, and therefore its position in space. When coupled with a LiDAR city scan, Oculus Rift lets users virtually fly over a city or walk along its streets. The 3-D point cloud creates a city scene replete with trees, power lines, light poles and buildings.
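To illustrate the underlying idea (a simplified sketch, not the agency’s processing pipeline), the range measured from a pulse’s round-trip time, together with the laser’s pointing angles, fixes each point’s position in space:

    import math

    SPEED_OF_LIGHT = 299_792_458.0  # meters per second

    def lidar_point(round_trip_seconds, azimuth_deg, elevation_deg):
        """Convert one laser return into an (x, y, z) point, with the sensor at the origin.

        Range is half the round-trip distance traveled by the pulse; azimuth and
        elevation describe the direction the laser was pointing when it fired.
        """
        rng = SPEED_OF_LIGHT * round_trip_seconds / 2.0
        az = math.radians(azimuth_deg)
        el = math.radians(elevation_deg)
        x = rng * math.cos(el) * math.cos(az)
        y = rng * math.cos(el) * math.sin(az)
        z = rng * math.sin(el)
        return (x, y, z)

    # Example: a return after about 1.3 microseconds lies roughly 195 meters away.
    print(lidar_point(1.3e-6, azimuth_deg=45.0, elevation_deg=-10.0))

Repeating this for millions of pulses across a city produces the point cloud an analyst can then explore in 3-D.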

Analysts primarily view LiDAR point cloud products on their computer screens, which gives them a top-down, or two-dimensional, view, said Eric Aasted, an InnoVision project scientist. Oculus Rift could change that by providing users with a 3-D perspective, while also facilitating collaboration in a way similar to multiplayer online games. “By immersing themselves in the same data, Oculus Rift could let two analysts see (their avatars), walk across the landscape, annotate what they see and communicate what they’re seeing—all in real time—and even if they are in different (physical) locations,” said Aasted.

While wearable electronic devices like Google Glass and Oculus Rift could serve as gateways to the immersive intelligence experience, that experience is only likely to occur when devices work in concert with each other, said McNerney.

“When you have two devices communicating with each other, the sum becomes greater than each part working alone,” said McNerney. “That’s the idea of convergence, where two or more devices can provide capabilities that weren’t possible before.”