
NGA taps crowd to improve disaster response

Jul. 3, 2014 - 06:00AM   |  
By ADAM STONE
NGA analysts study a set of data. (NGA)

Geospatial intelligence, or geoint, capabilities have been assuming a larger role in supporting homeland security missions, such as disaster response. And experts say that is certain to continue as imagery is delivered faster, analyzed more effectively, and mashed up more readily with other data sets.

A recent move by the National Geospatial-Intelligence Agency (NGA) may help further accelerate the pace of geoint advancement. A partner in the Homeland Security Infrastructure Program, NGA recently joined GitHub, a social network that allows programmers to collaborate and share computer code.

By sharing code with a broad community of developers, “NGA hopes to reap benefits in innovation, creativity, and the power of a far-reaching community of programmers who approach the development of the program from different perspectives,” said Chris Johnson, executive committee member of the geoint trade group GEO Huntsville.


NGA stepped into GitHub by sharing its code for GeoQ, a tool first developed for humanitarian assistance and disaster recovery. The Federal Emergency Management Agency helped refine the tool as the backbone of a shared disaster response solution across government.

“We built GeoQ on all open-source frameworks to make it easily shareable with our mission and response partners,” said Ray Bauer, technology lead for NGA’s Readiness, Response and Recovery team, in a release announcing the move to GitHub. “What we’re hoping for now is to spark interaction with the GitHub communities to improve the code. As long as you have access to the Internet, you can be a part of the solution.”

Better, faster

A more collaborative approach to software development is just one aspect of the changing nature of geoint in homeland security. Rapid development of code comes hand in hand with an ever-accelerating pace of technological innovation.

“Five years ago we were high-fiving in the hallways if we could get a one-megabit stream of information,” said Karl Fuchs, vice president of technology at satellite communications company iDirect Government Technology.

With the advent of high-definition video, the geoint community has pushed that envelope. “Today we see high-definition feeds at a minimum two megabits per second, and more typically six megabits,” he said.

Engineers have found ways to use smaller chips to reduce power consumption, along with very small antennas. High-throughput satellite networks help homeland security agencies get data faster, while narrowly focused beams make it easier to send and receive this accelerated information.

All these advances enable homeland security operators to do their jobs more efficiently. “The guy in the field is now being provided with the raw high-definition data so he can look directly at that data and make his own decisions,” Fuchs said.

Flood of data

Even as the speed of geoint improves, homeland security faces a new challenge: making sense of this torrent of data.

“It’s more than just pictures. It’s the intersection of the images, the analytics and the richness of all this vast manner of geospatial intelligence,” said Keith Masback, CEO of the U.S. Geospatial Intelligence Foundation. “Data is only your friend if you have the analytic tools and the people to do something with it.”

While analysis remains a challenge for homeland security and others in the geoint community, observers say strides are being made.

“Across the industry we are turning to new methods to simplify these data sets, whether it is through visualizing techniques or the use of new computing techniques,” said Talbot Brooks, director of the Center for Interdisciplinary Geospatial Information Technologies at Delta State University.

To manage so much information, a number of elements must come together, most of which relate to process and the appropriate disposition of resources. Analysts need significant bandwidth to gather data and push out intelligence information. They need access to the cloud and other means of sharing data across multiple points. And they need to know what they are looking for – a burden that falls squarely on the shoulders of homeland security operators.

“I don’t think we have an answer to all of that yet,” Brooks said. “We are just starting to figure out what is really out there, and then maybe we could do something useful with it.”
