In late August, researchers at the South Pole discovered conclusive evidence of the cosmic neutrino — a building block of the universe so elusive it's known as the ghost particle. The science and technology involved in such discoveries is awe-inspiring, but building and operating the network infrastructure required to move and manage the data being collected, while not as glamorous, is a daunting task in its own right.

Consider that detecting neutrinos requires a massive facility built into a cubic kilometer of ice, which is why scientists must travel to the ends of the Earth for a chance at catching them. So how, from the South Pole, does one get the massive amounts of data collected daily into the hands of researchers who can use it?

The answer to that question is of great value not only to the National Science Foundation's South Pole Station, but to the spectrum of government leadership focused on high-performance computing.

As big data comes to the fore, "the real value will come from the ability to virtually move that data around geographically to where the principal scientists are and the research groups are," said Anthony Robbins, vice president of federal for Brocade, referencing a recent presidential executive order on high-performance computing. "Underlying the success of this initiative will be how good is the network to move the data."

The IceCube Neutrino Collaboration alone generates several hundred gigabytes to a terabyte of data a day, as does the South Pole Telescope, plus a handful of smaller Antarctic projects that produce a couple of gigabytes each. The data go through minimal cleanup at the base stations before being thrown out into the world through a well-choreographed series of jumps across a cobbled-together network.

"At the geographic South Pole, you just don't run out and buy standard commercial satellite service," said Patrick Smith, NSF manager for technology development and polar research support. "We have to look for pretty unique satellites," such as NASA's Tracking Data Relay Satellite, which flies in a tilted orbit and enters the Antarctic horizon for about six hours each day.

Using NASA's TDRS system, the South Pole teams have about four hours a day to upload their data, which are relayed to a processing center in White Sands, New Mexico. There, the data are copied and backed up before being transferred to file-sharing servers at a data center in Colorado.

From there, research groups around the world can access the information, parse the data and advance our understanding of the fundamental nature of reality.

None of that would be possible without the complex infrastructure in place to move and manage the data.

The research being done at the South Pole is "opening the doors to a new era in particle physics," said Vladimir Papitashvili, astrophysics and geospace sciences program director in the NSF Division of Polar Programs. "And it became possible only because of extraordinary qualities of Antarctic ice and NSF's ability to successfully tackle enormous scientific and logistical problems in the most inhospitable places on Earth."

Because of the narrow window of time available each day, the South Pole research groups coordinate their daily data dumps for maximum efficiency before beaming them skyward at some 300 megabits per second. After the data make a stop at White Sands, the rate slows a bit on the transfer to the FTP servers at the Colorado data center, but the information is often available to researchers within 12 hours of the original transmission.
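For a sense of scale, here is a rough back-of-the-envelope calculation using only the figures quoted in this article (a roughly 300 Mbps link and an approximately four-hour daily window); actual throughput would be lower once protocol overhead and scheduling gaps are accounted for.

```python
# Back-of-envelope check of the daily uplink capacity described above.
# Assumed figures come straight from this article: a ~300 Mbps link and a
# roughly four-hour daily transmission window. Real throughput will be
# lower once protocol overhead and scheduling gaps are accounted for.

LINK_RATE_BPS = 300e6        # ~300 megabits per second
WINDOW_SECONDS = 4 * 3600    # ~4 hours of satellite visibility per day

daily_bits = LINK_RATE_BPS * WINDOW_SECONDS
daily_gigabytes = daily_bits / 8 / 1e9

print(f"Theoretical daily capacity: ~{daily_gigabytes:.0f} GB")
# Prints roughly 540 GB, in line with the several hundred gigabytes to a
# terabyte the South Pole projects generate each day.
```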

The satellite system is complicated and offers only limited transmission windows, but it is easily the best of the available options. Smith said his department looked into running a fiber optic cable out to the South Pole station but quickly ran into logistical issues.

"They said, 'Hey, the ice moves,' " he said, recalling preliminary discussions on the feasibility of running a hardline. "Over time the cable jacket starts getting locked into the top layer of ice and everything on the Antarctic ice sheet is moving ... If you had a long cable [about 1,700 kilometers] that puts a lot of stress," as the ice sheets shift about 10 meters every year.

Running a hardline to the station would require a huge upfront cost and ongoing maintenance on a scale that just wouldn't be practical. Market research showed it would have been a bit cheaper for NSF to buy and launch its own satellite than to lay and maintain the cable.

Without a hardline option or the resources to launch its own satellite, NSF had to go "dumpster diving," Smith said, tapping into the government's network of older satellites.

"NASA's been an outstandingly great partner in all this," he said, citing early talks in 1990s about co-opting some of the space agency's aging communications infrastructure. "They took the gamble and kept flying it for us and that set us on our way."

Buying time on TDRS and a few other minor satellites is a serviceable option for now, though several projects, such as the South Pole Telescope's Cosmic Microwave Background program, are looking to step up their output over the next few years and will require more bandwidth.

"At the South Pole, to accommodate the proposed next-generation Cosmic Microwave Background program, an increase in the total transmission rate by roughly a factor of five to around 1 [terabyte per] day in six to eight years would be required," according to a recent study by the National Academies of Sciences, Engineering and Medicine. "Although a modest increase by some standards — compared to Moore's law — it represents a challenge for [the U.S. Antarctic Program]."

NSF has already made some upgrades to the South Pole infrastructure on the ground (or ice), effectively doubling the research stations' transmission capacity. Unfortunately, the satellite system continues to age, and researchers will soon need an alternative means of moving their data.

The Foundation is now in talks with the Department of Defense to use aging military satellites, specifically the Air Force's Defense Satellite Communications System. Smith said the agency is also looking at commercial options as the technologies mature and prices fall into acceptable ranges.

The government's appetite for moving and managing big data will be an important impetus for building the high-capacity infrastructure of the future.

Gregory Bell, director of the Energy Department's high-bandwidth Energy Sciences Network, or ESnet, summed it up best:

"Our vision for the world is that scientific discovery shouldn't be constrained by geography."

Aaron Boyd is an award-winning journalist currently serving as editor of Federal Times — a Washington, D.C., institution covering the federal workforce and contracting for more than 50 years — and Fifth Domain — a news and information hub focused on cybersecurity and cyberwar from civilian, military and international perspectives.
