Agencies continue to seek savings in data center consolidation, but critical information could get lost in translation amid all the merging and closures. (AFP/Getty Images)
Data center optimization — an umbrella term for moving away from disparate networks and servers toward efficient, cost-effective IT services — is no longer new. Some of the federal guidance directing agencies to consolidate redundant data centers and seek cost savings through shared IT services is, at this point, half a decade old. Nonetheless, data centers continue to hold promise for government agencies.
Whether it’s millions of dollars in savings, increased security or opportunities to harness big data, the potential that optimization holds for federal IT operations keeps data centers in the headlines and at the center of policy decisions across the government. Some leaders are hopeful about the new capabilities that may emerge, but others warn that a cautious path forward is the best bet for staying as secure as possible.
The Defense Department is one of the highest-profile examples of agencywide data center consolidation. The Pentagon’s IT budget typically hovers around $40 billion annually, with the majority of that going toward maintaining legacy systems — including the department’s 15,000 network enclaves and hundreds of data centers.
Leaders say they have made progress in the pursuit of savings, security and more. For example, former DoD CIO Teri Takai told Congress in February that the department closed 244 data centers in the first quarter of fiscal 2014 alone.
“For the purposes of what we’re trying to get from those, it’s not really just about saving money,” said Mary Dixon, director of DoD’s Defense Manpower Data Center. “[U.S. Cyber Command] has said there is no way that I can properly deal with a cyberattack on the DoD network when I have so many hundreds of networks all over the place that I cannot possibly control. So part of the idea of consolidating is to try to reduce the number of networks and be able to take action more quickly in the event of a cyberattack.”
There is optimism about what can be achieved as data center optimization efforts continue — even beyond security and savings, Dixon pointed out. Case in point: putting big data to work for the Pentagon.
“Although we have a lot of structured data that we can do a lot of work with, there are some kinds of analyses where it would be really helpful if we could merge a lot of different kinds of data – unstructured and structured – together to get a look at some of the policy issues,” Dixon said. “Take recruiting – does the fact that we have a reduction in readiness impact whether people want to come into the military? We don’t know, we always look at it from the other perspective: As we lose our military people, we’re going to have a reduction in readiness. But maybe the question is, well now that we’ve reduced readiness do people look at the military and say, do we want to go into the military when it’s at that level? We don’t know, but maybe that’s a question we can answer with big data types of analysis.”
Data center consolidation has potential drawbacks as well, however. Decision-makers must ensure that critical information does not get lost in translation amid all the merging and closures.
“We’ve been giving them advice as far as what we’re seeing as the data center consolidation is happening. It hasn’t changed — we still have the same three problems: confidentiality, availability and integrity of data as you consolidate,” said Ann Barron-DiCamillo, director of US-CERT at the Homeland Security Department. “In terms of incident data, you want to be able to point back to the original source because a lot of the time when you’re consolidating data you’re losing the context of why you have that data in the first place. So you really want to pay attention to all these different elements as you’re consolidating to make sure you’re not losing the intent of that data.”■