
Consolidation breeds optimization for data centers

Jun. 23, 2014 - 06:00AM
By ADAM STONE
The National Renewable Energy Laboratory recently overhauled its data centers so chilled water manages temperatures. (Dennis Schroeder / NREL)

When the Office of Management and Budget called for a drastic reduction in the number of government data centers, it opened a window of opportunity.

As government IT managers have consolidated their operations, they’ve taken advantage of the moment to optimize those streamlined data centers, re-engineering to get the greatest possible efficiencies out of these systems.

As the number of data centers shrinks, “we want to make sure they are operating in a way where the government is getting the most bang for the buck for its investment,” said Scott Renda, cloud computing and federal data center consolidation initiative portfolio manager at OMB. As engines of change, consolidation and optimization “go hand in hand,” he said.

BONUS: Former Defense Secretary Robert Gates will headline the Federal Innovation Summit 2014, titled “Modernizing Virtualized Data Centers,” on July 22.

Be cool, not cold

Optimization begins with refrigeration. Cooling accounts for 35 to 50 percent of data center power usage, according to various industry estimates. And yet many in the federal IT community argue that much of this energy is squandered.
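
The scale of that overhead is easy to see with a quick back-of-envelope calculation. The sketch below simply applies the 35 to 50 percent figures cited above to a hypothetical 1-megawatt facility; the facility size is an assumption chosen for illustration, not a figure from any agency.

```python
# Back-of-envelope estimate of how cooling overhead scales with total
# facility power. The 35-50 percent range comes from the industry
# estimates cited above; the 1 MW facility size is an assumption
# used only for illustration.

def cooling_power_kw(total_facility_kw: float, cooling_fraction: float) -> float:
    """Return the portion of facility power consumed by cooling."""
    return total_facility_kw * cooling_fraction

total_kw = 1_000  # hypothetical 1 MW data center
for fraction in (0.35, 0.50):
    print(f"At {fraction:.0%} overhead, cooling draws "
          f"{cooling_power_kw(total_kw, fraction):,.0f} kW "
          f"of the {total_kw:,} kW total.")
```

At those rates, 350 to 500 kilowatts of a 1-megawatt facility would go to cooling alone, which is why the agencies below attack refrigeration first.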

The typical data center runs at 60 to 65 degrees Fahrenheit: cold enough that technicians entering the room will put on extra layers if they plan to stay a while.

“It’s clearly not necessary,” said Steve Hammond, director of the National Renewable Energy Laboratory’s (NREL) Computational Science Center. He noted that most chips today have a safe operating limit of over 100 degrees.

NREL recently overhauled its own data centers. Rather than using cold air to cool processing rooms, the new system developed by HP uses chilled water to manage temperatures. Water cools components more efficiently, and the heat it picks up as it passes through the system is then pumped under the front plaza and walkway outside the building to help melt snow and ice, adding to the overall savings.

The State Department has likewise optimized its cooling systems, with an arrangement that takes advantage of the climate in Denver, the location of one of its data centers. The LEED Gold certified facility uses the area’s naturally cold temperatures to keep servers’ heat in check.

“The temperature there is cool most of the year, so for most of the year they can take in the outside ambient air misted with water, which reduces the energy load,” said Mark Benjapathmongkol, a project manager with State’s Bureau of Information Resource Management.

Reduce physical servers

Cooling is just one aspect of optimization. It’s just as important to reduce the heat output of various components, Benjapathmongkol said.

To do this, his organization has turned to virtualization, consolidating workloads onto virtual machines and cutting the number of physical servers by 30 percent.

Judging by industry’s experience, federal IT managers may be able to take this method even further as they continue consolidating their centers. In the commercial world it is not unusual to see nine or 10 physical servers consolidated onto a single virtualized host, said Raymond Paquet, a managing vice president at Gartner. “It’s common today to see 60 to 70 percent of the workloads being virtualized.”

Virtualization not only lightens the cooling load, it also cuts energy consumption directly. Data processing is itself a power-hungry operation, so every physical machine removed from the floor reduces the electricity drawn for compute as well as the waste heat that has to be carried away.
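
A rough sketch of the arithmetic makes the point. Assuming, purely for illustration, a fleet of 1,000 physical servers each drawing about 400 watts, and applying the roughly 10-to-1 consolidation ratio and 60 to 70 percent virtualized share Paquet describes, the reduction in IT load looks like this:

```python
# Rough illustration of why fewer physical machines means less energy.
# The 10:1 consolidation ratio and 60-70 percent virtualized share come
# from the Gartner figures quoted above; the fleet size and the 400 W
# average draw per server are assumptions for illustration only.

def remaining_servers(fleet: int, virtualized_share: float, ratio: int) -> int:
    """Physical servers left after consolidating a share of workloads at a given ratio."""
    virtualized = int(fleet * virtualized_share)
    hosts_needed = -(-virtualized // ratio)  # ceiling division
    return (fleet - virtualized) + hosts_needed

fleet, watts_per_server = 1_000, 400
for share in (0.60, 0.70):
    left = remaining_servers(fleet, share, ratio=10)
    saved_kw = (fleet - left) * watts_per_server / 1_000
    print(f"{share:.0%} virtualized at 10:1 -> {left} servers, "
          f"~{saved_kw:.0f} kW less IT load (before cooling savings).")
```

Under those assumptions the fleet shrinks by roughly half, and the cooling savings compound on top of the reduced IT load.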

Think of the big picture

Clever cooling mechanisms and infrastructure changes geared toward virtualization: adjustments like these can help IT managers get the biggest bang for the government’s buck out of their data centers.

But to make the most of the opportunities presented by consolidation, some argue that the key is a willingness to step back and view the data center not as a collection of separate pieces but as a single, integrated system.

“You need to take a holistic view of computing,” Hammond said. In re-engineering his organization’s assets, he pulled together a range of requirements including energy efficiency, power needs, cooling demands and a desire to reclaim waste heat.

To turn this broad view into facts on the ground, planners need to bring all relevant players together early on, including the IT team to build it, the facilities manager to run it and the finance specialists to pay for it all.

“What systems do you build, how will you operate the facility, where will the savings go – back to the IT budget, for example?” Hammond said. “You’ve got to get all three talking together from the start.” ■
