When blade technologist Ken Baker takes a look around the industry, he sees a lot of pieces in place already to make data centers energy efficient.
Chip makers are rolling out multicore processors that offer higher performance without spikes in power consumption, OEMs are using a combination of hardware and software features to make their servers more efficient, and power supply makers are offering products designed to save users money on energy costs.
What Baker isn't seeing is widespread adoption of these technologies by users, or data center designers making energy efficiency a priority.
"One bit of responsibility I want to throw back into the industry is on data center designers themselves," Baker, a blade system infrastructure technologist with Hewlett-Packard, said Aug. 14 during a panel discussion on green computing at an event here hosted by Advanced Micro Devices. Most facilities are designed based on data from the previous three to five years, he said, "and if you design it that way, you will get 3- to 5-year-old results."
All the panelists at the event were members of the Green Grid, a nonprofit consortium of industry players that is looking to curb the IT industry's ravenous power consumption.
Bruce Taylor, chief analyst at the Uptime Institute and moderator of the panel, spoke of an "economic meltdown of Moore's Law," saying that the energy efficiency of technology is not keeping pace with the growth in processing power.
"There is clearly a thermal density problem and a power problem in the data center," Taylor said.
He outlined the issue with numbers that are becoming familiar to an industry in which the costs of powering and cooling technology threaten to outstrip the cost of the technology itself. The IT industry currently consumes 1.5 percent of the energy in the United States, and that is expected to grow to 3 percent by 2010, making the industry the second largest industrial consumer of power, behind heavy manufacturing.
The price of building a new data center has quadrupled since 2000, in large part due to the costs associated with the infrastructure needed to power and cool the building. And Taylor said that in a report issued by the Aperture Research Institute in 2006, 40 percent of respondents said they had run out of space, power or cooling capacity in their data centers without having had sufficient warning.
John Tuccillo, marketing director at American Power Conversion, said that 47 to 51 percent of the power going into data centers is used to cool the facilities rather than to run the computing resources themselves.
The issue is getting attention at the federal level. The Environmental Protection Agency issued a report to Congress on Aug. 2 outlining the energy challenges facing the IT industry and steps that should be taken to mitigate the problems.
The panelists agreed that vendors need to work together to create a holistic approach to addressing the issue, a key driver behind the creation of the Green Grid in April 2006. In addition, they touted the numerous products and features that already offer enterprises ways to reduce their energy costs, from virtualization to multicore processors.
Baker spoke of HP's Dynamic Smart Cooling software, which dynamically manages power and cooling in data centers; HP is implementing the technology in the six new data centers it is building in the United States.