By Patrick Thibodeau, Computerworld | August 26th, 2014
IT managers may be too cautious about managing power, and businesses are unwilling to invest in efficiency, a study finds.
U.S. data centers are using more electricity than they need. It takes 34 power plants, each capable of generating 500 megawatts of electricity, to power all of the data centers in operation today. By 2020, the nation will need another 17 similarly sized power plants to meet projected data center energy demands as economic activity becomes increasingly digital.
Any increase in the use of fossil fuels to generate electricity will result in an increase in carbon emissions. But added pollution isn’t an inevitability, according to a new report on data center energy efficiency from the Natural Resources Defense Council (NRDC), an environmental action organization.
Nationwide, data centers in total used 91 billion kilowatt-hours of electrical energy in 2013, and they will be using 139 billion kilowatt-hours by 2020 — a 53% increase.
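The projected jump can be sanity-checked with quick arithmetic. A minimal sketch, using the article’s figures; the capacity-factor value is an assumption added for illustration, not a number from the report:

```python
# Back-of-envelope check of the article's growth and power-plant figures.
usage_2013 = 91e9   # kWh used by U.S. data centers in 2013 (from the article)
usage_2020 = 139e9  # projected kWh in 2020 (from the article)

increase_pct = (usage_2020 - usage_2013) / usage_2013 * 100
print(f"Projected increase: {increase_pct:.0f}%")  # roughly 53%

# Number of 500 MW plants needed, assuming a ~61% average capacity
# factor (an illustrative assumption; the article does not state one).
plant_kw = 500_000    # 500 MW expressed in kW
hours_per_year = 8760
capacity_factor = 0.61
plants_2013 = usage_2013 / (plant_kw * hours_per_year * capacity_factor)
print(f"500 MW plants needed in 2013: {plants_2013:.0f}")  # roughly 34
```

With that assumed capacity factor, the math reproduces both the 53% growth figure and the 34-plant equivalence cited above.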
This chart shows the estimated power usage (in billions of kilowatt-hours) and the cost of that power for U.S. data centers in 2013 and 2020, along with the number of power plants needed to support the demand. The last column shows carbon dioxide (CO2) emissions in millions of metric tons. (Source: NRDC)
The report argues that an improvement in energy efficiency practices by data centers could cut energy waste by at least 40%. The problems hindering efficiency include “comatose” servers, also known as ghost servers, which use power but don’t run any workloads; overprovisioned IT resources; lack of virtualization; and procurement models that don’t address energy efficiency. The typical computer server operates at no more than 12% to 18% of capacity, and as many as 30% of servers are comatose, the report states.
The paper tallies up the consequences of inattention to data center energy efficiency on a national scale. It was assembled and reviewed with help from several organizations, including Microsoft, Google, Dell, Intel, The Green Grid, Uptime Institute and Facebook — all of which made “technical and substantial contributions.”
The NRDC makes a sharp distinction between large data centers run by large cloud providers, which account for about 5% of all data center energy usage, and smaller, less-efficient facilities. Throughout the industry, there are “numerous shining examples of ultra-efficient data centers,” the study notes. These aren’t the problem, the paper argues; it’s the thousands of other mainstream business and government data centers, along with small corporate and multi-tenant operations.
The efficiency accomplishments of the big cloud providers “could lead to the perception that the problem is largely solved,” said Pierre Delforge, director of the NRDC’s high-tech sector on energy efficiency, but that perception doesn’t match reality when all data centers are taken into account.
Data centers are “one of the few large industrial electricity uses which are growing,” Delforge said, and they are a key factor in creating demand for new power plants in some parts of the country.
Businesses that move to colocation, multitenant data center facilities don’t necessarily make efficiency gains. Customers of such facilities may be charged according to a space-based pricing model, paying by the rack or by square footage, with a limit on how much power they can use before additional charges kick in. That model offers little incentive to operate equipment as efficiently as possible.
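The incentive gap in space-based pricing can be sketched with a toy billing model. All prices, caps, and rates below are illustrative assumptions, not figures from the article:

```python
# Sketch of why space-based colocation pricing blunts efficiency incentives.
# Every number here is an illustrative assumption, not from the article.
def space_based_cost(racks, kwh_used, rack_price=1500,
                     included_kwh=5000, overage_rate=0.10):
    """Pay per rack; power is free up to a cap, then billed per kWh."""
    overage = max(0, kwh_used - included_kwh)
    return racks * rack_price + overage * overage_rate

# A customer who cuts energy use 40% (4,000 -> 2,400 kWh) pays exactly
# the same bill, because usage stays under the included power cap.
before = space_based_cost(racks=2, kwh_used=4000)
after = space_based_cost(racks=2, kwh_used=2400)
print(before == after)  # True: no reward for running efficiently
```

Under this model the bill tracks floor space, not kilowatt-hours, so a 40% efficiency gain saves the customer nothing until usage crosses the overage threshold.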
The 91 billion kilowatt-hours of electricity that U.S. data centers consumed last year is, the report says, “enough to power all of New York City’s households twice over and growing.”
If companies adopted data center best practices, the report states, the economic benefits would be substantial. A 40% reduction in energy use, which the report says is only half of the technically possible reduction, would equal $3.8 billion in savings for businesses.
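As a rough consistency check on the savings figure (the per-kilowatt-hour rate below is derived, not stated in the article), the $3.8 billion works out to an average electricity price of roughly 7 cents per kilowatt-hour, in line with typical U.S. commercial rates:

```python
# Derive the electricity rate implied by the report's savings figure.
usage_2020 = 139e9  # projected kWh (from the article)
reduction = 0.40    # 40% cut in energy use
savings = 3.8e9     # dollars, per the report

implied_rate = savings / (reduction * usage_2020)
print(f"Implied average rate: ${implied_rate:.3f}/kWh")  # about $0.068
```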
The report also finds that energy efficiency progress is slowing. Once the obvious efficiency projects, such as isolating hot and cold aisles, are completed, additional investment in energy efficiency becomes harder to justify — either because of the cost or because of a perception that new initiatives might increase risk. IT managers are “extremely cautious” about implementing aggressive energy management programs because they’re concerned that such measures could threaten uptime, the report notes.
A number of measurements are used to gauge the efficiency of data centers, and the report recommends developing tools for determining CPU utilization, average server utilization and average data center utilization. It says that “broad adoption of these simple utilization metrics across the data center industry would provide visibility on the IT efficiency of data centers, thereby creating market incentives for operators to optimize the utilization of their IT assets.”
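A minimal sketch of how such utilization metrics might be computed from per-server CPU samples. The server names, sample values, and the threshold for flagging “comatose” machines are all assumptions for illustration; the report does not define these specifics:

```python
# Illustrative utilization metrics over per-server CPU samples (0.0-1.0).
# All names, values, and thresholds here are assumed, not from the report.
servers = {
    "web-01": [0.10, 0.15, 0.12],
    "db-01": [0.45, 0.50, 0.40],
    "ghost-7": [0.0, 0.0, 0.0],  # a "comatose" server: powered on, doing no work
}

def avg_utilization(samples):
    return sum(samples) / len(samples)

# Average utilization per server, then across the whole data center.
per_server = {name: avg_utilization(s) for name, s in servers.items()}
datacenter_avg = sum(per_server.values()) / len(per_server)

# Flag servers whose average utilization sits below an assumed cutoff.
COMATOSE_THRESHOLD = 0.01
comatose = [name for name, u in per_server.items() if u < COMATOSE_THRESHOLD]

print(f"Data center average utilization: {datacenter_avg:.0%}")
print(f"Comatose servers: {comatose}")
```

Even this toy version surfaces the report’s two headline symptoms: a fleet-wide average well under 20%, and idle machines that draw power without running any workload.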
The NRDC isn’t the first to look at this issue. In 2007, the U.S. Environmental Protection Agency, working with a broad range of data center operators and industry groups, released a report on data center power usage that found that the energy use of the nation’s servers and data centers in 2006 “is estimated to be more than double the electricity that was consumed for this purpose in 2000.” It called for energy efficiency improvements.