Concerns that data centers are demanding an ever-larger share of the nation's energy appear to be unfounded: a new report shows that innovations in the industry have led to a promising stabilization in energy usage.
Although the United States is seeing an increasing number of data centers, they are more energy-efficient than models of the past, according to the Department of Energy's Lawrence Berkeley National Laboratory. As a result, their collective energy consumption has grown less than previously anticipated.
Based on the study's results, data centers consumed about 70 billion kWh in 2014, roughly 1.8 percent of the total electricity used in the United States. That's only about a 4 percent increase from 2010. Over the preceding five years, from 2005 to 2010, the industry's consumption had grown 24 percent.
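As a rough sanity check, the 1.8 percent figure is consistent with total U.S. electricity use of roughly 3,900 billion kWh in 2014. That total is an assumption for illustration here, not a number taken from the report:

```python
# Sketch: back out the data centers' share of U.S. electricity.
# The 70 billion kWh figure comes from the report; the ~3,900 billion kWh
# national total is an assumed order-of-magnitude figure, not from the article.
data_center_kwh = 70e9    # data center consumption in 2014 (per the report)
us_total_kwh = 3.9e12     # assumed total U.S. electricity use in 2014

share_pct = data_center_kwh / us_total_kwh * 100
print(f"Data centers' share of U.S. electricity: {share_pct:.1f}%")  # ~1.8%
```

The computed share of about 1.8 percent matches the report's headline figure.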
According to the U.S. Data Center Energy Usage Report, which covers the period from 2010 to 2020, this low growth rate is expected to continue through 2020, even as data center installations grow by about 40 percent over that span.
"Over that decade, the amount of energy savings is about 620 billion kilowatt-hours, or more than $60 billion, thanks to efficient practices,” said Berkeley Lab researcher Arman Shehabi said in an article for Phys.org.
Reasons for increased energy efficiency
The factors that contributed to the increased energy efficiency centered on better cooling, cloud computing, and fewer idle servers, according to the researchers.
Shehabi noted that data centers of the past relied on air conditioning that blasted indiscriminately to keep the equipment cool. "Now there are more advanced cooling strategies, such as hot aisle isolation, economizers, and liquid cooling, which all make the cooling process far less energy intensive," he said.
Companies have also been consolidating equipment onto fewer servers and adopting virtualization through cloud services, both of which have reduced energy consumption.