Innovations for data centers, which consume massive amounts of natural resources to operate, have often focused on efficiency improvements that lessen their environmental footprint. While those innovations are reducing energy consumption, the federal government recently released a report that attempts to quantify just how much water the nation's data centers are using.
According to DataCenterKnowledge.com, the study primarily focused on energy consumption at data centers. However, it also accounted for the water consumed both directly in cooling data centers and indirectly in generating the electricity that powers them.
Here are some findings in the Department of Energy report:
– Data centers use more water indirectly, to generate the electricity that keeps them running, than they use directly to cool their equipment.
– In all, data centers throughout the United States consumed a combined 626 billion liters of water in 2014, a figure projected to reach 660 billion liters by 2020.
– About 1.8 liters of water is used on-site by the average data center for every 1 kWh it consumes, compared with an average of 7.6 liters of water needed to generate 1 kWh of electricity in the United States.
– Data center water consumption has been growing more slowly than it was in 2007.
– Large Internet companies such as Amazon, Google, and Facebook are increasingly driving energy-efficiency innovations, as they account for a significant share of total data center capacity.
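The per-kWh figures above allow a rough sanity check on the 626 billion liter total. The sketch below is an illustrative back-of-envelope calculation, not from the DOE report itself, and it assumes the combined total covers both on-site water use (1.8 L/kWh) and water used in electricity generation (7.6 L/kWh):

```python
# Back-of-envelope check on the report's figures (illustrative
# assumption: the combined water total spans both direct cooling
# use and upstream electricity-generation use).

DIRECT_L_PER_KWH = 1.8      # water used on-site per kWh consumed
GENERATION_L_PER_KWH = 7.6  # water used to generate 1 kWh of electricity
TOTAL_WATER_LITERS = 626e9  # combined U.S. data center water use, 2014

# Implied national data center electricity consumption under
# the assumption above:
implied_kwh = TOTAL_WATER_LITERS / (DIRECT_L_PER_KWH + GENERATION_L_PER_KWH)
print(f"Implied 2014 electricity use: {implied_kwh / 1e9:.0f} billion kWh")
# → Implied 2014 electricity use: 67 billion kWh
```

That implied figure of roughly 67 billion kWh is only a consistency check on the numbers quoted here; the report's own energy totals should be consulted for authoritative values.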
The DOE study was developed in collaboration with Stanford University, Northwestern University, and Carnegie Mellon University.
Want to learn why EMP shielding, FedRAMP certification, and Rated-4 data centers are important?
Download our infographic series on EMP, FedRAMP, and Rated-4!