In today’s data centers, efficient use of power is a driving factor that separates the leaders from the followers. Several techniques have emerged over the last decade that help data center facilities plan their operations to use minimal power while achieving maximum results.
One of the newer innovations in this area is enhanced DRAM memory technology. DRAM has become even more important now that most data centers offer virtualized environments: the server infrastructure needed for cloud and enterprise computing requires far more DRAM per server than traditional setups of previous years. Recent research conducted by leading industry vendors such as Samsung, Microsoft Technology Center (MTC), Fujitsu, Intel and others revealed the following results:
The study demonstrated the benefits of optimized 30nm-class DRAM technology versus traditional 50nm-class DRAM technology. The results were measured by building two setups identical in CPU, fan, disk drive storage and power supply, and comparing the throughput of each. Notably, the 30nm-class DRAM technology reduced power consumption by almost 20% in a virtualized environment.
If a data center runs, say, 1,000 of these efficient server units, it can cut its CO2 emissions by roughly 700 tons per year. The newer systems also deliver a considerable 60% saving in CRAC (Computer Room Air Conditioning) energy. It is worth noting that the exhaust air from the newer systems was around 1.5 degrees Celsius cooler than that of the older systems.
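To see how a per-server power saving scales to a fleet-level CO2 figure, here is a back-of-envelope sketch. The per-server wattage and grid emission factor below are illustrative assumptions of ours, not values published by the study:

```python
# Back-of-envelope estimate of annual CO2 savings from more efficient servers.
# OLD_SERVER_WATTS and CO2_KG_PER_KWH are assumed illustrative values,
# not figures from the study; only the ~20% saving comes from the article.

SERVERS = 1000
OLD_SERVER_WATTS = 800.0    # assumed average draw of a 50nm-class DRAM server
POWER_SAVING = 0.20         # ~20% reduction reported for 30nm-class DRAM
HOURS_PER_YEAR = 24 * 365
CO2_KG_PER_KWH = 0.5        # assumed grid emission factor

watts_saved_per_server = OLD_SERVER_WATTS * POWER_SAVING
kwh_saved_per_year = watts_saved_per_server / 1000 * HOURS_PER_YEAR * SERVERS
co2_tons_per_year = kwh_saved_per_year * CO2_KG_PER_KWH / 1000

print(f"Energy saved: {kwh_saved_per_year:,.0f} kWh/year")
print(f"CO2 avoided:  {co2_tons_per_year:,.0f} metric tons/year")
```

With these particular inputs the estimate lands near the ~700-ton figure quoted above; plug in your own server power draw and local grid emission factor to get a number for your facility.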
As you can see, keeping up with newer technology has its challenges in terms of procurement and replacement costs, but it also brings a multitude of benefits that make the upgrade worthwhile.
For more input on the impact of server and hardware technology on your data center performance, and before you make the choice of which data center is the best for your requirements, contact us at https://lifelinedatacenters.com.