Abdul Salam | Cloud Tweaks

Global warming and climate change sit at the top of the world’s list of concerns, and one of the reasons is our dependence on dirty energy from fossil fuels. We tend to treat the transportation sector as the main source of atmospheric pollution, but is the IT industry really exempt from blame?

Governments around the globe usually impose stringent standards on the energy consumption and emissions of factories and industrial facilities, yet the energy consumed by IT laboratories and data centers is largely overlooked, with the exception of some universities and research organizations. There are no standards or laws to follow when putting up such a facility, and that is a big problem. Research suggests that a great deal of energy is wasted in AC-to-DC conversion, and that it costs roughly twice as much energy to cool a server as to run it.

Let’s say a server is rated at 500 W and runs 24/7; that server would consume 4,380 kWh per year. Now assume ten of those servers are running at every IT company with more than 1,000 employees. The census bureau counted 2,916 such companies in 2007, and the number has likely grown substantially in the five years since. That works out to roughly 127,721 MWh of energy consumption in 2007 alone, with twice that amount spent on cooling those servers. This assumes every company used the same 500 W servers in the same way; the real-world figure could be even higher, because many older, less efficient servers are still running. Nor is all of that hardware fully utilized; underutilization is the biggest waste of resources. This is a serious concern, especially since most of that energy is not renewable.
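The arithmetic above can be checked with a quick back-of-the-envelope sketch. The figures come straight from the paragraph; the 2,916-company count is the 2007 census estimate cited there:

```python
# Back-of-the-envelope check of the server energy figures.

SERVER_WATTS = 500          # rated power per server
HOURS_PER_YEAR = 24 * 365   # 8,760 hours of continuous operation

# Annual consumption of one server, in kWh
kwh_per_server = SERVER_WATTS * HOURS_PER_YEAR / 1000   # 4,380 kWh

COMPANIES = 2916            # firms with >1,000 employees (2007 census figure)
SERVERS_PER_COMPANY = 10

total_kwh = kwh_per_server * COMPANIES * SERVERS_PER_COMPANY
total_mwh = total_kwh / 1000

# Cooling at twice the compute load adds another 2x on top
cooling_mwh = 2 * total_mwh

print(f"Per server:       {kwh_per_server:,.0f} kWh/year")
print(f"All companies:    {total_mwh:,.1f} MWh/year")
print(f"Cooling overhead: {cooling_mwh:,.1f} MWh/year")
```

Running the numbers this way gives about 127,721 MWh of compute energy per year, plus roughly twice that again for cooling.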

This is where cloud computing and virtualization come in to save the day, or decade (maybe even the millennium). Cloud computing uses virtualization to scale resources almost without limit while running on finite hardware. So instead of 2,916 companies each operating 10 servers spread across the U.S., those same companies could, in theory, be served by 10 cloud providers running 100 data centers. If each data center needed only the equivalent of 10 such servers, total annual consumption would drop to about 4,380 MWh, a small fraction of the distributed figure. How’s that for energy savings? And that does not even include the savings on cooling and other supporting infrastructure.
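Under the same assumptions, the two scenarios compare as follows. Note that the 100-data-center count and the 10-servers'-worth-of-load-per-center sizing are the article's hypothetical illustration, not measured values:

```python
# Compare the distributed scenario against a hypothetical consolidated one.

KWH_PER_SERVER = 4380  # a 500 W server running 24/7, from the earlier estimate

# Distributed: 2,916 companies x 10 servers each
distributed_mwh = 2916 * 10 * KWH_PER_SERVER / 1000

# Consolidated: 10 providers, 100 data centers,
# each handling ~10 servers' worth of load (hypothetical sizing)
consolidated_mwh = 100 * 10 * KWH_PER_SERVER / 1000

reduction = distributed_mwh / consolidated_mwh

print(f"Distributed:  {distributed_mwh:,.1f} MWh/year")
print(f"Consolidated: {consolidated_mwh:,.1f} MWh/year")
print(f"Reduction:    {reduction:.0f}x")
```

Even under these simplistic assumptions, consolidation cuts compute energy by roughly a factor of 29, before counting any cooling savings.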

