What has finally been recognized about cloud computing is that the system does not save energy, starting with the very thing that enables it: wireless connections. Whatever the cloud's other advantages, the power wasted by the wireless Internet networks used to access cloud services over cellular connections can be up to 10 times higher than that of typical data centers.
What was already more than a suspicion has now been confirmed by research from the Center for Energy-Efficient Telecommunications (CEET) at the University of Melbourne: according to the researchers, only 9% of the wasted power is attributable to data centers, while the remaining 91% comes from the 3G and 4G wireless networks used to access cloud computing services.
Does this acquit the data center of the charge of wasting power? Perhaps not entirely: heat dissipation and cooling in data centers still demand vigilance. Looking ahead, however, saving energy will require tackling the boom in wireless connectivity (and in particular the heat produced by mobile devices) driven by the spread of the cloud, and accepting that the real problem is not so much upload/download speed as power consumption.
According to the Center for Energy-Efficient Telecommunications' estimates, the spread of cloud computing will not save energy: it will drive the electricity consumption of wireless networks from 9.2 TWh, measured in 2009, to a projected 43 TWh in 2015 (an amount the researchers compare to the electricity Italy produced in all of 2012). Translated into CO₂ emissions, that is like putting 5 million new cars on the road (non-electric, of course).
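A quick back-of-the-envelope check puts the CEET projection in perspective. The sketch below, using only the two figures quoted above (9.2 TWh in 2009, 43 TWh in 2015), computes the overall growth factor and the implied compound annual growth rate; the formula is standard CAGR, not something stated in the study itself.

```python
# Figures quoted in the article: wireless-network electricity use
# measured in 2009 and projected by CEET for 2015.
e_2009 = 9.2   # TWh, 2009 consumption
e_2015 = 43.0  # TWh, 2015 projection
years = 2015 - 2009

# Overall multiplication over the period, and the compound
# annual growth rate (CAGR) that growth implies.
growth_factor = e_2015 / e_2009
cagr = growth_factor ** (1 / years) - 1

print(f"Overall growth: {growth_factor:.1f}x")          # ~4.7x
print(f"Implied annual growth rate: {cagr:.1%}")        # ~29.3% per year
```

In other words, the projection assumes wireless-network consumption growing by roughly 30% per year, which is why the researchers treat connectivity, rather than the data centers themselves, as the dominant term.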
Identifying the wireless connections that enable cloud computing as a major culprit in energy waste is a good step forward, but what is the solution for saving energy? The most obvious one, giving up wireless Internet, is also the least viable. Only one remains: improving the energy efficiency of the underlying technology infrastructure.