Feasibility, and the energy spend your code provokes. What?

GoogZilla

Dear Googy (aka Google) probably has more than 2.5 million servers across its 21 data centres. I have extrapolated: in 2011 it was approx 0.9 million, and I was conservative.


Let us assume one software component spends 0.1 milliwatts per 24 hours, and that it is deployed on each of those 2.5 million servers. Why 0.1 milliwatts? That is another extrapolation, on top of a measurement of 7 watts per 41 minutes for a server running 3 executables. We are talking here about one component; let's call it "one shared library".


0.1 milliwatts × 2,500,000 servers × 7 days × 52 weeks = 91,000,000 milliwatts = 91,000 watts per year
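
A minimal sketch of that arithmetic in Python; every input below is the assumption stated above, not a measurement of mine:

    # Back-of-the-envelope: fleet-wide yearly spend of one component.
    # All inputs are this article's assumptions, not measured values.
    DRAW_MILLIWATTS_PER_DAY = 0.1   # assumed spend of one component, per server, per 24 h
    SERVERS = 2_500_000             # extrapolated fleet size
    DAYS = 7 * 52                   # one year, counted as 52 weeks of 7 days

    total_milliwatts = DRAW_MILLIWATTS_PER_DAY * SERVERS * DAYS
    total_watts = total_milliwatts / 1_000
    print(f"{total_watts:,.0f} watts per year")   # -> 91,000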


That is for one unfortunate component. So if we cut its consumption in half, we have saved dear Googy some money: approx 45,500 watts per year less.
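
And the saving from halving it, continuing the sketch above under the same assumptions:

    # Halving that one component's consumption, same assumptions as above.
    total_watts = 91_000                          # fleet-wide spend from the calculation above
    print(f"{total_watts / 2:,.0f} watts per year saved")   # -> 45,500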

Now. Do we all understand what "feasibility" means in the context of the energy spend of IT? For that one future, feasible component, someone has applied a feasible architecture, design and implementation. Someone was thinking a lot about feasibility in the context of energy spend.

It is a WATT, not What.

Further reading:

Jagroep, E., van der Werf, J.M., Brinkkemper, S. et al. "Extending software architecture views with an energy consumption perspective". Computing 99, 553–573 (2017). https://doi.org/10.1007/s00607-016-0502-0