An interesting question just popped into my mind:
How many kWh were used to create the current blockchain?
It should somehow be possible to calculate, based on the mining hardware in use and the mining difficulty over time.
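(A rough closed form, under the big assumption that you could attach an efficiency figure eta_i, in joules per hash, to the hardware mix active at block i: the expected number of hashes to solve a block at difficulty D_i is D_i * 2^32, so total energy ≈ sum over i of D_i * 2^32 * eta_i joules, divided by 3.6e6 for kWh. The eta_i values are the hard part, as the replies below point out.)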
Possible to estimate, yes; but I don't see how it could be easy. That would require cost/efficiency estimates for every generation of miners: CPU (various optimized implementations and CPU models), GPU (ditto), FPGA, every generation of ASIC...
Does a collection of that historical data even exist anywhere? If you folks discussing this were to gather it in the first instance, the effort might be worth a research paper. Maybe one already exists (with data up to some past point).
What people forget in these kinds of calculations is everyone who mined but didn't win a block. Their work belongs in the estimate too, since that is exactly what drives the difficulty.
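To make the shape of such an estimate concrete, here is a minimal Python sketch. The era boundaries and joules-per-hash figures are made-up placeholders, not real historical data; note that counting expected hashes from difficulty automatically includes everyone who mined and lost, for exactly the reason above.

```python
# Minimal sketch: energy estimate from per-block difficulty plus an
# assumed hardware-efficiency timeline. All numbers below are
# ILLUSTRATIVE PLACEHOLDERS, not measured historical figures.

HASHES_PER_DIFFICULTY = 2 ** 32  # expected hashes to solve one block at difficulty 1

# (block height where the era starts, assumed efficiency in joules per hash)
ERAS = [
    (0,       1.0e-2),  # CPU era    (placeholder)
    (80_000,  1.0e-4),  # GPU era    (placeholder)
    (150_000, 1.0e-6),  # FPGA era   (placeholder)
    (210_000, 1.0e-8),  # early ASIC (placeholder)
]

def efficiency_at(height):
    """Assumed joules per hash for the era containing this block height."""
    j_per_hash = ERAS[0][1]
    for start, eff in ERAS:
        if height >= start:
            j_per_hash = eff
    return j_per_hash

def estimate_kwh(block_difficulties):
    """Total energy in kWh, given a list of difficulties indexed by height.

    Expected hashes per block = difficulty * 2**32, which counts the work
    of every miner on the network, winners and losers alike.
    """
    joules = 0.0
    for height, difficulty in enumerate(block_difficulties):
        joules += difficulty * HASHES_PER_DIFFICULTY * efficiency_at(height)
    return joules / 3.6e6  # 1 kWh = 3.6e6 joules

# Tiny example with made-up difficulties for three early blocks:
print(estimate_kwh([1.0, 1.0, 2.5]))
```

Feed it the real per-block difficulty series (any full node can export it) plus properly researched efficiency figures per era, and you'd have the dataset the research-paper comment is asking about.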