So, it takes 2 watts of power in a data centre to cool a computer that takes 1 watt of power to operate, and roughly 75% of that cooling power is spent on the computers' own fan systems.
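A quick back-of-the-envelope check of those figures (taking the 2:1 cooling ratio and the 75% fan share as given, not measured):

```python
# Assumed figures from the text: 1 W of compute needs 2 W of cooling,
# and 75% of that cooling power goes to the machines' own fans.
compute_w = 1.0
cooling_w = 2.0 * compute_w       # cooling power per watt of compute
fan_w = 0.75 * cooling_w          # share of cooling spent on fans
total_w = compute_w + cooling_w   # total draw per useful compute watt

print(f"total power per compute watt: {total_w:.1f} W")
print(f"cooling power spent on fans:  {fan_w:.2f} W")
```

In other words, on these numbers every useful watt of computing costs three watts at the meter, half of which is just moving air.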
How might we design a way to use the heat from computers to heat buildings? Powering US server farms costs an estimated $4.5 billion a year in utilities. Should they relocate to Iceland, where the ambient temperature is cooler?
Or consider that, statistically, 30% of all data held on computers is redundant. How might we design a system that turns off the computers guarding this data?
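One way such a system could identify redundant data is content hashing: if two machines hold byte-identical copies, every copy after the first is a candidate for powering down its host. This is only an illustrative sketch under that assumption; the host names and blobs are hypothetical, and a real system would hash files or disk blocks, not in-memory bytes:

```python
import hashlib
from collections import defaultdict

def redundant_copies(blobs):
    """Group (host, data) pairs by content hash. Hosts holding any copy
    beyond the first in a group are guarding redundant data and are
    candidates to be switched off. (Sketch only, not a real policy.)"""
    groups = defaultdict(list)
    for host, data in blobs:
        groups[hashlib.sha256(data).hexdigest()].append(host)
    # keep only groups with duplicates; report the hosts past the first
    return {h: hosts[1:] for h, hosts in groups.items() if len(hosts) > 1}

# Hypothetical example: two of three hosts hold the same bytes.
blobs = [("host-a", b"quarterly-report bytes"),
         ("host-b", b"quarterly-report bytes"),
         ("host-c", b"unique bytes")]
print(redundant_copies(blobs))
```

The design question then shifts from detection to policy: which duplicate counts as the "extra" one, and how do you wake a powered-down host when its copy is suddenly needed?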