Penn Engineers have developed a new chip that uses light waves, rather than electricity, to perform the complex math essential to training AI. The chip has the potential to radically accelerate the processing speed of computers while also reducing their energy consumption.
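The excerpt doesn't spell out what "complex math" means here. For neural networks, training and inference are dominated by matrix-vector multiplication, which is the operation photonic accelerators are typically built to perform with interfering light waves rather than transistor arithmetic. A minimal sketch of that core operation, in plain NumPy for illustration (the article does not confirm this chip's exact operation set, so treat this as an assumption):

```python
import numpy as np

# Toy forward pass of one dense neural-network layer. Training an AI model
# consists overwhelmingly of operations like this matrix-vector product --
# the kind of math a photonic chip could, in principle, carry out optically.
# (Assumption: photonic accelerators generally target matrix multiplication;
# the article does not detail this chip's specific capabilities.)

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 3))  # layer weights (4 outputs, 3 inputs)
x = rng.standard_normal(3)       # input activations

y = W @ x  # the dominant cost in both training and inference
print(y)
```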
Computers are starting to consume staggering amounts of electricity. There is a trade-off between the utility of the tasks they perform and the climate damage caused by generating the electricity they require. Bitcoin mining alone is thought to consume about 2% of America's electricity (roughly 80 TWh per year, assuming annual US generation of around 4,000 TWh) and seems an especially egregious waste of energy.
Radically reducing computers' electricity requirements even as they become more powerful should be seen as an urgent task.
Aren’t modern computers using far less energy per unit of work than before? We just keep adding compute faster than the per-operation energy use falls?
Yes and yes
Bitcoin mining places an even larger load on the grid in China. The invention of cryptocurrency has been a blight on the environment.