Surging interest in artificial intelligence systems will add further strain to global electricity grids, with the potential to rival the massive energy consumption of Bitcoin. Thankfully, the premier cryptocurrency has also shown us a way to mitigate the impact.

A doubling of data center revenue at Nvidia last quarter shows that demand for generative AI applications like ChatGPT hasn't yet hit its peak. The U.S. chipmaker is the key provider of shovels in this AI gold rush, but those processors are neither cheap nor lean. Its latest flagship, the GH200 Grace Hopper Superchip, is the size of a postcard yet draws up to 1,000 watts, as much as a portable heater.

Though most customers will opt for something less exotic than the Superchip, they buy processors in bulk and wire thousands of them together into massive AI clusters, and that is where the hunger for electricity really kicks in. One study published last year examined the energy required to train a single large language model built to output text in multiple languages.
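The scale of that hunger is easy to sketch with back-of-envelope arithmetic: multiply per-chip power by chip count and training time, then add data-center overhead. The figures below (10,000 chips, 1,000 watts each, a 30-day run, 1.2 overhead factor) are illustrative assumptions for a rough estimate, not numbers reported by Nvidia or the study.

```python
# Back-of-envelope estimate of the energy used by an AI training run.
# All inputs are illustrative assumptions, not reported figures.

def training_energy_mwh(num_chips: int, watts_per_chip: float,
                        days: float, pue: float = 1.2) -> float:
    """Total facility energy in megawatt-hours for one training run.

    pue (power usage effectiveness) scales chip power upward to
    account for cooling and other data-center overhead.
    """
    hours = days * 24
    chip_energy_wh = num_chips * watts_per_chip * hours
    return chip_energy_wh * pue / 1e6  # watt-hours -> megawatt-hours

# Hypothetical cluster: 10,000 chips at 1,000 W each, running 30 days.
print(round(training_energy_mwh(10_000, 1_000, 30), 1))  # -> 8640.0
```

Under these assumptions a single run consumes roughly 8,600 megawatt-hours, on the order of what a few hundred U.S. households use in a year, which is why aggregate data-center demand worries grid operators.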