I kinda hate these articles about the carbon cost of compute. They're utterly short-sighted: GPT-3 is infrastructure that creates huge efficiencies, so the cost is a bargain. Not to mention these are early days and we're driving the costs down every day.
Isn't it the case that training an ML model consumes vastly more power than using the trained model for inference?
If so, the problem may be somewhat overstated. Training is a dispatchable power load: it can be paused and scheduled for periods when surplus power is abundant, in whichever region currently has it. Training an ML model could literally follow the sun.
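A minimal sketch of what that kind of carbon-aware scheduling could look like, assuming a hypothetical get_grid_carbon_intensity() signal (stubbed here with random values) and an illustrative threshold; a real scheduler would query a grid operator's or cloud provider's carbon-intensity feed and checkpoint/resume the actual training job:

```python
"""Sketch of carbon-aware ("follow the sun") training scheduling.

Assumptions: get_grid_carbon_intensity() is a hypothetical stand-in for a
real grid/cloud carbon-intensity feed; the threshold and polling interval
are illustrative, not tuned values.
"""
import random
import time


def get_grid_carbon_intensity(region: str) -> float:
    """Hypothetical stub: returns grid carbon intensity in gCO2/kWh."""
    return random.uniform(50.0, 600.0)


def run_training_step() -> None:
    """Placeholder for one checkpointable unit of training work."""
    time.sleep(0.1)


def carbon_aware_training(region: str, total_steps: int,
                          threshold_gco2_per_kwh: float = 200.0,
                          poll_seconds: float = 60.0) -> None:
    """Run training only while the grid is 'clean enough'; otherwise pause.

    Training checkpoints cheaply, so it tolerates being paused and resumed,
    which is what makes it a dispatchable load.
    """
    done = 0
    while done < total_steps:
        if get_grid_carbon_intensity(region) <= threshold_gco2_per_kwh:
            run_training_step()
            done += 1
        else:
            # Grid is dirty/constrained right now: yield and re-check later.
            time.sleep(poll_seconds)


if __name__ == "__main__":
    carbon_aware_training("us-west", total_steps=10, poll_seconds=1.0)
```

The same gating logic could pick the cheapest/cleanest region rather than just pausing, which is closer to the "follow the sun" idea.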
Has someone already made a cryptocurrency where miners train an AI model as proof of work?
Energy source > Energy consumption.
Energy consumption is not a bogeyman, and we'll need to consume vastly more energy if we want to make people richer, healthier, and happier.
We need to pivot to cleaner sources, so the focus should be on that.
Would be great if the journos could stop demonizing the wrong thing.