Artificial intelligence is hitting a wall on energy, and as models grow larger, training them may soon require energy output comparable to that of a nuclear reactor, according to Akash Network founder Greg Osuri.
In an interview with Cointelegraph’s Andrew Fenton at Token2049 in Singapore, Osuri said the industry underestimates how quickly compute demand is multiplying, along with its environmental cost. He noted that data centers already consume hundreds of megawatts of fossil fuel-generated power.
Osuri warned the trend could trigger an energy crisis, raising household power bills and adding millions of tons of new emissions each year.
“We’re getting to a point where AI is killing people,” he said, pointing to health impacts from concentrated fossil fuel use around data hubs.
How decentralization could mitigate AI’s power problem
On Tuesday, Bloomberg reported that AI data centers are sending power costs surging in the US.
The report highlighted how data centers have contributed to rising household energy bills, with wholesale electricity costs up 267% over five years in areas near data centers.
Osuri told Cointelegraph that the alternative is decentralization. Instead of concentrating chips and energy in single mega data centers, he said distributed training across networks of smaller, mixed GPUs, from high-end enterprise chips to gaming cards in home PCs, could unlock efficiency and sustainability.
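To illustrate the idea, here is a minimal, hypothetical sketch (not Akash's actual protocol) of data-parallel training across heterogeneous workers: each node's batch size scales with its GPU capacity, and a coordinator averages the local gradients, weighted by the number of samples each node processed.

```python
# Hypothetical sketch of heterogeneous data-parallel training, not Akash's protocol.
# Each worker trains on a batch sized to its hardware; gradients are averaged
# by the coordinator, weighted by samples processed.
import numpy as np

rng = np.random.default_rng(0)

# Toy linear-regression task standing in for a model update step.
true_w = np.array([2.0, -1.0])

def make_batch(n):
    X = rng.normal(size=(n, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=n)
    return X, y

# Assumed mix of nodes: enterprise GPU, mid-range card, home gaming PC.
worker_batch_sizes = {"datacenter_gpu": 256, "prosumer_gpu": 64, "gaming_pc": 16}

w = np.zeros(2)
lr = 0.1
for step in range(200):
    grads, sample_counts = [], []
    for name, batch in worker_batch_sizes.items():
        X, y = make_batch(batch)              # each worker uses its own local data
        grad = 2 * X.T @ (X @ w - y) / batch  # local gradient of the MSE loss
        grads.append(grad)
        sample_counts.append(batch)           # weight contribution by samples seen
    w -= lr * np.average(grads, axis=0, weights=sample_counts)

print("recovered weights:", np.round(w, 3))   # converges to roughly [2, -1]
```

The weighting step matters because a gaming PC contributes far fewer samples per round than an enterprise GPU, so its gradient counts for proportionally less in the global update.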
“Once incentives are figured out, this will take off like mining did,” he said, adding that home computers may also eventually earn tokens by providing spare compute power.
This vision bears similarities to the early days of Bitcoin (BTC) mining, where ordinary users could contribute their processing power to the network and get rewarded in return. This time, the “mining” would be training AI models instead of crunching cryptographic puzzles.
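For a sense of how such rewards might be accounted for, here is a purely illustrative sketch, not Akash's actual token mechanics, that splits a per-epoch reward pool among contributors in proportion to the compute they report, measured here in GPU-hours.

```python
# Purely illustrative reward accounting, not Akash's actual token mechanics:
# split a reward pool among nodes in proportion to reported GPU-hours.
def distribute_rewards(pool_tokens: float, gpu_hours: dict[str, float]) -> dict[str, float]:
    total = sum(gpu_hours.values())
    if total == 0:
        return {node: 0.0 for node in gpu_hours}
    return {node: pool_tokens * hours / total for node, hours in gpu_hours.items()}

print(distribute_rewards(1000.0, {"home_pc": 5.0, "rig_a": 20.0, "rig_b": 75.0}))
# {'home_pc': 50.0, 'rig_a': 200.0, 'rig_b': 750.0}
```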
Osuri said this could give everyday people a stake in the future of AI while lowering costs for developers.
Not without its challenges
While he believes the potential is undeniable, Osuri said challenges remain. Training large-scale models across a patchwork of different GPUs requires breakthroughs in software and coordination, a problem he said the industry is only beginning to crack.
“About six months ago, several companies started demonstrating several aspects of distributed training,” Osuri said.
“No one has put all those things together and actually run a model.” He added that this could change “by the end of the year.”
Another hurdle is creating fair incentive systems. “The hard part is incentive,” Osuri said. “Why would someone give their computer to train? What are they getting back? That’s a harder challenge to solve than the actual algorithm technology.”
Despite these obstacles, Osuri insisted that decentralized AI training is a necessity. By spreading workloads across global networks, he said AI could ease pressure on energy grids, cut carbon emissions and create a more sustainable AI economy.