AI is ‘an energy hog,’ but DeepSeek could change that

DeepSeek startled everyone last month with the claim that its AI model uses roughly one-tenth the computing power of Meta’s Llama 3.1 model, upending an entire worldview of how much energy and resources it’ll take to develop artificial intelligence.

Taken at face value, that claim could have tremendous implications for the environmental impact of AI. Tech giants are rushing to build out massive AI data centers, with plans for some to use as much electricity as small cities. Generating that much electricity creates pollution, raising fears about how the physical infrastructure undergirding new generative AI tools could exacerbate climate change and worsen air quality.

Reducing how much energy it takes to train and run generative AI models could alleviate much of that stress. But it’s still too early to gauge whether DeepSeek will be a game-changer when it comes to AI’s environmental footprint. Much will depend on how other major players respond to the Chinese startup’s breakthroughs, especially considering their plans to build new data centers.

“There’s a choice in the matter.”

“It just shows that AI doesn’t have to be an energy hog,” says Madalsa Singh, …

Read the full story at The Verge.
