DeepSeek startled everyone last month with the claim that its AI model uses roughly one-tenth the computing power of Meta's Llama 3.1 model, upending an entire worldview of how much energy and resources it'll take to develop artificial intelligence.
Taken at face value, that claim could have tremendous implications for the environmental impact of AI. Tech giants are rushing to build out massive AI data centers, with plans for some to use as much electricity as small cities. Generating that much electricity creates pollution, raising fears about how the physical infrastructure undergirding new generative AI tools could exacerbate climate change and worsen air quality.
Reducing how much energy it takes to train and run generative AI models could alleviate much of that stress. But it's still too early to gauge whether DeepSeek will be a game-changer when it comes to AI's environmental footprint. Much will depend on how other major players respond to the Chinese startup's breakthroughs, especially considering plans to build new data centers.
"There's a choice in the matter."
"It just shows that AI doesn't have to be an energy hog," says Madalsa Singh, …