Companies

DeepSeek's Chatbot Innovation Raises Questions About AI Energy Consumption

Published January 29, 2025

The Chinese artificial intelligence startup DeepSeek has surprised markets and experts alike with its claim that it developed a highly popular chatbot at a fraction of the cost of offerings from American tech giants.

This assertion raises questions about the enormous investments U.S. technology companies are making in energy-intensive data centers they say are essential to advancing AI.

Could DeepSeek's chatbot be a sign that the world needs far less energy for AI than previously thought? The question matters for climate change: AI's energy demands are high and largely met by burning fossil fuels, which drives global warming. Several tech firms have reported rising electricity consumption that undercuts their earlier pledges to shrink their carbon footprints.

According to Eric Gimon, a senior fellow at Energy Innovation, "There has been a very gung ho, go ahead at all costs mentality in this space, pushing toward investment in fossil fuels. This is an opportunity to tap the brakes."

Experts believe that improving the energy efficiency of AI systems could reduce their environmental impact, even if the technology's overall electricity demand remains high.

DeepSeek's claims have driven people to its chatbot, propelling it to the top of the free app download list on Apple’s iPhone, surpassing American-developed chatbots like ChatGPT and Google’s Gemini.

Jay Woods, chief global strategist at Freedom Capital Markets, remarked, "All of a sudden we wake up Monday morning and we see a new player number one on the App Store, and all of a sudden it could be a potential gamechanger overnight. It caused a bit of a panic. These were the hottest stocks in the world."

DeepSeek’s app performs well compared to leading AI models, capable of writing software code, solving complex math problems, and providing detailed explanations for its answers.

Leading analysts are examining the public research papers DeepSeek released on its R1 model, and one detail stands out: the reported cost of training its flagship V3 model was just $5.6 million, a fraction of the billions spent on systems like ChatGPT. DeepSeek has yet to respond to requests for further information.

It is important to note that the $5.6 million figure covers only the training of the chatbot itself, not the earlier research and development costs. DeepSeek also had to work around U.S. export restrictions on advanced AI chips, relying instead on a lower-performance Nvidia chip that remains available for sale in China.

Concerns about data center energy consumption are mounting, with U.S. electricity use by data centers projected to double or triple by 2028. Data centers currently account for about 4.4% of all U.S. electricity use, a share expected to rise to between 6.7% and 12% by then, according to research from the Lawrence Berkeley National Laboratory.

Tech giants have long maintained that they must invest heavily in data centers and related infrastructure to develop and operate their AI systems. Meta Platforms, for instance, plans to spend up to $65 billion this year toward that goal, including on a major data center complex in Louisiana. Microsoft intends to invest $80 billion, while a consortium featuring the CEOs of OpenAI, Oracle, and SoftBank has announced plans to spend up to $500 billion on data centers and the energy production needed for AI, starting with a project in Texas.

Some experts argue that even if AI systems become more energy-efficient, the lower cost of using them will spur wider adoption and ultimately drive up total energy demand. Vic Shao, founder of DC Grid, said that when technology becomes both helpful and affordable, it attracts widespread usage.

Travis Miller, an energy and utilities strategist at Morningstar Securities Research, noted that while data centers will still be built, they may be run more efficiently, and he expects electricity demand growth to land toward the lower end of forecasts.

If DeepSeek's claims prove accurate, some routine AI tasks could shift from large data centers to smartphones, reducing the need for external computing power and buying time to build out renewable energy sources for data facilities, according to Rahul Sandil of MediaTek.

However, the recent developments have weighed on some AI-related stocks. Bloom Energy CEO KR Sridhar emphasized the importance of the U.S. leading in AI advancements, particularly because it can power data centers with cleaner energy than countries that rely on coal.

Rick Villars, an industry analyst at IDC, said that while DeepSeek's news could reshape how companies plan future AI development, the need for extensive data centers and the electricity to power them will remain. He expects the development to speed the integration of AI into daily life, including work and healthcare.

AI, Climate, Energy