OpenAI CEO Sam Altman has praised the debut of Chinese competitor DeepSeek’s AI model, saying it is “invigorating to have a new competitor.” In a social media post, Altman described DeepSeek’s R1 as “an impressive model, particularly in terms of what they deliver for the price.”
DeepSeek, a Chinese startup, is shaking up the AI industry, offering a chatbot that challenges traditional assumptions about the cost and energy demands of developing AI. The firm used around 2,000 Nvidia chips to develop its R1 model, a fraction of the computing power typically required for similar programs, which could significantly reduce both AI development costs and energy use for data centers.
The development comes at a time when AI’s energy demands have been increasing rapidly, with tech giants investing heavily in data centers to meet the growing needs of AI applications. Data centers are notorious for their massive energy consumption, but DeepSeek’s model offers a more power-efficient alternative that may disrupt the sector.
Following the announcement, energy stocks took a hit as investors feared that improved computing efficiency could reduce demand for energy from traditional sources. Constellation Energy, which had plans to build capacity for AI, saw its stock drop more than 20 percent.
Despite this, analysts like Morningstar’s Travis Miller caution that expectations about the impact of DeepSeek’s efficiency gains may be overstated. He believes that data centers, the reshoring of manufacturing, and broader electrification will remain significant drivers of energy demand.
Global energy consumption for data centers is projected to double by next year, with tech giants like Google, Microsoft, and Amazon already making significant investments in both nuclear and renewable energy sources to meet the demand. However, data centers still rely heavily on electricity grids, which are often dependent on fossil fuels.
DeepSeek’s model could lead to reduced energy consumption for AI if it replaces existing models like OpenAI’s. However, some experts, such as Andrew Lensen from Victoria University of Wellington, warn that efficiency improvements often end up increasing total demand, a phenomenon known as the Jevons paradox. As AI becomes more efficient and accessible, its use may skyrocket, so overall energy consumption could keep growing even as each individual query gets cheaper.
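As a rough, purely illustrative sketch of the Jevons effect, the arithmetic below uses hypothetical figures (not numbers from DeepSeek, OpenAI, or the analysts quoted): if energy per query falls sharply but usage grows even faster, total consumption still rises.

```python
# Back-of-envelope illustration of the Jevons paradox for AI energy use.
# All numbers are hypothetical placeholders, not measured figures.

energy_per_query_old = 10.0   # arbitrary energy units per query, before efficiency gains
energy_per_query_new = 1.0    # 10x more efficient per query after gains

queries_old = 1_000_000       # baseline query volume
queries_new = 20_000_000      # usage grows 20x as AI becomes cheaper and more accessible

total_old = energy_per_query_old * queries_old
total_new = energy_per_query_new * queries_new

print(f"Total energy before: {total_old:,.0f} units")
print(f"Total energy after:  {total_new:,.0f} units")
# Despite a 10x per-query efficiency gain, total consumption doubles here,
# because demand grew 20x -- the rebound effect Lensen describes.
```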
DeepSeek’s “chain-of-thought” model, which works through multiple reasoning steps to answer a query, consumes more energy per response than single-pass alternatives, but the model’s overall efficiency gains could still make it more popular. Lensen speculates that U.S. companies might learn from DeepSeek’s efficiency and build even larger, more capable models without increasing energy usage.
Source: Channels TV