OpenAI's ChatGPT Will Save The World, But Altman And Company Won't Release Energy Use Of Updated Versions
In mid-2023, if a user asked OpenAI's ChatGPT for a recipe for artichoke pasta or instructions on how to make a ritual offering to the ancient Canaanite deity Moloch, its response might have taken very roughly 2 watt-hours, or about as much electricity as an incandescent bulb consumes in 2 minutes. On Thursday, OpenAI released GPT-5, the model that now underpins the popular chatbot. Ask that version of the AI for an artichoke recipe, and the same amount of pasta-related text could take several times, even 20 times, that amount of energy, experts say.
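The bulb comparison is easy to sanity-check with a bit of arithmetic. The sketch below assumes a 60 W incandescent bulb (the rating that makes 2 watt-hours equal roughly 2 minutes of light) and applies the "several times, even 20 times" range the experts describe; the per-response figures are the article's rough estimates, not official OpenAI numbers.

```python
# Back-of-envelope check of the bulb comparison (a sketch, not OpenAI data).
# A 60 W bulb is assumed: 2 Wh / 60 W = 1/30 hour = 2 minutes.

BULB_WATTS = 60                 # assumed incandescent bulb rating
MID_2023_WH = 2.0               # rough per-response estimate cited above
GPT5_MULTIPLIERS = (2, 20)      # "several times, even 20 times"

def bulb_minutes(watt_hours: float, bulb_watts: float = BULB_WATTS) -> float:
    """Minutes an incandescent bulb could run on the given energy."""
    return watt_hours / bulb_watts * 60

print(f"~{MID_2023_WH:.0f} Wh -> {bulb_minutes(MID_2023_WH):.0f} bulb-minutes")
for m in GPT5_MULTIPLIERS:
    wh = MID_2023_WH * m
    print(f"~{wh:.0f} Wh -> {bulb_minutes(wh):.0f} bulb-minutes ({m}x)")
```

Under those assumptions, a 20-times-larger response works out to about 40 watt-hours, or roughly 40 minutes of the same bulb.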
As it rolled out GPT-5, the company highlighted the model's breakthrough capabilities: its ability to create websites, answer PhD-level science questions, and reason through difficult problems. But experts who have spent the past few years working to benchmark the energy and resource usage of AI models say those new powers come at a cost: a response from GPT-5 may take a significantly larger amount of energy than a response from earlier models.
OpenAI, like most of its competitors, has released no official information on the power usage of its models since GPT-3, which came out in 2020. Sam Altman, its CEO, tossed out some numbers on ChatGPT's resource consumption on his blog this June. However, these figures, 0.34 watt-hours and 0.000085 gallons of water per query, do not refer to a specific model and have no supporting documentation.
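Part of why undocumented per-query figures are hard to interpret is that they only become meaningful at fleet scale. The sketch below multiplies Altman's numbers by a purely hypothetical daily query count (not a disclosed OpenAI figure) to show how quickly the totals grow.

```python
# Rough scaling of the blog-post figures (0.34 Wh, 0.000085 gal per query).
# The query volume is a HYPOTHETICAL placeholder for illustration only.

WH_PER_QUERY = 0.34
GALLONS_PER_QUERY = 0.000085
ASSUMED_QUERIES_PER_DAY = 1_000_000_000   # hypothetical, not an OpenAI number

daily_mwh = WH_PER_QUERY * ASSUMED_QUERIES_PER_DAY / 1e6   # Wh -> MWh
daily_gallons = GALLONS_PER_QUERY * ASSUMED_QUERIES_PER_DAY

print(f"~{daily_mwh:,.0f} MWh and ~{daily_gallons:,.0f} gallons of water per day")
# At 1 billion queries/day, that is about 340 MWh and 85,000 gallons daily.
```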
In its benchmarking study in July, which looked at the power consumption, water usage and carbon emissions of Mistral's Le Chat bot, the startup found a one-to-one relationship between a model's size and its resource consumption, writing: "A model 10 times bigger will generate impacts one order of magnitude larger than a smaller model for the same amount of generated tokens." Jegham, Kumar and Ren said that while GPT-5's scale is significant, other factors will probably come into play in determining its resource consumption. GPT-5 is deployed on more efficient hardware than some previous models, and it appears to use a mixture-of-experts architecture, meaning it is streamlined so that not all of its parameters are activated when responding to a query, a construction that will likely cut its energy consumption.
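To make the mixture-of-experts point concrete, the sketch below contrasts a dense model with a same-sized MoE model that routes each token to only a couple of experts, so far fewer parameters (a rough proxy for compute, and hence energy) are touched per token. All sizes and expert counts are hypothetical, not GPT-5 specifications.

```python
# Minimal illustration of why mixture-of-experts (MoE) can cut per-query
# energy: only a subset of expert sub-networks is activated per token.
# All numbers are hypothetical, not GPT-5 specs.

from dataclasses import dataclass

@dataclass
class ModelConfig:
    total_params_b: float      # total parameters, in billions
    num_experts: int           # experts per MoE layer (1 = dense)
    experts_per_token: int     # experts each token is routed to

    def active_fraction(self) -> float:
        """Rough share of parameters touched per token."""
        return self.experts_per_token / self.num_experts

dense = ModelConfig(total_params_b=1000, num_experts=1, experts_per_token=1)
moe = ModelConfig(total_params_b=1000, num_experts=16, experts_per_token=2)

for name, cfg in [("dense", dense), ("MoE", moe)]:
    active = cfg.total_params_b * cfg.active_fraction()
    print(f"{name}: ~{active:.0f}B of {cfg.total_params_b:.0f}B params active per token")
```

In this toy comparison the MoE model touches about an eighth of its parameters per token, which is the sense in which such an architecture can offset some of the cost of a much larger model.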
https://www.theguardian.com/technology/2025/aug/09/open-ai-chat-gpt5-energy-use