T4K3.news
GPT-5 energy use raises questions about power needs
Experts say GPT-5 consumes more energy than prior models; benchmarking and disclosure are urged to inform future decisions.

Experts benchmarking AI energy use warn that GPT-5's larger size and multimodal features drive higher power demands.
GPT-5 energy use exceeds earlier models
Researchers at the University of Rhode Island measured GPT-5's energy use. They report an average draw of about 18 watt-hours for a medium-length response, rising to as much as 40 watt-hours for a response of roughly 1,000 tokens. The results indicate GPT-5 uses more energy than earlier models, a pattern expected as models grow larger.
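To see what these per-response figures mean at scale, a back-of-the-envelope calculation helps. The sketch below uses the reported 18 and 40 watt-hour figures; the query volume is a hypothetical assumption, not a number from the researchers or OpenAI.

```python
# Rough scaling of the reported per-response energy figures.
# The Wh values come from the University of Rhode Island measurements;
# the daily query volume is a purely hypothetical assumption.

WH_MEDIUM = 18.0   # reported average, watt-hours per medium-length response
WH_LONG = 40.0     # reported figure for a ~1,000-token response

def daily_energy_kwh(queries_per_day: float, wh_per_query: float) -> float:
    """Total daily energy in kilowatt-hours for a given query volume."""
    return queries_per_day * wh_per_query / 1000.0

# Hypothetical volume: 10 million medium-length queries per day.
print(daily_energy_kwh(10_000_000, WH_MEDIUM))  # 180000.0 kWh/day
```

At that assumed volume, medium-length responses alone would draw 180,000 kWh per day, which illustrates why per-query figures that look small quickly become material at fleet scale.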
Experts say the energy cost depends on the architecture; GPT-5 uses a mixture of experts to limit activation, which can reduce usage on some queries. But the model also handles reasoning and multimodal tasks, which tends to increase computation time and energy. OpenAI has not published official energy figures for GPT-5, prompting calls for clearer disclosure and for measuring environmental impact.
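The mixture-of-experts idea mentioned above can be sketched in a few lines. This is a generic illustration of top-k routing, not GPT-5's actual design, which OpenAI has not published: a router scores all experts for each token but activates only the top k, so per-token compute scales with k rather than with the total expert count.

```python
import random

# Generic top-k mixture-of-experts routing sketch (illustrative only;
# GPT-5's real architecture and expert counts are not public).

NUM_EXPERTS = 8  # hypothetical total number of experts
TOP_K = 2        # hypothetical number of experts activated per token

def route(scores):
    """Return the indices of the top-k scoring experts for one token."""
    ranked = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    return ranked[:TOP_K]

scores = [random.random() for _ in range(NUM_EXPERTS)]
active = route(scores)
print(f"{len(active)} of {NUM_EXPERTS} experts activated")  # 2 of 8
```

Because only 2 of 8 experts run per token in this sketch, activated compute is roughly a quarter of a comparably sized dense model, which is why experts note that the architecture can reduce energy use on some queries even as reasoning and multimodal workloads push it back up.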
Key Takeaways
"A more complex model like GPT-5 consumes more power both during training and during inference"
Quoted from a researcher explaining the link between model size and energy use
"If you use the reasoning mode, the amount of resources you spend for getting the same answer will likely be several times higher, five to 10"
Quoted from a collaborator discussing the extra cost of reasoning
"We call on OpenAI and other developers to use this moment to commit to full transparency by publicly disclosing GPT-5 environmental impact"
Call for openness and accountability
"It is more critical than ever to address AI’s true environmental cost"
Editorial push for action
The energy cost of AI scales with size and capability. GPT-5 embodies a prominent tension between ambition and cost. The industry will need credible benchmarks and transparent reporting to avoid eroding public trust.
There is a broader policy conversation ahead. Investors and regulators may push for standardized energy metrics, which could influence how future models are designed and deployed. As the field grows, the environmental footprint will shape both public perception and long term strategy for AI labs.
Highlights
- Big models demand big energy, math we cannot ignore
- Transparency on power use must keep pace with ambition
- If AI grows, electricity bills will grow with it
- Greener computing must be built in from day one
Environmental cost and transparency concerns around AI energy use
GPT-5 energy consumption highlights the environmental footprint of large AI models and the need for transparent benchmarks. As models grow, public and investor scrutiny may rise and policy responses could follow.
As AI grows, clear accounting of energy cost becomes a standard part of responsible innovation.