
OpenAI CEO Sam Altman isn’t worried about AI’s increasingly glaring resource consumption, arguing that humans require a lot of resources, too.
In an on-stage interview at the India AI Impact summit, he went on the defensive after he was asked about ChatGPT’s water needs.
He dismissed claims that the chatbot uses gallons of water per query as “completely untrue, totally insane,” according to a clip posted by The Indian Express, explaining that data centers powering ChatGPT have largely moved away from water-heavy “evaporative cooling” to prevent overheating.
Altman was then asked about the electricity needed for AI. In contrast to the issue of water, he claimed it was “fair” to bring up the technology’s energy requirements, saying “We need to move toward nuclear, or wind, or solar [energy] very quickly.”
But he argued that comparing AI’s power needs to a human’s isn’t exactly apples to apples.
“It also takes a lot of energy to train a human,” he said, prompting some in the crowd to laugh. “It takes, like, 20 years of life, and all of the food you eat during that time before you get smart.”
Altman went even further, noting that today’s humans wouldn’t exist without the ancestors who came before them, stretching back hundreds of thousands of years to when modern humans first emerged.
“Not only that, it took, like, the very widespread evolution of the 100 billion people that have ever lived and learned not to get eaten by predators and learned how to, like, figure out science or whatever to produce you,” he added.
When comparing humans to ChatGPT’s potential, you have to take this context into account, he argued. A fair comparison, he said, would pit the energy a human uses to answer a question against the energy an already-trained AI uses for the same task. On that measure, “probably, AI has already caught up on an energy efficiency basis measured that way.”
In a June 2025 blog post, Altman claimed each ChatGPT query takes about 0.34 watt-hours of electricity, roughly what an oven uses in about a second. However, he published this figure before OpenAI released its newest GPT-5 model and its subsequent upgrades. Energy consumption can also vary with a query’s complexity; answering a text question and generating an image, for instance, demand different amounts of power.
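Altman’s oven comparison is easy to sanity-check. The sketch below converts his 0.34 watt-hour figure to joules and divides by a range of oven wattages; the wattages are an assumption (household ovens typically draw roughly 1–3 kW while heating), not something Altman specified.

```python
# Sanity-check the "0.34 Wh per query ~= one second of oven use" claim.
# Oven wattages below are assumed typical values, not from Altman's post.
QUERY_WH = 0.34                  # watt-hours per query, per Altman's June 2025 post
query_joules = QUERY_WH * 3600   # 1 Wh = 3,600 J  ->  1,224 J per query

for oven_watts in (1000, 2000, 3000):
    seconds = query_joules / oven_watts
    print(f"{oven_watts} W oven: {seconds:.2f} s of operation per query")
```

For a 1–2 kW oven this works out to roughly 0.6–1.2 seconds per query, so “about a second” holds up under those assumptions.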
Experts have warned that AI as a whole will greatly increase its cumulative power and water consumption over the next 20 years or so. Overall, AI’s water usage is set to grow by about 130%, or by about 30 trillion liters (7.9 trillion gallons) of water through 2050, according to a January report by water technology company Xylem and market research firm Global Water Intelligence.
Over that same period, rising electricity demand is expected to increase the water used to generate power for data centers by about 18%, to roughly 22.3 trillion liters (5.8 trillion gallons) per year. Meanwhile, the ever more complex chips data centers rely on will need more water during manufacturing, driving that requirement up by about 600%, from roughly 4.1 trillion liters (1.1 trillion gallons) today to 29.3 trillion liters (7.7 trillion gallons) annually.
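The report’s liter figures can be cross-checked against their gallon equivalents with a standard conversion (1 US gallon = 3.78541 liters); small rounding differences against the quoted numbers are expected, since the report likely rounds from more precise underlying figures.

```python
# Cross-check the liter-to-US-gallon conversions for the figures quoted
# from the Xylem / Global Water Intelligence report.
L_PER_GAL = 3.78541  # liters per US gallon

figures_trillion_liters = {
    "AI water use growth through 2050": 30.0,
    "power-generation water use per year": 22.3,
    "chip-manufacturing water use, 2050": 29.3,
    "chip-manufacturing water use, today": 4.1,
}

for label, liters in figures_trillion_liters.items():
    gallons = liters / L_PER_GAL
    print(f"{label}: {liters} trillion liters = {gallons:.1f} trillion gallons")

# The jump in chip-manufacturing water use: roughly a sevenfold rise,
# i.e. an increase of about 600%.
growth_pct = (29.3 / 4.1 - 1) * 100
print(f"chip-manufacturing water growth: {growth_pct:.0f}%")
```

Note that 4.1 trillion liters converts to about 1.1 trillion US gallons, and 29.3 / 4.1 is roughly a 615% increase, consistent with the report’s “about 600%” framing.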
While OpenAI has moved away from evaporative cooling, 56% of all data centers globally still use the method in some form, according to the Xylem and Global Water Intelligence report.
OpenAI’s own 800-acre data center complex in Abilene, Texas, will reportedly still use water, albeit in a more efficient closed-loop system that continuously recirculates it for cooling, the Texas Tribune reported. The facility will initially draw 8 million gallons of water from the city of Abilene to fill its cooling system.