OpenAI CEO Sam Altman dismissed concerns that artificial intelligence consumes excessive amounts of water as “fake” – arguing that “humans use energy, too.”
The billionaire tech founder responded to claims widely circulated online that OpenAI’s ChatGPT uses gallons of water to spit out responses to simple queries.
Speaking on the sidelines of the India AI Impact summit on Friday, Altman called those claims about water usage “completely untrue, totally insane,” adding they have “no connection to reality,” according to the Indian Express.
Altman also pushed back on previous comments by Microsoft founder Bill Gates, who suggested that the efficiency of the human brain implies that AI can also become more energy efficient over time.
“One of the things that is always unfair in this comparison is people talk about how much energy it takes to train an AI model,” Altman said. “But it also takes a lot of energy to train a human.”
“It takes like 20 years of life, and all the food you eat before that time, before you get smart.”
He continued: “The fair comparison is if you ask ChatGPT a question, how much energy does it take once a model is trained to answer that question, versus a human, and probably AI has already caught up on an energy efficiency basis, measured that way.”
Though he dismissed concerns about water usage, Altman said fears around overall energy consumption are fair.
“Not per query, but in total – because the world is using so much AI … and we need to move towards nuclear or wind and solar very quickly,” he said.
By 2023, soon after the release of ChatGPT, electricity consumption by global data centers had already reached levels comparable to the entirety of Germany or France, according to a May report by the International Monetary Fund.
Altman quickly faced online backlash over his comments, as fellow techies blasted the comparison of humans to AI bots.
“I do not want to see a world where we equate a piece of technology to a human being,” billionaire Sridhar Vembu, co-founder of Indian software firm Zoho Corporation, wrote in a post on X following the summit.
Tech companies have been pouring billions of dollars into artificial intelligence, and governments are following suit.
President Trump recently unveiled a “tech corps” to spread American AI overseas and recruit and train engineers.
Last year, the Trump administration unveiled the Stargate project with an initial investment of $100 billion to build massive data centers throughout the US, though progress has reportedly been slow.
Environmental groups have been pushing back against government attempts to speed up the approval processes for data center construction.
Local communities have also been protesting new data centers in their neighborhoods over concerns the infrastructure will strain electricity grids, raise costs and pressure nearby water systems.
A proposed $1.5 billion data center project was rejected by the City Council in San Marcos, Texas, last week following public outcry.
Power-hungry data centers require large amounts of water to cool electrical systems and prevent overheating and fires – using roughly one bottle of water to respond to each 100-word AI prompt, according to scientists at the University of California, Riverside.
A mid-sized data center consumes about 300,000 gallons of water a day – roughly the same amount as 1,000 US households, according to a paper by research scientist Arman Shehabi at Lawrence Berkeley National Laboratory.
Some newer data centers don’t rely on water for cooling at all – but despite the advancement, the amount of water used for cooling is expected to more than triple over the next 25 years as computing demand soars, according to a report last month from Xylem and Global Water Intelligence.