Saturday, September 30, 2023

Energy consumption 'to dramatically increase' because of AI

Artificial intelligence is expected to have a bigger impact on practically everything than any technology since the advent of the internet. Wall Street sure thinks so: the tech-heavy Nasdaq (^IXIC) is up 26% year to date thanks to the frenzy over AI-related stocks.

But AI's big breakout comes at a cost: much more energy.

Take for example OpenAI's chatbot ChatGPT. Research from the University of Washington shows that hundreds of millions of queries on ChatGPT can cost around 1 gigawatt-hour a day, or the equivalent of the energy consumption of 33,000 US households.
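Those figures can be sanity-checked with back-of-envelope arithmetic. The sketch below assumes 200 million queries a day as a stand-in for "hundreds of millions" (the article does not give an exact count):

```python
# Back-of-envelope check of the figures quoted above.
# All inputs are assumptions drawn from the article, not measurements.
queries_per_day = 200_000_000    # assumed stand-in for "hundreds of millions"
daily_energy_wh = 1_000_000_000  # 1 gigawatt-hour, expressed in watt-hours
households = 33_000              # US households cited as the equivalent

wh_per_query = daily_energy_wh / queries_per_day
kwh_per_household_per_day = daily_energy_wh / households / 1_000

print(f"~{wh_per_query:.1f} Wh per query")                 # ~5.0 Wh
print(f"~{kwh_per_household_per_day:.1f} kWh per household per day")  # ~30.3 kWh
```

Roughly 30 kWh per household per day is in line with average US residential electricity use, so the article's household comparison holds up under its own numbers.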

"The energy consumption of something like ChatGPT inquiry compared to some inquiry on your email, for example, is going to [be] probably 10 to 100 times more power hungry,” Sajjad Moazeni, professor of electrical and computer engineering at UW, told Yahoo Finance.

Industry participants say this is only the very beginning of what's to come.

“We’re maybe at 1% of where the AI adoption will be in the next two to three years,” said Arijit Sengupta, founder and CEO of Aible, an enterprise AI solution company. “The world is actually headed for a really bad energy crisis because of AI unless we fix a few things.”

An energy-hungry Facebook data center under construction in Eagle Mountain, Utah, on October 5, 2021. (George Frey/Getty Images)

Data centers are the heart of advanced computing: physical facilities housing thousands of processing units and servers, and the core of a cloud computing industry largely run by Google, Microsoft, and Amazon.

"As you think of this shift towards these larger foundation models, at the end of the day you’re going to need these data centers to require a lot more energy as a whole," Angelo Zino, VP and senior equity analyst at CFRA Research, told Yahoo Finance.

Data centers have increasingly shifted from simpler processors, called CPUs, to more advanced graphics processing units, or GPUs. Those components, made by companies like Nvidia (NVDA), are far more energy intensive.

"For the next decade, GPUs are going to be the core of AI infrastructure. And GPUs consume 10 to 15 times the amount of power per processing cycle than CPUs do. They’re very energy intensive,” explained Patrick Ward, vice president of marketing for Formula Monks, an AI technology consulting company.
