In a recent thematic investment report, analysts at Barclays examined the energy demand accompanying the rise of artificial intelligence (AI), focusing specifically on NVIDIA’s role in this area.
The projected energy demand tied to advances in AI, the analysts argue, is a key aspect of NVIDIA’s market outlook.
According to a Barclays analysis, data centers could consume more than 9% of current U.S. electricity demand by 2030, with the power requirements of AI being a major driver. The analysts noted that “the power of AI embedded in the NVIDIA consensus” is one of the key factors behind this dramatic energy projection.
The report also notes a tension: each new generation of GPUs, such as NVIDIA’s Hopper and Blackwell series, is more energy efficient than the last, yet AI models are growing rapidly in size and complexity. The size of major large language models (LLMs), for example, is increasing by about 3.5x per year. As a result, and as the range of AI applications expands, overall energy demand is still expected to rise despite these efficiency gains.
“Large language models (LLMs) require enormous computational power for real-time performance,” the report states. “The computational demands of LLMs also translate into increased energy consumption as more and more memory, accelerators, and servers are needed to fit, train, and infer these models.”
“Organizations looking to deploy LLM for real-time inference will need to address these challenges,” Barclays added.
To give an idea of the scale of this demand, Barclays estimates that powering roughly 8 million GPUs would require about 14.5 gigawatts of electricity, equivalent to roughly 110 terawatt-hours (TWh) of energy per year. This projection assumes an average load factor of 85%.
Barclays expects roughly 70% of these GPUs to be deployed in the United States by the end of 2027, equating to more than 10 gigawatts and 75 TWh of AI power and energy demand in the U.S. alone within the next three years.
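As a back-of-the-envelope check on these figures, the conversion from power to annual energy follows directly from the report's stated 85% load factor and an 8,760-hour year (the 14.5 GW draw and 70% U.S. share are the report's estimates, not independent data):

```python
# Sanity check of the Barclays power/energy figures quoted above.
HOURS_PER_YEAR = 8760   # 365 days * 24 hours
LOAD_FACTOR = 0.85      # average utilization assumed in the report

power_gw = 14.5         # estimated draw of ~8 million GPUs (report figure)

# Energy (TWh) = power (GW) * hours * load factor / 1000
annual_twh = power_gw * HOURS_PER_YEAR * LOAD_FACTOR / 1000
print(f"Annual energy: {annual_twh:.0f} TWh")  # ~108 TWh, in line with the ~110 TWh cited

us_share = 0.70         # share of GPUs deployed in the U.S. by end-2027 (report figure)
print(f"U.S. power demand: {power_gw * us_share:.1f} GW")      # just over 10 GW
print(f"U.S. annual energy: {annual_twh * us_share:.0f} TWh")  # ~76 TWh
```

The computed ~108 TWh and ~76 TWh line up with the report's rounded figures of 110 TWh globally and 75 TWh for the U.S.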
“NVIDIA’s market capitalization suggests this is just the beginning of the rollout of AI power demands,” the analysts said, adding that the chipmaker’s ongoing development and deployment of GPUs will significantly increase overall data center energy consumption.
Furthermore, because data centers draw on grid power and operate around the clock, utilities must be able to meet their peak demand and provide a consistently balanced power supply.
The report quotes a notable statement by OpenAI CEO Sam Altman at the Davos World Economic Forum: “We need a lot more energy in the world than we previously thought. I don’t think we understand the energy demands of this technology yet.”