Surging AI Demand Could Quadruple US Data Center Power Needs by 2030, Prompting New Energy Infrastructure Push


The rapid buildout of artificial intelligence infrastructure is poised to reshape U.S. energy consumption over the next decade, with data centers expected to require substantially more electricity as AI adoption widens.

The AI boom is straining the power system, and utilities face mounting pressure as demand from computing facilities climbs.

By some projections, U.S. AI data centers could draw four times as much power by 2030, underscoring the scale of infrastructure that will be needed to support future computing demand.

New Institute to Address AI Energy Demands

To address the growing energy demands, a U.S. national laboratory has started a major project to handle the rising electricity needs of AI-powered facilities.

Interesting Engineering reported that Oak Ridge National Laboratory (ORNL) has announced the creation of the Next-Generation Data Centers Institute (NGDCI), which aims to develop infrastructure capable of supporting future AI systems without compromising reliability or efficiency.

The institute will bring together ORNL’s knowledge in energy, computing, grid science, and cybersecurity to create systems that are secure, efficient, and reliable.

The Electric Power Research Institute says data centers already consume more than 4% of U.S. electricity, a share that could reach 17% by 2030.

AI Workloads Are the Primary Driver

Most of this growth comes from artificial intelligence operations. Training and running large AI models require enormous computing power, which translates into high electricity use.

The NGDCI project plans to solve these problems by working on cooling systems, power management, and grid operations. The goal is to keep future AI infrastructure sustainable.

Reuters also reported that more AI facilities are causing electricity use to rise across the country, as new data centers are built to handle more complex models.

Balancing Innovation With Grid Stability

The growth of AI infrastructure brings both opportunities and risks. Utilities must now work out how to meet higher demand without destabilizing the grid, especially as more data centers and generative AI projects come online.

NGDCI’s mission is to make sure new facilities meet long-term needs for grid reliability and security.

A Growing Infrastructure Imperative

As more people use AI, the link between computing infrastructure and energy systems is becoming more important.

Rising demand and new projects like NGDCI show that there is a growing effort to match technology advances with real energy needs. This challenge will likely shape the future of both AI and power infrastructure.
