High power, water demand threatens AI progress in US amid aging grid

Written by Sophia Feona Cantiller

Published 30 Jul 2024

Fact checked by Stephane Bandeira


As artificial intelligence (AI) companies in the United States race to advance their models and technology, a major hurdle stands in their way: huge power demand on an aging electric grid threatens to stall progress.

In recent years, multiple reports have pointed to rising power and water consumption in data centers, which continue to grow in number. The servers that run AI products require enormous computing power and large volumes of water for cooling.

This has prompted many tech giants to look for ways to improve computing efficiency and reduce power use. Google, Microsoft, Oracle, and Amazon already use Arm-based low-power processors, cutting consumption in their data centers by up to 15%.

Likewise, Nvidia has claimed that its latest AI chips can run models on 25% less power than previous generations.

However, despite these efforts, the energy problem persists and continues to harm the environment.

More Data Centers, More Emissions

In 2019, training a single large language model was estimated to emit as much carbon dioxide as five gas-powered cars over their entire lifetimes.

Today, that figure has risen by close to 50%, driven partly by energy consumption in data centers. Microsoft's greenhouse gas emissions also grew 30% from 2020 to 2024.

Most of the roughly 8,000 data centers worldwide are located in the US, and more will be built as AI rapidly develops. By 2030, these data centers are expected to account for 16% of the country's total power consumption, a huge leap from only 2.5% before ChatGPT was rolled out in 2022.
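To put that projection in perspective, a quick back-of-the-envelope calculation shows how large the jump is. The 2.5% and 16% shares come from the figures above; everything else is simple arithmetic:

```python
# Shares of total US power consumption attributed to data centers,
# per the figures cited in the article.
share_pre_chatgpt = 0.025  # ~2.5% before ChatGPT's 2022 launch
share_2030 = 0.16          # projected 16% by 2030

# How many times larger the data center share is projected to become.
growth_multiple = share_2030 / share_pre_chatgpt
print(f"Projected growth in share: {growth_multiple:.1f}x")  # 6.4x
```

In other words, the data center slice of US electricity demand is projected to grow more than sixfold in under a decade.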

Vantage Data Centers is currently in high demand among hyperscalers, as its facilities can each draw upward of 64 megawatts of power.

Exploring Other Power Sources

While many tech firms turn to high-powered data centers to meet energy demands, others are exploring ways to produce electricity on-site.

OpenAI is looking to harness solar energy, having invested in a solar startup that builds large-scale modules combining panels and power storage. The ChatGPT maker has also invested in Oklo and Helion to leverage nuclear reactors for energy production.

Further, Google plans to utilize geothermal energy to power its data center, while Vantage Data Centers constructed a natural gas plant to run one of its data centers off the grid in Virginia.

Aging Grid

However, generating electricity is not the only problem faced by AI companies. Power transmission from generation sites to consumers has also been an issue due to the aging US grid.

A proposed solution is adding more transmission lines, but that would be time-consuming and expensive, and could result in higher electricity bills.

VIE Technologies has developed an alternative: small sensors attached to transformers that predict the failures and outages that can occur when transformers cannot handle large loads. According to CEO Rahul Chaturvedi, this option is more feasible than replacing millions of transformers across the US.
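The article does not describe VIE's actual algorithm, but the general pattern behind sensor-based predictive maintenance is simple: compare each new reading against a rolling baseline of recent history and flag sharp deviations. A minimal sketch of that idea, with all names and thresholds hypothetical, might look like this:

```python
from collections import deque
from statistics import mean, stdev

def make_anomaly_detector(window=20, z_threshold=3.0):
    """Flag readings that deviate sharply from a rolling baseline.

    A toy stand-in for transformer health monitoring: a real system
    would fuse vibration, temperature, and load data, but the core
    pattern (compare new readings against recent history) is the same.
    """
    history = deque(maxlen=window)

    def check(reading):
        anomalous = False
        if len(history) >= window:
            mu, sigma = mean(history), stdev(history)
            # Large z-score => possible overload or failure precursor.
            if sigma > 0 and abs(reading - mu) / sigma > z_threshold:
                anomalous = True
        history.append(reading)
        return anomalous

    return check

# Usage: steady load readings, then a spike the baseline can't explain.
detect = make_anomaly_detector()
readings = [100.0 + (i % 3) for i in range(25)] + [160.0]
flags = [detect(r) for r in readings]
print(flags[-1])  # the spike is flagged as anomalous
```

The appeal of this approach, as Chaturvedi suggests, is that monitoring existing hardware is far cheaper than wholesale replacement: the sensors only need to warn operators before a transformer fails under load.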

After Electricity, Now Water

In addition to all these challenges, water shortages must also be addressed. Generative AI data centers are projected to need around 4.2 billion to 6.6 billion cubic meters of water for cooling.

Vantage has found a workaround by using large air conditioning units to cool its data center in Santa Clara. More recently, the company has adopted another solution: direct-to-chip liquid cooling.

“For a lot of data centers, that requires an enormous amount of retrofit. In our case at Vantage, about six years ago, we deployed a design that would allow for us to tap into that cold water loop here on the data hall floor,” a Vantage official said.

Meanwhile, Samsung, Qualcomm, and Apple have focused on enabling users to access on-device AI to keep their questions off the cloud and out of the companies’ data centers.