Fast-growing and energy-hungry, AI needs ever more electricity. Microsoft has just found a “clean” and sustainable solution with the announced revival of the infamous Three Mile Island nuclear power plant. But at what cost?
September 20, 2024 is a date to mark in the accelerated calendar of Artificial Intelligence. That Friday, the American energy company Constellation announced the restart of reactor 1 at the Three Mile Island power plant, the site of a historic nuclear accident in 1979 and closed since 2019.
This revival is the result of an unexpected deal between the plant's owner and Microsoft, which wants to secure the power supply of its data centers and, more broadly, of its AI-related needs. Under this 20-year contract, the undamaged part of the Pennsylvania plant is due to come back online in 2028 (if the regulatory authorities approve the agreement) and should then supply the IT multinational with some 837 megawatts of carbon-free power, roughly what is needed to supply a city like Atlanta.
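To give an order of magnitude for that figure, here is a minimal back-of-envelope sketch (not from the article) converting a constant 837 MW output into annual energy, under the simplifying assumption that the reactor runs flat out all year:

```python
# Back-of-envelope check (assumption: the reactor runs at full
# capacity all year, which real plants only approximate).
capacity_mw = 837          # contracted capacity, in megawatts
hours_per_year = 365 * 24  # 8,760 hours

annual_mwh = capacity_mw * hours_per_year  # megawatt-hours per year
annual_twh = annual_mwh / 1_000_000        # 1 TWh = 1,000,000 MWh

print(f"{annual_twh:.1f} TWh per year")    # ~7.3 TWh/year
```

Roughly 7 TWh per year under that idealized assumption, which gives a sense of the scale of electricity Microsoft is reserving for itself.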
Nuclear power, the answer to the ongoing need for low-carbon energy
With this agreement, Microsoft wants to meet the growing energy needs of its data centers, particularly those in Chicago, Virginia, Ohio and Pennsylvania.
“Powering industries critical to our nation’s global economic and technological competitiveness, including data centers, requires an abundance of reliable, carbon-free energy every hour of the day. Nuclear power plants are the only energy sources capable of consistently delivering on this promise,” explained Joe Dominguez, CEO of Constellation, on Friday.
Bobby Hollis, Microsoft's vice president of energy, added:
"This agreement constitutes a major step in the efforts made by Microsoft For become carbon neutral, in accordance with our commitment.”
$100 billion to power AI
The firm founded by Bill Gates also announced last week that it had reached another agreement, with asset manager BlackRock among others, to invest no less than 100 billion dollars in AI-related infrastructure: building or expanding data centers, but also “clean” energy generation, with priority given to renewables, again to meet the need for decarbonized electricity.
In France, the tech giant had already announced last May, during the "Choose France" summit, that it would invest 4 billion euros in its data centers to meet the needs of AI.
Global explosion in electricity consumption
This announcement serves as a reminder of the problem facing AI as its use intensifies: its exponential need for energy. According to Bloomberg, the number of data centers, which notably power AI, more than doubled between 2015 and 2024, rising from 3,500 worldwide to 7,100. The resulting energy consumption has exploded, reaching 350 TWh in 2024, the equivalent of the annual electricity consumption of 52 million (Western) households! It is also well above the output of all 56 French nuclear reactors: 320 TWh over the whole of 2023.
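As a quick plausibility check of the figures quoted above (an illustrative calculation, not from the article), dividing 350 TWh by 52 million households gives the implied consumption per household:

```python
# Rough consistency check of the quoted figures (illustrative only).
data_center_twh = 350    # estimated global data-center consumption, 2024
households = 52_000_000  # number of (Western) households cited

kwh_per_household = data_center_twh * 1e9 / households  # 1 TWh = 1e9 kWh
print(f"{kwh_per_household:,.0f} kWh per household per year")  # ~6,700 kWh
```

Around 6,700 kWh per household per year, which sits within the plausible range for annual electricity use in a Western household, so the two figures are at least consistent with each other.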
Some experts predict that by 2026, data centers will absorb the equivalent of Japan's electricity consumption! And according to Ami Badani, chief marketing officer at ARM, a leader in chip design, who sounded the alarm last April at a conference in London, this "insatiable demand for energy" will lead AI to consume, by 2030, the equivalent of a quarter of the electricity produced in the United States! A real global challenge that Microsoft, clearly anticipating its need for energy independence, is preparing for. What will the others do? How many nuclear power plants will it take to power all this AI?
Why is AI such a power-hungry technology?
Why so much need for electricity? According to the International Energy Agency (IEA), approximately 40% of the electricity consumed by data centers is used to power the servers, while another 40% is used to cool them!
The second key reason for this energy inflation: each query submitted to a generative AI consumes about ten times more energy than one run through a classic search engine, because it instantly calls on thousands of electronic chips. Hence the call from some experts to limit the use of generative AIs such as ChatGPT, Dall-E, Gemini, Copilot…
Not to mention the training phase of generative AIs, which mobilizes thousands of electronic chips across hundreds of data centers over long periods of time to learn, understand, and solve problems...
Who will pay?
While some point out or predict that AI is already saving energy, or soon will be, we are still very far from a positive balance. Especially since AI is only in its early stages and is growing at lightning speed.
We therefore arrive at a paradox: at a time when cutting greenhouse gas (CO2) emissions is an absolute necessity, AI is urgently imposing its own solution: the revival of nuclear power.
Microsoft's spectacular initiative is probably just the beginning. It also raises the question: who will ultimately pay for all these colossal investments, if not the end user?...
Suggested reading related to this article: Frugal AI, the first international benchmark for responsible and sustainable AI.