Technology leaders are looking for energy alternatives as AI energy consumption increases

  • AI models consume enormous amounts of energy and increase greenhouse gas emissions.
  • Tech companies and governments say an energy revolution must take place to keep up with the pace of AI development.
  • Many AI leaders are championing nuclear energy as a possible solution.

Advances in AI technology are sending shockwaves through the power grid.

The latest generation of large language models requires significantly more computing power and energy than previous AI models. As a result, technology leaders are committed to accelerating the energy transition, including investing in alternatives such as nuclear energy.

Major technology companies have committed to advancing net zero targets in recent years.

Meta and Google aim to achieve net zero emissions across all operations by 2030. Likewise, Microsoft aims to be “carbon negative, water positive and zero waste” by 2030. Amazon aims to achieve net carbon zero across all operations by 2040.

Major technology companies, including Amazon, Google and Microsoft, have also recently struck deals with nuclear energy suppliers to advance AI technology.

“Energy, not computing power, will be the biggest bottleneck to AI progress,” Meta CEO Mark Zuckerberg said in a podcast in April. Meta, which developed the large open-source language model Llama, uses a lot of energy and water to power its AI models.

Chip developer Nvidia, which became one of the world’s most valuable companies this year, has also stepped up its efforts to increase energy efficiency. Its next-generation AI chip, Blackwell, introduced in March, is said to be twice as fast as its predecessor, Hopper, and significantly more energy efficient.

Despite these advances, Nvidia CEO Jensen Huang said that investing significant energy in AI development is a long-term play that will pay off as AI becomes smarter.

“The goal of AI is not training. The goal of AI is inference,” Huang said at a talk at the Hong Kong University of Science and Technology last week, referring to the way an AI model applies its knowledge to draw conclusions from new data.

“Inference is incredibly efficient and can discover new ways to store carbon dioxide in reservoirs. Maybe it could discover new wind turbine designs, maybe it could discover new materials for storing electricity, maybe more effective materials for solar panels. We should use AI in so many different areas to save energy,” he said.

Transition to nuclear energy

Many technology leaders argue that the need for energy solutions is urgent and investment in nuclear energy is necessary.

“There is no way to get there without a breakthrough,” OpenAI CEO Sam Altman said at the World Economic Forum in Davos in January.

Altman has shown particular interest in nuclear energy. He has invested $375 million in the nuclear fusion company Helion Energy and holds a 2.6% stake in Oklo, which develops modular nuclear fission reactors.

The momentum behind nuclear energy also depends on government support. President Joe Biden has been a supporter of nuclear energy, and his administration announced in October that it would invest $900 million to fund next-generation nuclear technologies.

Clean energy investors say government support is key to advancing a national nuclear agenda.

“The growing demand for AI, particularly at the inference level, will dramatically change electricity consumption in the US,” Cameron Porter, general partner at venture capital firm Steel Atlas and investor in nuclear energy company Transmutex, told Business Insider by email. “However, it will only advance net zero targets if we can resolve two key regulatory bottlenecks – faster nuclear licensing and access to grid connections – and address the two biggest challenges facing nuclear energy: high-level radioactive waste and fuel sourcing.”

Porter expects the new Trump administration to take steps to advance the cause.

“Despite these challenges, we expect the regulatory issues to be resolved because, ultimately, AI is a matter of national security,” he wrote.

AI’s energy consumption is growing

Technology companies are looking for new energy solutions because their AI models consume a lot of energy. ChatGPT, built on OpenAI’s GPT-4, uses more than 17,000 times the electricity of an average U.S. household to answer hundreds of millions of queries daily.
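For a rough sense of scale, the sketch below converts that comparison into absolute terms. It is a minimal back-of-the-envelope calculation, assuming an average U.S. household uses about 10,500 kWh of electricity per year (a figure in line with EIA estimates, not cited in the article):

```python
# Back-of-the-envelope check on the "17,000 households" comparison above.
# Assumption (not from the article): an average U.S. household uses roughly
# 10,500 kWh of electricity per year.
AVG_HOUSEHOLD_KWH_PER_YEAR = 10_500
HOUSEHOLD_EQUIVALENTS = 17_000

daily_kwh_per_household = AVG_HOUSEHOLD_KWH_PER_YEAR / 365   # about 29 kWh/day
chatgpt_daily_kwh = HOUSEHOLD_EQUIVALENTS * daily_kwh_per_household

print(f"Roughly {chatgpt_daily_kwh / 1_000:,.0f} MWh of electricity per day")
```

On those assumptions, the comparison works out to roughly 490 MWh of electricity per day.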

By 2030, data centers – which support the training and deployment of these AI models – will account for 11-12% of US electricity demand, up from 3-4% currently, a McKinsey report says.

Technology companies have turned to fossil fuels to meet short-term needs, leading to a rise in greenhouse gas emissions. For example, Google’s greenhouse gas emissions rose 48% between 2019 and 2023, “primarily due to increases in data center energy consumption and supply chain emissions,” the company said in its 2024 sustainability report.
