AI’s rapidly rising power consumption has sparked heated debate: will it lead to energy shortages?


As artificial intelligence develops, its rapidly rising electricity consumption is drawing global attention. Will there be an “electricity shortage” in the future? Will energy consumption become a “stumbling block” to the development of AI?

  Consumption keeps climbing: is AI becoming a “power hog”?

At present, artificial intelligence technology is developing rapidly and demand for chips has risen sharply, driving a surge in demand for electricity. Public data shows that the power draw of the global data center market has grown from 10 gigawatts a decade ago to 100 gigawatts today.

File photo: a data center. Photo by China News Service reporter Li Jun

According to a Global Times report citing The New Yorker, OpenAI’s ChatGPT chatbot consumes more than 500,000 kilowatt-hours of electricity every day to handle approximately 200 million user requests, equivalent to the daily electricity consumption of more than 17,000 American households.
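The reported figures can be sanity-checked with a quick back-of-the-envelope calculation. Note the average-household figure below is an outside assumption based on US EIA estimates, not a number from the article:

```python
# Back-of-the-envelope check of the figures above. An average US household
# uses roughly 10,500 kWh of electricity per year (an outside assumption
# based on US EIA estimates; not a figure from the article itself).
chatgpt_kwh_per_day = 500_000
household_kwh_per_day = 10_500 / 365  # about 28.8 kWh per household per day

equivalent_households = chatgpt_kwh_per_day / household_kwh_per_day
print(round(equivalent_households))  # on the order of 17,000 households
```

The result lands close to the 17,000-household figure quoted in the report, so the numbers are at least internally consistent.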

Recently, Tesla CEO Elon Musk publicly stated that within the next two years the artificial intelligence industry will shift from a “silicon shortage” to a “power shortage”. At the beginning of this year, OpenAI CEO Sam Altman likewise acknowledged that more energy will be needed than previously thought, which will force greater investment in technologies that can supply it.

According to forecasts from the Uptime Institute, a US organization, AI’s share of global data center electricity consumption will rise from 2% to 10% by 2025.

A research report from Guojin Securities noted that as models iterate, parameter counts grow, and daily active users expand, demand for the associated computing power will rise exponentially.

Li Xiuquan, deputy director of the Artificial Intelligence Center at the China Institute of Scientific and Technical Information, told Zhongxin Finance (a China News Service outlet) that the size and number of large AI models have grown rapidly in recent years, bringing with them a rapid rise in energy demand. Although a “power shortage” is unlikely in the short term, he said, the surge in energy demand once large-scale AI becomes pervasive cannot be ignored.

  Restricting AI development? Energy issues receive attention

There are many signs that AI’s power consumption may exceed expectations, and this bears directly on whether the AI industry can develop smoothly. Li Xiuquan believes the wave of the intelligent technology revolution is unstoppable, and that energy is a key issue that must be addressed alongside it.

The issue is attracting growing attention. According to AFP, AI computing consumes more energy and is less energy-efficient than conventional computing, and has been criticized for it. Energy consumption was also raised at the recent NVIDIA GTC conference.

File photo. Photo by Chen Zeng

The Boston Consulting Group has released a report stating that by the end of 2030, electricity consumption in U.S. data centers alone is expected to be three times that of 2022. This increase is mainly driven by two key demand factors: AI model training and serving more frequent AI queries.

The “Green Computing Power Technology Innovation Research Report (2024)”, released by the China Academy of Information and Communications Technology on March 15, pointed out that the total scale of China’s computing power has grown at an average annual rate of nearly 30% over the past five years. As the country’s computing industry expands, the energy consumption and carbon emissions of computing infrastructure, data centers above all, have become increasingly prominent, and policy has begun to focus on green energy use.

  How to respond? Improving energy efficiency may be key

The development of AI is inseparable from computing power. As an AI “arms dealer”, Nvidia is already grappling with energy consumption: it recently released a new generation of AI chips that it says consume less energy than the previous generation.

Nvidia CEO Jensen Huang gave an example: training the model behind ChatGPT for three months on 8,000 of the company’s first-generation AI chips would draw 15 megawatts of power, whereas the same task over the same period would need only 2,000 of the new-generation chips, drawing just 4 megawatts.
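Huang’s figures imply a notable breakdown: the cluster-level saving comes almost entirely from needing four times fewer chips, not from lower per-chip draw. A quick sketch using only the numbers quoted above (Nvidia’s claims, not independent measurements):

```python
# Per-chip comparison derived from the figures Huang quoted.
old_chips, old_mw = 8_000, 15  # first-generation cluster
new_chips, new_mw = 2_000, 4   # new-generation cluster

old_kw_per_chip = old_mw * 1_000 / old_chips  # 1.875 kW per chip
new_kw_per_chip = new_mw * 1_000 / new_chips  # 2.0 kW per chip

print(f"cluster power: {old_mw} MW -> {new_mw} MW ({old_mw / new_mw:.2f}x lower)")
print(f"per-chip draw: {old_kw_per_chip} kW -> {new_kw_per_chip} kW")
```

Per chip, the new part actually draws slightly more power; the efficiency gain is that far fewer chips are needed for the same task.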

In addition, according to Japanese media reports, Nvidia plans to purchase high-bandwidth memory (HBM) chips from Samsung. HBM is a key component of AI processors, and Samsung’s chips are currently being tested.

“HBM is a technological marvel that can improve energy efficiency and help the world become more sustainable as power-hungry AI chips become more common,” Huang said.

It is worth mentioning that many data centers in China are also designed with energy saving in mind. For example, an underwater data center uses the ocean as a natural heat sink: servers are placed in sealed containers on the seabed and cooled naturally by the flow of seawater.

Li Xiuquan said there is no need to panic over energy demand, but it must be addressed in parallel. In the future, as chip computing power is rapidly upgraded, the energy consumed per unit of computing power will continue to fall. At the same time, liquid cooling and optical interconnect technologies will further improve the energy efficiency of AI computing clusters. In addition, large models can be quantized and compressed for specific problems and retrained into dedicated models, so many tasks will no longer require large, energy-hungry models.
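To illustrate the quantization idea Li mentions: storing model weights as 8-bit integers instead of 32-bit floats cuts memory, and the energy spent moving data between memory and chip, by roughly 4x, at the cost of a small, bounded rounding error. A toy sketch (illustrative only, not any production method):

```python
import random

# Toy post-training quantization: map float weights onto the int8 range.
random.seed(0)
weights = [random.gauss(0.0, 1.0) for _ in range(1_000)]  # stand-in "model" weights

scale = max(abs(w) for w in weights) / 127        # map the value range onto [-127, 127]
quantized = [round(w / scale) for w in weights]   # each entry fits in one byte
dequantized = [q * scale for q in quantized]      # approximate reconstruction at run time

fp32_bytes = 4 * len(weights)   # 32-bit floats: 4 bytes each
int8_bytes = 1 * len(weights)   # 8-bit ints: 1 byte each
max_error = max(abs(w - d) for w, d in zip(weights, dequantized))

print(f"size: {fp32_bytes} B -> {int8_bytes} B")
print(f"max abs error: {max_error:.4f} (bounded by scale/2 = {scale / 2:.4f})")
```

The 4x storage saving is exact; whether the rounding error is acceptable depends on the model and task, which is why such compressed models are typically retrained or fine-tuned as dedicated models, as the article notes.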

