When Google announced a 48 percent rise in its greenhouse gas emissions since 2019, the tech giant pointed to artificial intelligence (AI) as the primary culprit. The expansion of AI has driven a boom in data center construction by US tech firms, raising concerns about the technology's energy consumption and environmental impact.
How AI Uses Electricity
AI operations rely heavily on data centers. Every request to a chatbot or generative AI tool is processed in these centers, consuming large amounts of electricity. Developing AI programs, especially large language models (LLMs), requires substantial computing power. This continuous operation heats the servers, necessitating additional energy for cooling. According to the International Energy Agency (IEA), data centers use roughly 40 percent of their electricity for computing and another 40 percent for cooling.
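As a rough illustration of the IEA breakdown above, the shares can be applied to a hypothetical facility's annual draw. The 100 GWh facility size below is invented for the example, and attributing the remaining ~20 percent to "other" equipment is an assumption, not an IEA figure:

```python
# Rough split of a data center's annual electricity use, per the IEA
# figures cited above: ~40% computing, ~40% cooling, remainder other.
SHARES = {"computing": 0.40, "cooling": 0.40, "other": 0.20}

def split_usage(total_gwh: float) -> dict[str, float]:
    """Break a facility's annual electricity use down by share."""
    return {name: total_gwh * share for name, share in SHARES.items()}

# Hypothetical 100 GWh/year facility (illustrative figure only).
print(split_usage(100.0))  # → {'computing': 40.0, 'cooling': 40.0, 'other': 20.0}
```

The point of the split is that for every unit of electricity doing useful computation, roughly another unit is spent just removing the resulting heat.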
Concerns of Experts
The launch of OpenAI's ChatGPT in late 2022 triggered a rush among tech firms to integrate AI into their products, leading to concerns about a surge in electricity usage. AI services typically require more power than their non-AI counterparts. For instance, studies indicate that a single request to ChatGPT uses about 10 times the power of a Google search. If Google were to transition all search queries to AI, it could dramatically increase its electricity consumption.
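A back-of-envelope sketch shows why that transition would matter. The per-search figure (~0.3 Wh) and daily query volume (~9 billion) below are illustrative assumptions, not figures from the article; only the 10x multiplier comes from the studies cited above:

```python
# Back-of-envelope: Google's annual search electricity, conventional
# vs. all-AI. Per-search energy and query volume are assumptions.
WH_PER_SEARCH = 0.3      # Wh per conventional search (assumed)
AI_MULTIPLIER = 10       # AI request ~10x a search (per cited studies)
SEARCHES_PER_DAY = 9e9   # rough daily query volume (assumed)

def annual_twh(wh_per_request: float, requests_per_day: float) -> float:
    """Convert per-request watt-hours into terawatt-hours per year."""
    return wh_per_request * requests_per_day * 365 / 1e12

baseline = annual_twh(WH_PER_SEARCH, SEARCHES_PER_DAY)
all_ai = annual_twh(WH_PER_SEARCH * AI_MULTIPLIER, SEARCHES_PER_DAY)
print(f"baseline: {baseline:.2f} TWh/yr, all-AI: {all_ai:.2f} TWh/yr")
```

Under these assumptions, moving every search to AI would add on the order of ten terawatt-hours a year for search alone — roughly the annual consumption of a small country.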
Training LLMs is particularly energy-intensive, necessitating high-powered computer chips that, in turn, require more cooling and thus more electricity.
Energy Consumption of AI
Before the widespread adoption of AI, data centers were estimated to account for around one percent of global electricity demand. In 2022, data centers, along with cryptocurrencies and AI, consumed 460 TWh of electricity, nearly two percent of global demand. The IEA predicts this figure could double by 2026, equating to Japan's electricity consumption.
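That projection implies a steep growth rate. A quick check, reading "double by 2026" as roughly 920 TWh four years after the 2022 figure:

```python
# Implied compound annual growth rate if the IEA projection holds:
# 460 TWh in 2022, doubling to ~920 TWh by 2026 (four years).
def implied_cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate between two consumption figures."""
    return (end / start) ** (1 / years) - 1

rate = implied_cagr(460, 920, 4)
print(f"{rate:.1%} per year")  # → prints "18.9% per year"
```

Sustaining nearly 19 percent annual growth is what makes this segment stand out: overall global electricity demand grows only a few percent a year.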
Research by Alex de Vries, based on sales projections from NVIDIA (the leading supplier of AI-specialized servers), estimated that NVIDIA's servers alone could consume between 85.4 and 134.0 TWh annually, comparable to the electricity usage of Argentina or Sweden. These figures are likely conservative, as they did not fully account for cooling requirements.
Coping with AI Demands
Data centers are evolving to handle the increased demands of AI. For example, Digital Realty, a data center company, is adapting its facilities to the higher power and cooling needs of AI servers. Unlike standard servers, which can be cooled with air-conditioned rooms, AI servers use more powerful components and require water cooling.
Sustainability Efforts
Major AI and data center operators like Amazon, Google, and Microsoft are striving to reduce their carbon footprints by investing in renewable energy. Amazon claims to be the largest purchaser of renewable energy globally and aims to be net-zero carbon by 2040. Google and Microsoft have set a goal of reaching net-zero carbon by 2030. However, the construction of new data centers and increased usage of existing ones complicate these green energy targets.
Recent reports indicate rising greenhouse gas emissions for both Google and Microsoft, with Google citing a 48 percent increase since 2019 and Microsoft a 30 percent rise since 2020. Both companies attribute these increases primarily to AI.
Microsoft President Brad Smith acknowledged the challenge, describing the company's sustainability pledge as a "moonshot" goal set before the recent AI boom — with AI's unexpectedly rapid growth, he suggested, the moon has in effect moved further away.
More: https://techxplore.com/news/2024-07-ai-major-world-energy.html
