Artificial intelligence (AI) has grown at a remarkable pace in recent years. While its ethical shortcomings get much of the attention, there is also an energy dimension to the story. AI's growth demands a great deal of energy, and the world may not be ready for it. As AI technologies continue to develop and expand, so does their energy consumption. Understanding AI's energy needs is crucial, especially in the context of global efforts to reduce carbon emissions and combat climate change.
The production and consumption of electricity worldwide, especially fossil fuel-based power generation, is a serious burden on our planet. Carbon emissions are the main driver of climate change, increasing the frequency and severity of natural disasters and irreversibly altering ecosystems. Rising consumption makes it very hard for companies and countries to bring carbon emissions down to zero between 2030 and 2050, as many have pledged. On top of that, the rise of AI is creating a new paradigm in energy consumption.
Artificial intelligence overconsumes energy
A new white paper published by the Electric Power Research Institute (EPRI) reveals AI's potential for exponential growth in power requirements. The 35-page report, "Powering Intelligence: Analyzing Artificial Intelligence and Data Center Energy Consumption," predicts that total energy consumed by US data centers could more than double by 2030, rising by as much as 166%.
A striking example of this increased demand is the energy consumption of AI-based queries. The EPRI report estimates that a single query to ChatGPT, a popular AI language model, consumes about 2.9 watt-hours (Wh) of electricity, roughly ten times the 0.3 Wh used by a traditional Google search. For AI applications that generate images, audio, or video, the power requirements are far higher still, turning AI into something of a coal-fired locomotive.
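To get a feel for what that per-query gap means at scale, the short sketch below multiplies the per-query figures from the EPRI report by a purely hypothetical daily query volume; the volume is an assumption chosen for illustration, not a figure from the report.

```python
# Back-of-the-envelope comparison of AI query vs. traditional search energy use.
# Per-query figures come from the EPRI estimates cited above; the daily query
# volume is a hypothetical assumption used only to illustrate scale.

CHATGPT_WH_PER_QUERY = 2.9     # watt-hours per ChatGPT query (EPRI estimate)
GOOGLE_WH_PER_QUERY = 0.3      # watt-hours per traditional Google search (EPRI estimate)
QUERIES_PER_DAY = 100_000_000  # hypothetical daily query volume (assumption)

def annual_energy_gwh(wh_per_query: float, queries_per_day: int) -> float:
    """Annual energy in gigawatt-hours for a given per-query cost and volume."""
    wh_per_year = wh_per_query * queries_per_day * 365
    return wh_per_year / 1e9  # Wh -> GWh

ai_gwh = annual_energy_gwh(CHATGPT_WH_PER_QUERY, QUERIES_PER_DAY)
search_gwh = annual_energy_gwh(GOOGLE_WH_PER_QUERY, QUERIES_PER_DAY)

print(f"AI queries:     {ai_gwh:,.0f} GWh/year")
print(f"Search queries: {search_gwh:,.0f} GWh/year")
print(f"Ratio:          {ai_gwh / search_gwh:.1f}x")
```

Under these assumptions, the same volume of traffic costs roughly ten times more energy when served by an AI model than by a conventional search engine, which is exactly the per-query ratio scaled up.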
According to the International Energy Agency (IEA), data centers currently account for about 1% of global electricity consumption, a share expected to rise rapidly as artificial intelligence technologies proliferate. Growth began accelerating around 2019, and the fact that AI and data centers have come to claim a full percent of the world's energy needs within roughly five years is, frankly, scary.
Data center energy consumption is unevenly distributed across the United States. By 2030, data centers in states such as Virginia are projected to account for up to 46% of the national data center load. Other states, such as Oregon, Iowa, Nebraska, North Dakota, and Nevada, are also expected to see significant increases in energy consumption as new data centers are built there. This concentration of energy demand poses unique challenges and requires a strategic approach to energy management.
The impact on global energy resources
The impact of artificial intelligence on energy consumption extends beyond the borders of the United States. Large technology companies like Microsoft, Nvidia, and Google are expanding their data centers and manufacturing facilities worldwide. This global expansion increases demand for energy resources, including electricity and raw materials such as copper. Growing demand for chips and other components needed for AI technologies has also contributed to global chip shortages, exacerbating supply chain problems. This ramp-up in production also casts doubt on the sincerity of companies like Microsoft that claim to be reducing their carbon emissions.
AI’s growing energy needs are increasing overall energy consumption and contributing to environmental challenges. Higher energy consumption leads to increased carbon emissions, which accelerate climate change. If AI’s energy consumption continues unchecked, it could exacerbate the effects of global warming, making it more challenging to mitigate climate change impacts.
The intersection of AI and climate change
Climate change is driven far more by the decisions that companies and governments make than by personal measures. The scientific community warns that a global temperature rise exceeding 1.5°C could lead to irreversible climate impacts. The energy-intensive nature of AI technologies poses a risk to global efforts to keep the temperature rise within this limit.
Recent studies have started to highlight the environmental impact of AI. For instance, researchers from the University of Massachusetts Amherst found that training large natural language processing (NLP) models can generate carbon emissions equivalent to those produced by five cars over their entire lifetimes. This finding underscores the importance of addressing AI’s energy consumption to prevent further exacerbation of climate change.
Data center energy efficiency and renewable energy
Improving data center energy efficiency and increasing the use of renewable energy sources are critical steps for managing AI's growing energy demands, and they need to be taken before it is too late. Data centers should adopt more efficient cooling and power management systems to reduce their overall energy consumption. Furthermore, investing in renewable energy sources such as solar and wind can help offset the carbon footprint of AI technologies.
Many tech companies announce leaps toward clean energy, yet they are building new facilities even faster. Google, for example, has committed to running its data centers on carbon-free energy 24/7 by 2030, even as it builds new data centers for Gemini in many different locations. Similarly, Microsoft aims to become carbon-negative by 2030, meaning it will remove more carbon from the atmosphere than it emits. Microsoft may be able to hit that target under its own name, but the AI energy requirements of OpenAI, the company that has grown under its wing, appear endless.
The future of AI energy consumption
Forecasting AI's future energy consumption involves considering various growth scenarios. EPRI has developed multiple projections for potential electricity usage by U.S. data centers from 2023 to 2030. These scenarios range from a low growth rate of about 3.7% per year to a high growth rate of 15% per year. Even in the low-growth scenario, data center energy consumption is expected to increase by 29%, reaching 196.3 TWh/year by 2030.
In the high-growth scenario, consumption could soar to 403.9 TWh/year, a 166% increase from current levels. In short, these figures are large enough to change the world in a serious way, and the post-apocalyptic future to worry about may be created not by AI itself but by the climate consequences of powering it.
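The percentage figures follow directly from compound annual growth over the 2023-2030 window. The sketch below reproduces that arithmetic, assuming a 2023 baseline of roughly 152 TWh (an assumed round figure chosen so the published 2030 endpoints come out; the report's exact baseline may differ slightly).

```python
# Compound-growth arithmetic behind the 2023-2030 data center scenarios.
# The 2023 baseline of ~152 TWh/year is an assumption for illustration.

BASELINE_TWH_2023 = 152.0
YEARS = 2030 - 2023  # seven years of compounding

def project(baseline_twh: float, annual_growth: float, years: int) -> float:
    """Project consumption forward under a constant annual growth rate."""
    return baseline_twh * (1 + annual_growth) ** years

for label, rate in [("low growth (3.7%/yr)", 0.037), ("high growth (15%/yr)", 0.15)]:
    projected = project(BASELINE_TWH_2023, rate, YEARS)
    increase = (projected / BASELINE_TWH_2023 - 1) * 100
    print(f"{label}: {projected:.0f} TWh/year in 2030 (+{increase:.0f}%)")
```

Compounding 3.7% per year for seven years yields roughly a 29% increase, while 15% per year yields about 166%, matching the low- and high-growth endpoints quoted above.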
The concentration of data center growth in specific regions could also lead to localized energy challenges. States like Virginia, which already bear a significant portion of the data center load, may face increased pressure on their energy infrastructure. Ensuring a balanced and sustainable energy supply will require coordinated efforts from policymakers, energy providers, and the technology industry.
AI’s hunger for energy is a growing concern that demands attention and action. As AI technologies continue to evolve and become more integrated into various aspects of life, their energy consumption will continue to rise. Addressing this issue requires a multifaceted approach, including improving data center energy efficiency, investing in renewable energy, and developing long-term strategies for managing AI’s energy demands.
Understanding and mitigating AI’s energy consumption is crucial for ensuring that the benefits of AI technologies do not come at the expense of environmental sustainability. As the world grapples with the challenges of climate change, balancing AI’s energy needs with efforts to reduce carbon emissions will be key to achieving a sustainable future.
Featured image credit: Markus Spiske / Unsplash