
The Annual AI Governance Report 2025: Steering the Future of AI



the energy used each time someone interacts with the model, such as typing a request into a
chatbot and receiving a response, i.e., the cost of using models and data centers. Energy
considerations also extend beyond AI model use to the raw materials needed to develop chips
and data centers, and the water and energy their production requires. Growth in AI computation
is already driving significant power demand, and we have seen mobilization to produce more
energy across the AI industry, with leading labs investing significantly in nuclear power.

In the United States, AI data center power demand grew tenfold over the last three years, from
0.4 gigawatts (GW) in 2020 to 4.3 GW in 2023 (Patel, Nishball, and Ontiveros, 2024). In 2025,
total AI data center demand will likely reach about 21 GW of total power capacity, more than
a fourfold increase from 2023 and twice the total power capacity of the state of Utah.112
RAND's research projects that global AI data centers may need 68 GW of power by 2027 and
up to 327 GW by 2030, driven by continued exponential growth in AI chip supply and training
demands, comparable to the total power capacity of major U.S. states like California.
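The growth figures above imply a roughly constant yearly multiplier. A back-of-envelope sketch (the growth rate here is inferred from the cited endpoints, not a figure from the RAND report itself) shows the 2025 projection is consistent with simply extrapolating the 2020-2023 trend:

```python
def implied_annual_growth(start_gw: float, end_gw: float, years: int) -> float:
    """Constant yearly multiplier taking start_gw to end_gw over `years` years."""
    return (end_gw / start_gw) ** (1 / years)

# U.S. AI data center demand: 0.4 GW (2020) -> 4.3 GW (2023), per the text above
g = implied_annual_growth(0.4, 4.3, 3)
print(f"implied growth: {g:.2f}x per year")  # about 2.2x per year

# Extrapolating that multiplier two more years from the 2023 figure
projected_2025 = 4.3 * g ** 2
print(f"projected 2025 demand: {projected_2025:.0f} GW")  # close to the ~21 GW cited
```

The tenfold rise over three years corresponds to demand more than doubling each year; carrying that rate forward two years lands near the roughly 21 GW figure cited for 2025.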

Furthermore, looking beyond direct usage, chip manufacturing relies on scarce minerals like
cobalt and tungsten, and construction of AI infrastructure involves carbon-intensive materials like
concrete.113 The production of GPUs, 3.85 million units shipped to data centers in 2023 alone,
also contributes indirect emissions through complex manufacturing and material extraction.114

Training runs require significant energy and are difficult to supply because they demand a
large amount of power capacity at a single location. Currently, training runs represent a small
share of overall energy use; however, if scaling laws persist, their energy impact will become
more significant. Compute scaling has been consistent for over a decade, and hyperscalers
like OpenAI have announced plans to continue development and grow compute. Even if they
are one-off events, training runs could demand up to 1 GW in a single location by 2028 and
require up to 8 GW of power by 2030, equivalent to eight nuclear reactors, assuming current
scaling trends persist.115 Developing the capacity for large-scale, single-location use of power
would require overcoming significant challenges, including inadequate transmission,
insufficient power, and supply chain delays.

Although training AI models requires significant energy, the greater demand arises from
inference, when hundreds of millions of people interact with these chatbots daily. On current
inference energy usage, OpenAI recently released a figure that the average query uses about
0.34 watt-hours (Wh), "about what an oven would use in a little over one second."116 Marcel
Salathé (EPFL) estimates that, assuming a typical chatbot interaction consumes about
0.2 Wh and an average user has 100 interactions (10 chats, each with 10 back-and-
forth messages) every single day in a year, this would add up to an annual energy consumption per




                  112   Pilz, K. F., Mahmood, Y., Heim, L., & RAND Corporation. (2025).  AI’s Power Requirements Under
                     Exponential Growth: Extrapolating AI Data Center Power Demand and Assessing Its Potential Impact on
                     U.S. Competitiveness. RAND Corporation.
                  113   Luccioni, S., Trevelin, B., Mitchell, M. (2024, September 3). The Environmental Impacts of AI -- Primer. Hugging
                     Face.
                  114   Bashir, N., Donti, P., Cuff, J., Sroka, S., Ilic, M., Sze, V., Delimitrou, C., & Olivetti, E. (2024, March 27). The
                     climate and sustainability implications of Generative AI. An MIT Exploration of Generative AI.
                  115   Pilz, K. F., Mahmood, Y., Heim, L., & RAND Corporation. (2025). AI’s Power Requirements Under
                     Exponential Growth: Extrapolating AI Data Center Power Demand and Assessing Its Potential Impact on
                     U.S. Competitiveness. RAND Corporation.
116   Altman, S. (2025). The Gentle Singularity.


