Figure 46: Reports announced at the event


14.2  Innovations in environmentally efficient AI

This session explored technological innovations aimed at reducing AI’s environmental footprint across the hardware-software stack, sharing perspectives on energy-efficient AI models, neuromorphic computing, sustainable data centre operations, and the role of green energy.

University College London highlighted the daily impact of energy savings and the importance of optimizing large models while using small models when appropriate. The importance of designing for efficiency from the outset was emphasized, with scenarios showing that task-specific small models can achieve over 90% energy savings compared to large, multipurpose models. Emerging architectures such as Mixture of Experts, retrieval-augmented generation, neurosymbolic AI, and brain-inspired designs were discussed as promising pathways toward sustainability.

Current digital computing architectures face theoretical and practical limits to sustainability, leading participants to consider spiking neural networks and neuromorphic computing as biologically inspired alternatives that could drastically reduce energy use while improving reliability.
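
A minimal sketch of a leaky integrate-and-fire neuron, the basic unit behind spiking neural networks, helps show why such designs can be energy-efficient: computation is event-driven, and downstream work happens only when a spike is emitted. The parameters below are arbitrary illustrative values, not taken from any specific neuromorphic platform.

    # Minimal leaky integrate-and-fire (LIF) neuron: the membrane potential
    # integrates input with a leak and emits a spike only when it crosses a
    # threshold, so downstream work is triggered only by (sparse) spike events.
    import numpy as np

    def lif_neuron(input_current, threshold=1.0, leak=0.9, reset=0.0):
        """Return a binary spike train for a stream of input currents."""
        v = 0.0                   # membrane potential
        spikes = []
        for i in input_current:
            v = leak * v + i      # leaky integration of the input
            if v >= threshold:    # fire only when the threshold is crossed
                spikes.append(1)
                v = reset         # reset after the spike
            else:
                spikes.append(0)
        return spikes

    rng = np.random.default_rng(0)
    currents = rng.uniform(0.0, 0.4, size=50)   # weak, noisy input
    train = lif_neuron(currents)
    print(f"{sum(train)} spikes over {len(train)} time steps")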

                  The "Green AI" movement also emphasizes computational efficiency and transparency. While
                  large models like PaLM require massive amounts of computing resources, most environmental
                  impact comes from inference, not from training. The Hebrew University of Jerusalem noted that
                  inference operations account for 80-90% of all AI computation and are run billions of times per
                  day. There is a need for the community to report compute budgets and match model complexity
                  to task difficulty, considering that LLMs are not the solution for every problem.
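
The sketch below illustrates why inference dominates once a model serves traffic at scale: a one-off training cost is quickly overtaken by a per-query cost incurred billions of times per day. The training cost, per-query energy, query volume, and service lifetime are assumed placeholder values, not figures cited by the speakers.

    # Why inference can dominate lifetime energy: a fixed training cost versus
    # a per-query cost incurred billions of times per day. All values below
    # are assumed placeholders for illustration only.

    TRAINING_ENERGY_KWH  = 1.0e6    # assumed one-off training cost
    ENERGY_PER_QUERY_KWH = 1.0e-4   # assumed energy per inference request
    QUERIES_PER_DAY      = 1.0e9    # "billions of times per day"
    LIFETIME_DAYS        = 365      # one year in service

    inference_total = ENERGY_PER_QUERY_KWH * QUERIES_PER_DAY * LIFETIME_DAYS
    share = inference_total / (inference_total + TRAINING_ENERGY_KWH)

    print(f"Training energy:  {TRAINING_ENERGY_KWH:,.0f} kWh")
    print(f"Inference energy: {inference_total:,.0f} kWh over {LIFETIME_DAYS} days")
    print(f"Inference share:  {share:.1%} of lifetime energy")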

Google’s end-to-end sustainability strategy includes:

•    Model optimization (e.g. quantization, pruning, and knowledge distillation)
•    Custom hardware (e.g. Ironwood TPU, 30x more efficient than its 2018 predecessor)



