Today (Oct 26, 2024) I gave a keynote speech at the Green AI Summit 2024. Here are my remarks, and an optimistic rendering of a green future by ChatGPT (vintage 2065).
Good morning, everyone. I'm Paulo Carvão, and it is an honor to be speaking at the Green AI Summit.
Let's face it: AI and its data center infrastructure are energy-hungry. According to the International Energy Agency, data centers consumed 2% of the world's electricity in 2022, and that figure will more than double by 2026 with the expansion of cloud computing and AI. To put this into perspective, it is roughly the same amount of electricity that Japan, the fourth-largest economy in the world, consumes.
The environmental impact of training a large AI model is significant, and estimates vary given the lack of industry transparency. Studies show that training GPT-3 used 1,287 megawatt-hours of electricity and produced 502 tons of CO2 emissions, equivalent to the annual emissions of about 112 gas-powered cars or the annual energy consumption of around 120 average American homes. At usage or inference time, a ChatGPT query can use as much as ten times the energy of a Google search. Besides energy, data centers require vast quantities of water for cooling, consuming hundreds of thousands of gallons per day and further straining our resources.
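For readers who want to check the math, here is a quick back-of-the-envelope sketch of those equivalences. The per-car and per-home averages below are my own assumptions (approximate EPA and EIA figures), not numbers from the talk, so treat the result as a rough sanity check rather than a precise accounting.

```python
# Rough sanity check of the GPT-3 training equivalences.
# The two averages below are assumed values, not figures from the speech.
TRAINING_ENERGY_MWH = 1287    # reported GPT-3 training electricity use
TRAINING_CO2_TONS = 502       # reported GPT-3 training CO2 emissions

CO2_PER_CAR_TONS = 4.6        # assumed avg. US passenger car, tons CO2 per year
ENERGY_PER_HOME_MWH = 10.6    # assumed avg. US home electricity use, MWh per year

car_years = TRAINING_CO2_TONS / CO2_PER_CAR_TONS
home_years = TRAINING_ENERGY_MWH / ENERGY_PER_HOME_MWH

print(f"~{car_years:.0f} car-years of emissions")        # ≈ 109
print(f"~{home_years:.0f} home-years of electricity")    # ≈ 121
```

Both results land close to the widely cited "112 cars" and "120 homes" figures; the small gaps come from which year's fleet and household averages a study uses.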
The implications of this demand are significant and are driving large enterprise consumers to rethink their energy-sourcing strategies. Big Tech, for example, is turning to nuclear energy to power AI data centers with reliable, emissions-free electricity. Microsoft is exploring small modular reactors (SMRs) for its data centers and partnering with an energy company to revive the Three Mile Island nuclear power plant in Pennsylvania, the site of the 1979 accident that remains the worst in U.S. nuclear history; the plant's remaining reactor closed in 2019. Google is incorporating nuclear into its clean energy mix, partnering with startups like Kairos Power. AWS is also leveraging SMRs as part of its pledge to reach net-zero carbon operations by 2040 and is signing agreements with energy utilities in the states of Washington and Virginia to deploy the technology.
Today, I want to discuss how to address this issue and drive a more sustainable future for AI.
Innovation in Three Areas Will Pave the Path to Greener AI
Innovation in AI Models: We must design more energy-efficient AI models. This requires exploring new architectures, algorithms, and training techniques that minimize computational demands.
Innovation in Data Centers: Innovations in data center design and operation are essential. This includes:
Optimizing cooling systems.
Adopting more efficient hardware.
Utilizing waste heat.
Transitioning to renewable energy sources.
Location innovation: Strategically locating data centers in regions with abundant clean energy, such as hydropower, solar, and wind, can significantly reduce the carbon footprint of AI infrastructure. In the United States, the metropolitan Phoenix area in Arizona is emerging as the country's second-largest data center market, given the availability of land and photovoltaic energy.
Beyond LLMs: A New Frontier of Innovation
The rapid development, adoption, and use of AI, particularly large language models, has brought tremendous opportunities and, at the same time, heightened awareness of their limitations and potential risks. It is becoming increasingly clear that we need to go beyond LLMs and explore new frontiers of AI innovation that prioritize efficiency, transparency, and safety.
Several promising research streams are emerging, focusing on developing AI models that are not only powerful but also aligned with human values and operate within safe boundaries while managing their resource footprint. Examples of these innovations include:
Liquid Neural Networks: Unlike traditional LLMs, LNNs can adapt and learn continuously from streaming data, requiring minimal computing resources. They offer greater transparency in decision-making, addressing the "black box" issue often associated with LLMs.
Objective-Driven AIs: These AIs focus on achieving specific goals efficiently, enhancing decision-making accuracy. Architectures like JEPA, introduced by Yann LeCun, improve predictability and address LLMs' inherent challenges with transparency and adaptability.
Generative Flow Networks (GFlowNets): GFlowNets address the limitations of LLMs by adaptively sampling from complex data distributions. They aim to model the probability of data generation more effectively, improving the quality of generated samples.
Investing in these and other innovative approaches is critical to developing AI that is both environmentally sustainable and societally beneficial.
The Role of Policy in Driving Sustainable AI
Role of the State: Governments must take the lead by:
Setting ambitious targets for data center energy efficiency and renewable energy use.
Creating incentives for the adoption of green technologies.
Investing in research and development of sustainable AI solutions.
Role of Private Initiative: The private sector must:
Embrace sustainable practices throughout the AI lifecycle, from design and development to deployment and end-of-life management.
Adopt transparent reporting mechanisms for their environmental impact.
Conclusion
The future of AI depends on our commitment to sustainability. By fostering innovation and implementing supportive policies, we can harness the power of AI to create a more equitable and environmentally responsible digital future.
Thank you very much.