Recap: Symposium on AI and Climate Technology During SF Climate Week

By Niki Borghei, Ruhani Chhabra, Kaspar Mossman, and Chrissa Olson.
Climate change is becoming more real with every season. We need to leverage as many tools as possible to solve this global challenge and mitigate the damage in the years to come. Artificial intelligence is a powerful tool that can advance the discovery, translation, and impact of technologies needed to combat climate change at the speed we need.
That’s why this past Wednesday and Thursday (April 23-24, 2025), during SF Climate Week, Bakar Climate Labs partnered with UC Berkeley’s Bakar Institute of Digital Materials for the Planet (BIDMaP) to hold the inaugural “AI and Climate Symposium,” with the important subtitle “From Cutting-Edge Research to Commercialization.” Our goals were to highlight opportunities for AI to advance the discovery and ultimate translation of climate technologies – and to connect researchers, founders, and investors across both AI and climate tech.
“The climate crisis demands unprecedented collaboration between disciplines that have traditionally operated in separate spheres,” said Shilpi Kumar, Director of Partnerships at Bakar Climate Labs. “By bringing together experts, both AI researchers and climate entrepreneurs, we’re supporting a powerful ecosystem where computational advances can enable real-world climate solutions. At the same time, we must consider AI’s energy requirements and environmental footprint, ensuring that the tools we build to address climate change aren’t exacerbating the very problem they aim to solve.”
The event, held at UC Berkeley’s Bakar BioEnginuity Hub at the southeast corner of campus, hosted more than 180 attendees across a day and a half of expert talks, panel sessions, and networking breaks.
“We realized that artificial intelligence is playing a major role in developing all sorts of engineering enterprises,” said Alex Bell, associate director of Bakar Climate Labs and Dow Professor of Sustainable Chemistry at Berkeley. “It became evident that we should ask: how does AI play into climate change technology? Sarah Jones [executive director of BIDMaP] illustrated how Omar Yaghi has used AI to develop MOFs [metal-organic frameworks] to absorb CO2 and water. We saw a lot of opportunity here, so we put the symposium together.”
“We’re focusing not only on the science, but on the translation and commercialization of that science,” Jones added. “By partnering with Bakar Climate Labs, we can bridge that space and engage with a broader community of researchers, founders and investors. This will allow us to take the innovations we are developing, these tools and materials, and take them to the next step to work with partners in industry so we can start to have impact at the kind of scale that we need to address issues around climate change.”
Wednesday: Applications of AI to Materials Science
Traditional methods of studying materials science are often slow and limited, but AI is now reshaping the field. In the “Energy & Carbon Capture” segment, Jeffrey Neaton, a Berkeley professor of physics and associate laboratory director for energy sciences at Berkeley Lab, explained how machine learning is helping researchers overcome the limitations of density functional theory through faster simulations of materials for solar fuels and carbon capture. Maria Chan, a computational materials scientist at Argonne National Laboratory, expanded on this, highlighting how integrating AI with experimental techniques—such as electron microscopy—builds stronger batteries, catalysts, and other technologies. Extending AI into experimental lab work, Gabe Gomes, an assistant professor of chemistry and chemical engineering at Carnegie Mellon, introduced “Co-Scientist,” a platform combining large language models, robotic lab equipment, and automated planning systems. “For the first time, a non-organic intelligence planned, designed, and executed complex scientific experiments…” Gomes said. “That’s just remarkable.” These advances point toward a future where AI increases the pace of climate tech innovation.
According to Omar Yaghi, James and Neeltje Tretter Professor of Chemistry at Berkeley, the science behind carbon capture is solved—it’s now a matter of policy. He and Peidong Yang, professor of chemistry and S. K. and Angela Chan Distinguished Chair in Energy at Berkeley, believe the technology can not only remove carbon from the atmosphere but also convert it into valuable chemicals using water and sunlight, a process Yang’s lab has achieved through artificial photosynthesis. Both researchers use AI to speed discovery. For instance, crystallizing COF-323, a material for trapping gases, once took years; with generative AI, it now takes two weeks. “AI is transforming the way we do chemistry, and we are not looking back at all,” said Yaghi. Shijing Sun, an assistant professor of mechanical engineering at the University of Washington, added that AI could accelerate materials innovation 10–100x, especially in synthesizing energy conversion materials, helping bridge lab to market faster.
Thursday: Foundation Models, Wildfire Management & Further Applications
In a morning session, Gábor Csányi, professor of molecular modeling at the University of Cambridge, described a “foundation model” for atomistic chemistry. Csányi was recently lead author of a paper on this model, with 88 co-authors. “I’m not an expert in any of these applications,” he said, “but the others are.” The model can, for example, map where CO2 is stored in MOFs. AI modeling of inorganic chemistry is now effective and efficient, but organic chemistry remains more challenging, he said: better modeling of quantum mechanics is needed.
In the same session, Teresa Head-Gordon, Chancellor’s Professor of Chemistry, Bioengineering, and Chemical & Biomolecular Engineering at Berkeley, gave an overview of large language models, particularly “Llama,” developed by Meta. These models have been extended to imagery; she showed how they could be further extended to chemistry using “SMILES” strings, which encode molecular structure as text. “Because of the need for scale, real work is being done in industry,” she said. “We need a large-scale consortium of national labs and universities to compete.” She noted that while Google’s AlphaFold 2 had solved the “grand challenge” of protein folding for static structures, 50% of the proteome has intrinsic disorder. “That’s the next frontier,” she said.
In a panel discussion moderated by Samuel Blau, a research scientist at Berkeley Lab, both Csányi and Head-Gordon expressed concern about capitalism driving AI development for profit. The counterweight, both felt, might be international cooperation similar to that which produced the International Space Station and the Large Hadron Collider – although government moves slowly and AI is advancing with great speed.
In a following session on how AI could be used to manage wildfires, a common theme was that platforms need to be valuable and usable to a wide range of stakeholders – and that, to be truly effective, they must scale to national or international levels while allowing local users to customize the outputs.
“We need to get out of our silos to deal with wildfires,” said Newsha Ajami, chief development officer for research at Berkeley Lab’s Earth and Environmental Sciences Area. “Nobody in forestry is there for data, but it’s an enabler required for innovation, scale, and collaboration.” Micah Elias, director of natural capital at Blue Forest (a nonprofit conservation finance organization), said that his organization needed to explain to utilities what the benefits of “avoided fires” could be over a period of several decades. When he approaches a landscape, he thinks about the “capital stack” – how to integrate models that address many intersecting interests, such as water management, biodiversity, and economic impact. UCSD research scientist İlkay Altıntaş (also chief data science officer of the San Diego Supercomputer Center) added that we cannot accomplish much without the support of the public and the expertise of various sectors. Partnerships are important; profit models need to be negotiated. “How do we make sure everyone wins?” she asked.
Translating & Commercializing AI for Climate Tech
Despite growing urgency around climate change, developing reliable geothermal energy has remained slow due to technological and exploration challenges. At the afternoon fireside chat, Carl Hoiland, co-founder and CEO of Zanskar Geothermal & Minerals, and Anku Madan, principal at the VC firm Obvious Ventures, discussed how new tools – such as advanced subsurface modeling, better drilling techniques, and Bayesian AI methods – are helping to unlock geothermal’s potential. They emphasized that venture capital is increasingly willing to support hard tech solutions that require more time and patience. Looking ahead, Hoiland noted that innovation cycles will outlast political ones, and that building mission-driven, technical teams will be integral to clean energy breakthroughs. The session ended interactively, with the audience voting on striking yet plausible future AI scenarios such as “humanoid robots in the home in five years.”
Several innovative climate tech startups demonstrated how AI is transforming the field. Windborne is developing faster, more accurate weather forecasts using machine learning – an improvement over traditional supercomputer-based models. FortyGuard delivers real-time, hyper-local temperature insights with two-meter precision, helping communities adapt to extreme heat. “Half of the solution is usually just understanding the problem,” said founder Jay Sadiq. ChargeMate uses AI chatbots to resolve EV charging issues, tackling a $22B problem and helping to encourage widespread EV adoption. Streamline Climate connects startups with funding by using AI to navigate complex state and federal opportunities.
AI can also be used to analyze the components and design of a building for its sustainability in both design and practice. In a combined keynote and fireside chat with Joe Speicher, Chief Sustainability Officer at Autodesk, and Maryanne McCormick, Executive Director of the Blum Center, Speicher offered a new perspective on sustainability as a “data problem.” “At its core, [sustainability] is: What’s your energy usage? What’s your carbon emissions coefficient? How are you managing energy efficiency inputs to a building? It’s data,” he said. With 1 in 5 people working in construction and manufacturing, the entire field stands to be radically altered by AI advancements.
Academia is changing too – McCormick noted that classes that prepare college students to leverage AI are in high demand. In addition, at the Blum Center’s social entrepreneurship competition, Big Ideas at Berkeley, over 50% of the ideas had AI components to address some of the most pressing issues of climate, health, and education.
“Graduate students come to these labs to be trained to be world class chemists, or material science engineers … but at the same time, they’re becoming AI software engineers,” she said. “We see it all across campus as we train lawyers, optometrists, and economists.”
We thank all the speakers for their contributions to this invaluable forum, as well as those on our two organizing committees. On the scientific side: Alexis Bell, Christian Borgs, Bingqing Cheng, Sarah Jones, Mary Anne Piette, and Nakul Rampal. On the startup and industry side: Alexis Bell, Sarah Jones, Jasbir Kindra, Katarina Klett, Shilpi Kumar, Wojciech Osowiecki, and Tenika Versey Walker.
To keep up with our news and events, subscribe to the Bakar Climate Labs newsletter.
This story also appears on the Bakar Climate Labs website.