Perspective: Addressing the climate blind spot in AI literacy
Artificial intelligence is reshaping how students learn, but are we teaching it responsibly? That question took center stage at SXSW EDU this spring during SubjectToClimate’s session, AI & Climate: Learning Amid Tradeoffs & Possibilities. I joined panelists from Penn Graduate School of Education, KIPP Public Schools, and SubjectToClimate to make a clear case: if we teach AI literacy, climate literacy must come with it.
The true cost of AI infrastructure
Most of us know that generative AI models, like ChatGPT and Claude, rely on vast data centers, yet their environmental impacts remain easy to ignore. Like many climate-related burdens, the costs are unevenly felt around the world. I challenged the audience to think about their own perceptions of AI’s tradeoffs, suggesting that many of us are happy to think of AI as something that exists in “the cloud.” Expanding on the climate and environmental impacts of AI, I discussed issues of data transparency from tech companies, water and electricity consumption, material use, and the siting of data centers across the United States and the world.
If students become adept with AI without also learning its social and environmental costs, we risk repeating a familiar pattern — raising a generation skilled in use but unprepared to manage the consequences and ever-changing resource demands of the technology our world is built around. Before opening computers, educators can ask students critical questions such as: what exactly is AI, where does it come from, and what kinds of resources does it need to function?
Defining AI literacy
If a goal of education is to prepare students for the world they’re inheriting, then it must include understanding how past technologies reshaped society. Margaret Wang-Aghania, co-founder and executive director of SubjectToClimate, said, “The lesson of the Industrial Revolution isn’t that transformative technology is harmful, it’s that human priorities determine what technology becomes.” Students who understand AI’s resource footprint, can evaluate its tradeoffs, and know how to ask what a system is optimizing for and whose needs it serves are better prepared for the workforce — and to one day shape these systems rather than simply operate them.
Frameworks like TeachAI’s are pushing in the right direction, including competencies like the ability to describe how AI systems consume energy and natural resources, evaluate environmental and social tradeoffs, and design AI solutions that serve community needs. These represent the difference between literacy and fluency; literacy means you can use the tool, fluency means you understand what you’re using and why it matters.
What learning science shows
There is also a pedagogical tension between how people learn best and how AI tools are designed to function. Nam Nguyen, senior research manager in artificial intelligence & education at Penn GSE, shared that learning science shows friction often leads to deeper understanding. AI, by contrast, is optimized to remove friction and generate an answer, not to create the conditions for genuine wrestling with a problem. Nguyen challenged the “more is better” mindset dominating educational technology: the assumption that more content, more data, and more devices automatically improve learning. In practice, he said, it often drives wasteful production of new technology while delivering only marginal educational gains.
Challenging existing pedagogy, Nguyen pointed to the AIED Unplugged model, in which students can achieve deep, impactful learning without digital hardware. The core finding of this research is clear: we must refocus on human-led learning experiences rather than pouring resources into new devices without solid evidence of what constitutes effective learning.
Even leading advocates of educational technology have acknowledged this limit. Reflecting on Khanmigo, Sal Khan recently noted that “AI is going to help... But I think our biggest lever is really investing in the human systems.”
Getting it right
Some schools are taking a holistic approach to integrating AI literacy. At MIT Open Learning, I am developing Universal Climate and Energy, a new modular online offering that will include content from MIT experts on AI's energy and resource demands as well as ways to use AI for climate action. Additionally, work through the Day of Climate initiative is exploring how data visualization tools powered by AI can help students develop systems thinking, while also flagging that large language models tend to reproduce the framing you bring to them. A doomer framing gets amplified. A dismissive one does too.
SubjectToClimate is now offering an ELA Unit on Generative AI, exploring its pros and cons, energy consumption, and responsible use, along with a host of teaching resources.
KIPP NorCal’s 2026 AI guidelines include a section on environmental impact, sitting alongside privacy and equity as a core responsibility rather than a footnote.
The session's collective call to action was to prioritize which climate and AI skills are considered essential, rather than adding new content to an already packed curriculum. AI literacy and climate literacy aren't separate subjects competing for space; student learning deepens when we teach them together. Teacher training, curriculum design, and professional support are the infrastructure that make both possible.
For educators and school leaders looking to go deeper, explore the session's full takeaways and resources.
Christopher Rabe is the Universal Climate and Energy Program Lead at MIT Open Learning and the Education Program Director and Program Lead at MIT Climate Project.