The Hidden Environmental Cost of Generative AI

Image Courtesy: Wikimedia Commons

Until OpenAI’s launch of ChatGPT in November 2022, generative artificial intelligence seemed foreign and futuristic to most people. Since then, its widespread adoption has been transformational: individuals and industry alike have integrated AI tools into everyday use, further propelling digital industrial technology and ‘Industry 4.0’. However, this rapid change carries grave environmental consequences, contributing to rising greenhouse gas emissions, skyrocketing energy demands, increased electronic waste and freshwater depletion.

GenAI has been hailed as a game changer, improving efficiency and reducing human error through innovation previously thought unachievable. JP Morgan Research estimates it may increase the world’s GDP by up to 10%, yet critical voices point out the real-life trade-offs underpinning such possibilities. MIT’s Climate & Sustainability Consortium warns of the environmental impacts of training these generative models: above all, AI’s growing appetite for electricity and computing power, which has accelerated resource depletion. Even Sam Altman, OpenAI’s CEO and co-founder, has admitted that AI’s energy needs are ‘vastly greater than expected’ and would be unsustainable without a breakthrough.

To understand AI’s carbon footprint, a solid grasp of the mechanism behind it is essential. When we type a prompt into Google’s multimodal chatbot Gemini or into ChatGPT, the output generated within seconds carries the energy and emissions that went into training the underlying model, manufacturing the necessary hardware, and processing the query itself. Computer scientist Kate Saenko has offered the public a general rule: the more powerful an AI tool is, the more energy it consumes. In terms of total emissions, however, this rule leaves out several other factors. Google research indicates that the carbon footprint of training an AI model of a given size can vary by a factor of 100 to 1,000 depending on the choice of deep neural network architecture, data centre and processor type.
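The decomposition described above can be sketched as a back-of-envelope sum: a query’s footprint is a tiny amortised share of the one-off training and hardware-manufacturing emissions, plus the electricity burned serving the query itself. Every number below is an illustrative placeholder, not a measured value.

```python
# Back-of-envelope sketch of one query's carbon footprint, split the way
# the article describes: amortised training emissions, amortised hardware
# ("embodied") emissions, and per-query inference electricity.
# All constants are illustrative assumptions, not measurements.

TRAINING_EMISSIONS_KG = 500_000       # assumed one-off training cost (kg CO2)
EMBODIED_EMISSIONS_KG = 200_000       # assumed hardware-manufacturing share (kg CO2)
LIFETIME_QUERIES = 1_000_000_000      # assumed queries served over the model's life

ENERGY_PER_QUERY_KWH = 0.003          # assumed electricity to serve one query (kWh)
GRID_INTENSITY_KG_PER_KWH = 0.4       # assumed grid carbon intensity (kg CO2/kWh)

def query_footprint_kg() -> float:
    """Estimated kilograms of CO2 attributable to a single query."""
    amortised = (TRAINING_EMISSIONS_KG + EMBODIED_EMISSIONS_KG) / LIFETIME_QUERIES
    inference = ENERGY_PER_QUERY_KWH * GRID_INTENSITY_KG_PER_KWH
    return amortised + inference

print(f"{query_footprint_kg() * 1000:.2f} g CO2 per query")
```

Even with these toy numbers the structure of Saenko’s rule is visible: a more powerful model raises both the amortised terms and the per-query energy, and the same model’s footprint shifts with the grid intensity of whichever data centre serves it.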

Training a single large AI model currently requires thousands of megawatt-hours of electricity. Training GPT-3 alone produced more than 500 tonnes of carbon dioxide, and with the need for constant retraining and the development of specialised models, the data centres driving AI may account for up to 6% of the United States’ entire electricity consumption by 2026. Between now and 2026, AI’s energy needs are set to grow to at least ten times the current level, more than all of Belgium uses in a year, putting further pressure on our energy grids. Cooling AI systems’ processors also requires enormous amounts of freshwater, with water use projected to reach up to 6.6 billion m³ by 2027. While droughts grow more frequent and severe worldwide, most of the water used for AI is simply ‘lost’, evaporating into the atmosphere.
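The CO2 figure above follows directly from a simple relation: emissions equal the electricity consumed multiplied by the carbon intensity of the grid that supplied it. The 1,287 MWh energy figure is a published academic estimate for GPT-3’s training run; the grid intensity used here is an assumed average, so the result is only approximate.

```python
# Sketch: converting a training run's electricity use into CO2 emissions.
# emissions = energy consumed x carbon intensity of the supplying grid.
# 1,287 MWh is a published estimate for GPT-3's training run; the grid
# intensity is an assumed average, so the output is an approximation.

TRAINING_ENERGY_MWH = 1_287           # published estimate for GPT-3 training
GRID_INTENSITY_T_PER_MWH = 0.4        # assumed tonnes of CO2 per MWh

emissions_tonnes = TRAINING_ENERGY_MWH * GRID_INTENSITY_T_PER_MWH
print(f"~{emissions_tonnes:.0f} tonnes of CO2")
```

This is also why the Google research cited earlier found such wide variation: moving the same training run to a low-carbon grid changes only the intensity term, yet can cut the result many times over.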

Climate change and resource depletion affect everyone, but they do so unequally. Vulnerable communities bear a disproportionate burden, from those living near data centres that drain their freshwater to regions left with the industry’s electronic waste. Nor are generative AI’s benefits equally shared: some gain from increased efficiency while others face the brunt of the consequences. AI’s negative environmental effects are also set to amplify social stressors and further fuel disproportionate health and environmental harms.

Neither downplaying GenAI’s environmental costs nor waiting for a miraculous technological fix, as Sam Altman and other Silicon Valley voices suggest, is a viable option. Instead, sustainability must be embedded directly into GenAI development. By adopting research-backed practices that reduce environmental harm and aligning AI with digital decarbonisation, AI’s environmental burden can be eased. That can only begin, however, once stronger regulation and a clearly defined code of ethics are in place.