The dark side of AI: a question to ChatGPT consumes as much as a light bulb left on for a minute

Artificial intelligence is a bit like that brilliant but messy roommate: it helps you, but in the meantime it leaves behind a mess that is hard to ignore. On the one hand, it is a precious ally in the fight against climate change, capable of optimizing electricity grids, reducing waste and making industrial processes more efficient. On the other hand, its hunger for energy is growing at rates that alarm sustainability experts.

According to the International Energy Agency (IEA), the global electricity consumption of data centers could more than double in four years, rising from 460 TWh in 2022 to over 1,000 TWh by 2026. The increase is driven precisely by the expansion of artificial intelligence and the ever-growing number of requests processed by machine learning systems and generative models.
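For readers who want to see the arithmetic, growing from 460 TWh to 1,000 TWh over four years implies an average annual growth rate of roughly 21%. Here is a minimal sketch of that calculation, using only the figures quoted above (the code itself is illustrative):

```python
# Implied compound annual growth rate (CAGR) from the IEA figures quoted above.
start_twh = 460.0     # global data center electricity consumption in 2022 (TWh)
end_twh = 1_000.0     # projected consumption by 2026 (TWh)
years = 4

cagr = (end_twh / start_twh) ** (1 / years) - 1
print(f"Implied annual growth rate: {cagr:.1%}")  # ~21.4% per year
```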

Every query sent to a system like GPT or Gemini has a real energy cost: up to 0.43 Wh per single request, the equivalent of an LED light bulb kept on for almost a minute. Longer and more complex questions can reach 2–4 Wh.
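To get a feel for those numbers, the sketch below converts the per-query figures into "bulb-minutes" and into a daily total for a hypothetical query volume. The bulb wattage and the query count are illustrative assumptions, not figures from the article:

```python
# Back-of-the-envelope conversion of per-query AI energy use.
# The bulb wattage and query volume below are assumptions for illustration only.

WH_PER_QUERY_SIMPLE = 0.43   # Wh, upper-bound estimate for a simple query (from the article)
WH_PER_QUERY_COMPLEX = 4.0   # Wh, upper end for long/complex queries (from the article)
BULB_WATTS = 25.0            # assumed bright LED bulb; a typical 10 W LED would run ~2.5x longer

def bulb_minutes(query_wh: float, bulb_watts: float = BULB_WATTS) -> float:
    """Minutes a bulb of `bulb_watts` could stay on with `query_wh` watt-hours."""
    return query_wh / bulb_watts * 60.0

def daily_energy_kwh(queries_per_day: float, wh_per_query: float) -> float:
    """Total daily energy in kWh for an assumed volume of queries."""
    return queries_per_day * wh_per_query / 1000.0

if __name__ == "__main__":
    print(f"Simple query  -> {bulb_minutes(WH_PER_QUERY_SIMPLE):.1f} bulb-minutes")
    print(f"Complex query -> {bulb_minutes(WH_PER_QUERY_COMPLEX):.1f} bulb-minutes")
    # Hypothetical workload: 1 million simple queries per day
    print(f"1M simple queries/day -> {daily_energy_kwh(1_000_000, WH_PER_QUERY_SIMPLE):,.0f} kWh/day")
```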

Sobering figures: even the digital world, often perceived as “immaterial”, rests on physical infrastructure that consumes enormous quantities of electricity and water for cooling.

Data centers and AI: in Italy, electricity demand has grown by 50% in 4 years

If we look at the Italian situation, the numbers are no more comforting.
According to Confartigianato, between 2019 and 2023 electricity demand linked to IT services and data centers increased by 50%, with a 144% jump in the consumption of data-processing infrastructure.
In 2023 alone, total consumption reached 509.7 GWh of electricity, with Lombardy, Lazio, Emilia-Romagna and Piedmont alone accounting for 85% of the total.

In parallel, the indirect emissions (Scope 3) of big tech companies have exploded: between 2020 and 2023, Microsoft, Amazon and Meta recorded an average increase of 150%.
Google’s emissions rose by 48% compared to 2019, while Microsoft’s grew by 29% across the entire aggregate (Scope 1–3).

In short, digital innovation runs faster than the ability to keep it sustainable.

How to make artificial intelligence sustainable?

The key to reducing the environmental impact of artificial intelligence does not lie in slowing down innovation, but in measuring and managing it responsibly. AI-related emissions now fall under Scope 3 (indirect emissions from digital services) for the companies that use it, and under Scope 1 and 2 for those that operate the infrastructure or proprietary models.

For this reason, businesses are called upon to adopt digital carbon accounting systems capable of precisely measuring the energy footprint of their AI services. Only by knowing “where and how much is emitted” can effective reduction and offsetting strategies be designed, in line with ESG principles.
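In practice, the simplest form of digital carbon accounting multiplies the energy consumed by each AI service by a grid emission factor and sums the results. The sketch below shows that idea only; the service names, energy figures and emission factor are hypothetical, and real accounting systems use location- and time-specific factors:

```python
# Minimal sketch of digital carbon accounting for AI services:
# estimated emissions = energy consumed x grid emission factor.
# All service names, energy figures and the factor below are illustrative assumptions.

GRID_EMISSION_FACTOR_KG_PER_KWH = 0.30  # assumed kg CO2e per kWh; varies by country and year

# Hypothetical annual energy use per AI service, in kWh.
ai_services_kwh = {
    "customer_support_chatbot": 12_000,
    "document_summarization": 4_500,
    "internal_code_assistant": 2_300,
}

def emissions_kg(energy_kwh: float, factor: float = GRID_EMISSION_FACTOR_KG_PER_KWH) -> float:
    """Estimated emissions in kg CO2e for a given energy consumption."""
    return energy_kwh * factor

total_kg = 0.0
for service, kwh in ai_services_kwh.items():
    kg = emissions_kg(kwh)
    total_kg += kg
    print(f"{service}: {kwh:,} kWh -> {kg:,.0f} kg CO2e")

print(f"Total estimated AI footprint: {total_kg / 1000:.1f} t CO2e")
```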

As Edward Bertin, Head of Business Development & Growth at ClimateSeed – a startup that supports companies on decarbonization paths – points out:

AI can become an ally of the green transition only if managed in a transparent and sustainable way. It is essential to measure and monitor the impact of digital systems to ensure that innovation accelerates, not hinders, climate goals.

The real challenge will therefore be to reconcile technology and sustainability, building a conscious digital transformation capable of reducing consumption, optimizing resources and honoring climate commitments.

Ultimately, AI isn’t the problem: it’s how we choose to power it.