Friday, April 24 2026 11:22
Alina Hovhannisyan

Expert: Every time we scale AI, we increase energy consumption and worsen environmental impacts

ArmInfo. Every time we scale AI, we increase energy consumption and worsen environmental impacts. This was stated by Olena Bura, Managing Partner and CEO of the Swiss consulting company Enzentra GmbH, during the RES 2026 panel session "Application of modern digital solutions and artificial intelligence technologies for environmental issues."

She noted that training an AI model generates roughly 12 tons of greenhouse gas emissions, the equivalent of about 15 round-trip transatlantic flights. "And this is just the training phase," the CEO of Enzentra GmbH emphasized.

According to the expert, no company can yet answer the question of how much AI costs and what value is created per unit of emissions. Many organizations have already developed responsible approaches to using AI and are building the corresponding infrastructure, while also addressing risk management, data protection, and security.

However, according to Olena Bura, this list of measures is not exhaustive and does not fully answer the question of what resources, and in what quantities, AI requires in order to operate.

She identified three main sources of global pressure that must be considered. First, Bura pointed to the lack of infrastructure, which is becoming a strategic bottleneck; related to this are pressure on capital allocation, high infrastructure costs, and reputational risks.

"Clients and investors are already aware of the environmental consequences of AI. Organizations that ignore this factor will inevitably face risks of infrastructure instability, as well as environmental and economic shocks. Long-term sustainability requires recognizing that infinite scaling is impossible and has consequences," she stated. To move from merely responsible use of AI to environmentally responsible use, the expert said, it is necessary to think not only about maximizing model accuracy but also about efficiency: energy consumption must be measured and optimized.

Furthermore, Bura noted, it is important to evaluate not only the economic return on investment but also the environmental one. "We need to move from static management to a more 'metabolic' one, flexible and responsive to constant change. It's important to understand the costs of creating and operating AI infrastructure.

Moreover, scaling shouldn't be blind; it needs to be done intelligently. Not every model improvement requires retraining, so it's important to distinguish between what's truly necessary and what can be abandoned," she emphasized.

Turning to environmental factors, Bura said it is worth considering existing "accelerators": approaches and practices that help reduce impact. She outlined several strategies drawn from large tech companies. The first, classic model, exemplified by OpenAI, DeepMind, and Meta, focuses primarily on increasing capacity and rapid growth. The second model, according to the expert, is optimization: companies strive to improve the efficiency of existing solutions and reduce losses, taking environmental factors into account, but so far without a major redesign. Bura cited Microsoft and NVIDIA as examples. "Here, the primary focus is on efficiency," she noted.

However, she identified the most sustainable model as one in which AI is developed with environmental constraints built in as an integral part of the architecture.