Katie Wood: Freediver, Writer, Explorer

Artificial Intelligence (AI) is everywhere. From voice assistants to self-driving cars, AI is revolutionizing industries at an unprecedented pace. But as the technology advances, a darker reality is becoming impossible to ignore: AI’s insatiable energy appetite. Training and running AI models require vast amounts of electricity, and with AI’s expansion, concerns about its environmental footprint are mounting.

So, is this technological progress worth the energy cost? Who is responsible for AI’s massive energy consumption? Do we really need models that require the computational power of a small city? And most importantly, what are corporations doing to address this growing issue? Let’s dive into the numbers and implications of AI’s energy consumption and whether it’s a price we can afford to pay.

Breaking Down the Numbers: AI’s Energy Demands

Training large AI models is an energy-intensive process requiring high-performance computing infrastructure. To put things into perspective:

- GPT-3's training run consumed approximately 1,287 MWh of electricity and emitted about 552 metric tons of CO₂.
- Machine learning accounts for around 15% of Google's total energy use. Given that Google consumed 15.7 terawatt-hours (TWh) in 2021, that's more electricity than some countries use in a year.
- Training BERT, a widely used AI language model, consumed as much energy as an average U.S. household does in 50 days.
- Global data centers, which include AI and other computing needs, were estimated to consume 200 TWh in 2022, roughly 1% of global electricity demand.

These numbers highlight the growing burden AI places on the planet's energy resources.
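
A quick back-of-envelope calculation makes these figures concrete. This sketch takes the GPT-3 numbers above at face value and assumes an average U.S. household uses roughly 10,500 kWh per year (that household figure is my assumption, not from the reported training data):

```python
# Back-of-envelope check on the GPT-3 training figures quoted above.
GPT3_TRAINING_MWH = 1_287            # electricity for one training run (MWh)
GPT3_TRAINING_TONS_CO2 = 552         # reported emissions (metric tons)
US_HOUSEHOLD_KWH_PER_YEAR = 10_500   # assumed average annual household usage

# Implied carbon intensity of the electricity used (kg CO2 per kWh)
kg_per_kwh = (GPT3_TRAINING_TONS_CO2 * 1000) / (GPT3_TRAINING_MWH * 1000)

# How many household-years of electricity one training run represents
household_years = (GPT3_TRAINING_MWH * 1000) / US_HOUSEHOLD_KWH_PER_YEAR

print(f"Implied grid intensity: {kg_per_kwh:.3f} kg CO2/kWh")
print(f"Household-years of electricity: {household_years:.0f}")
```

In other words, a single training run of that scale uses about as much electricity as a hundred-plus homes do in a year, and the implied grid mix is far from carbon-free.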

The Justification Debate: Do AI Benefits Outweigh the Costs?

AI has brought about groundbreaking advancements. From improving medical diagnostics to optimizing logistics and supply chains, AI is solving problems in ways humans never could. But is it justifiable to burn through so much energy to train these models?

On the benefits side, AI-driven smart grids improve electricity distribution and reduce energy waste. AI-powered logistics help companies optimize routes, leading to less fuel consumption and lower emissions. Medical AI systems can process massive amounts of data to detect diseases earlier, potentially saving lives and reducing the need for energy-intensive treatments.

However, AI is also being used for less critical purposes. AI-generated art, chatbot conversations that may never be read, and countless social media filters all contribute to the problem. How much of this energy consumption is essential, and how much is just excess?

Big Tech’s Role in AI’s Energy Consumption

The biggest players in AI are also the biggest contributors to its energy consumption. Tech giants like Google, Microsoft, OpenAI, Amazon, and Meta are at the forefront of AI development, and their operations are anything but energy-efficient.

Google runs vast data centers worldwide, and machine learning accounts for a significant chunk of its total energy use. Microsoft has poured billions into AI research and development, running intensive training workloads across its global network of Azure data centers. Amazon powers its AI-driven recommendations, Alexa voice assistant, and cloud AI services through its energy-hungry AWS infrastructure. Meta relies heavily on AI for content moderation, feed personalization, and advertising algorithms, requiring thousands of servers working around the clock.

With these companies leading AI innovation, they also bear the biggest responsibility for addressing its environmental impact. But are they doing enough?

Corporate Solutions: Can AI Become More Energy-Efficient?

Corporations are aware of AI's energy problem and are implementing measures to mitigate its environmental impact:

- Renewable energy. Many tech giants are investing in renewable energy to power their data centers. Google has pledged to run on carbon-free energy by 2030, and Microsoft has committed to becoming carbon-negative by the same year.
- More efficient training. Researchers are working on ways to train AI models with less energy. Techniques like model pruning, quantization, and distillation allow models to operate with fewer computations while maintaining performance.
- Specialized hardware. Companies are developing chips optimized for AI workloads. Google's Tensor Processing Units (TPUs) and Apple's Neural Engine are designed to handle AI tasks more efficiently than general-purpose GPUs.
- Smarter cooling. Cooling accounts for a significant portion of a data center's energy consumption, and innovations such as liquid cooling and underwater data centers are being explored to reduce power needs.
- AI optimizing AI. Ironically, AI is being used to optimize AI itself. Meta and DeepMind have experimented with AI systems that automatically fine-tune other models to make them more energy-efficient.
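
To give a flavor of one of these techniques, here is a minimal sketch of magnitude-based weight pruning using plain NumPy. The matrix and sparsity level are illustrative assumptions, not any company's actual pipeline; production systems use their framework's built-in pruning utilities:

```python
import numpy as np

def magnitude_prune(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Zero out the smallest-magnitude weights so that `sparsity`
    fraction of them become zero. Fewer nonzero weights means fewer
    multiply-accumulates at inference time on sparse-aware hardware."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)  # number of weights to drop
    if k == 0:
        return weights.copy()
    threshold = np.partition(flat, k - 1)[k - 1]  # k-th smallest magnitude
    pruned = weights.copy()
    pruned[np.abs(pruned) <= threshold] = 0.0
    return pruned

# Illustrative example: prune half the weights of a small random matrix
rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4))
pruned = magnitude_prune(w, sparsity=0.5)
print(f"zeros after pruning: {np.count_nonzero(pruned == 0)} of {pruned.size}")
```

The idea scales up: if half of a model's weights can be zeroed without hurting accuracy, sparse-aware hardware can skip those computations and save a corresponding share of the energy.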

Are Companies Doing Enough, or Just Greenwashing?

While these efforts are commendable, the reality is that AI's energy consumption is still increasing. As models grow in complexity and demand skyrockets, efficiency improvements may not be able to offset the growing appetite for computing power. Are companies actually reducing their energy use, or just making it appear more sustainable?

As AI continues to evolve, it raises the ethical question of whether it is being used for the right reasons. There is a difference between developing AI to diagnose cancer and using AI to generate funny cat pictures, yet both require massive computational resources. Do we need models capable of generating endless streams of synthetic content, flooding the internet with text, images, and videos? Is AI's primary function becoming more about profit than solving real-world problems? Should there be regulations to limit energy-hungry AI applications that provide little societal value?

The Future of AI: Energy Crisis or Sustainable Revolution?

AI isn’t going anywhere—it will only grow in capability and influence. But how it grows is up to us. If we don’t actively work toward sustainable AI, we risk creating a system where innovation comes at the cost of environmental destruction. Governments, researchers, and corporations need to collaborate on policies that ensure AI development remains sustainable. Investments in low-power AI, alternative computing architectures, and responsible AI deployment will be critical in ensuring that AI’s benefits outweigh its costs.
