The Energy Behind the Cloud: The Story of Data Center Power

Zoran Gacovski · Jun 24, 2025 · 7 minute read

Digital services are now indispensable to our daily lives. From streaming and mobile apps to e-commerce and AI, they deliver entertainment, communication, commerce, and productivity on demand. Accessible anytime, anywhere, these technologies drive unparalleled convenience, efficiency, and global connectivity, making them vital tools for individuals, businesses, and economies alike.

But what powers it all? Reliable data center power is paramount for continuous operation. Downtime means costly service disruption, financial losses, and data corruption. To guarantee uptime and sustain high-end computing, cloud providers like Kamatera supply robust data center power infrastructure.

Data centers worldwide consumed approximately 460 terawatt-hours (TWh) of electricity in 2022, accounting for about 2% of global electricity demand. With the rise of AI and cloud computing, global data center energy use is projected to double by 2026, potentially reaching over 1,000 TWh annually. This exponential growth means data centers could soon consume more electricity than entire countries like Germany or Japan, making energy efficiency and sustainable power sourcing critical priorities for the data center industry.
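To put these figures in perspective, here's a quick back-of-envelope sketch (the country totals are rough public estimates, used purely for illustration):

```python
# Back-of-envelope comparison of data center electricity use.
# All figures are rough public estimates, for illustration only.

dc_2022_twh = 460    # global data center consumption, 2022 (IEA estimate)
dc_2026_twh = 1000   # projected consumption, 2026

germany_twh = 500    # approximate annual national consumption
japan_twh = 940      # approximate annual national consumption

# Implied compound annual growth rate over the four-year span
years = 2026 - 2022
cagr = (dc_2026_twh / dc_2022_twh) ** (1 / years) - 1
print(f"Implied growth: {cagr:.1%} per year")            # ~21.4% per year

print(f"2026 projection vs. Germany: {dc_2026_twh / germany_twh:.1f}x")  # 2.0x
print(f"2026 projection vs. Japan:   {dc_2026_twh / japan_twh:.2f}x")    # 1.06x
```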

What’s driving data center demand?

Data centers are highly energy-intensive facilities, and several key components drive their power consumption. The primary consumers of electricity are the data center’s cloud servers themselves, which handle data processing (CPUs), storage (hard drives), and transmission. Processors run continuously and convert nearly all the electricity they draw into heat, making them, in effect, expensive space heaters.

Cooling systems rank as the second-largest power drain, consuming massive amounts of electricity through air conditioning units, chillers, and liquid cooling technologies that prevent catastrophic overheating. These systems run 24/7 to maintain precise temperatures, often consuming 30-40% of a data center’s total energy budget and representing millions in annual operating costs.

Lighting, although a smaller share of a data center’s overall power usage, is still necessary for operational visibility and safety. Data center operators commonly deploy energy-efficient LED lighting, but even modest per-fixture loads add up across large facilities.

Security systems also draw power continuously. These include surveillance cameras, biometric access controls, alarm systems, and monitoring stations that safeguard the integrity of the data center’s infrastructure.

Together, these systems (servers, cooling, lighting, and security) account for the majority of data center energy usage, and their number and size determine a facility’s total power consumption. Efficient management and the adoption of green technologies in each of these areas are crucial for reducing operational costs and minimizing environmental impact.
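One standard way to summarize this breakdown is power usage effectiveness (PUE): total facility energy divided by the energy that actually reaches the IT equipment. Here's a minimal sketch, assuming illustrative component shares consistent with the 30-40% cooling figure above:

```python
# Power Usage Effectiveness (PUE) from illustrative component shares.
# PUE = total facility energy / IT equipment energy; 1.0 is the ideal.
# The shares below are assumptions for illustration, not measured data.

shares = {
    "it_equipment":  0.58,  # servers, storage, networking
    "cooling":       0.35,  # chillers, air handlers, liquid cooling
    "lighting":      0.02,
    "security_misc": 0.05,  # cameras, access control, monitoring
}

assert abs(sum(shares.values()) - 1.0) < 1e-9

pue = 1.0 / shares["it_equipment"]
print(f"PUE: {pue:.2f}")  # ~1.72; efficient hyperscale sites report ~1.1-1.2
```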

AI’s energy price tag

The recent AI boom has driven significant growth in data center power demand. ChatGPT’s electric appetite is substantial: each query can consume up to 10 times more energy than a Google search, owing to the vast computational power required to run large language models, which rely on energy-intensive data centers to generate responses in real time. Improving operational efficiency remains a key challenge.
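Here's a rough sketch of what that multiplier means at scale. The per-query figures are commonly cited public estimates and the daily query volume is an assumption, so treat the output as illustrative:

```python
# Rough scale of per-query energy costs. The per-query figures are
# commonly cited public estimates; the query volume is an assumption.

google_wh_per_query = 0.3                          # ~0.3 Wh per web search
chatgpt_wh_per_query = 10 * google_wh_per_query    # ~3 Wh per LLM query

queries_per_day = 200_000_000                      # illustrative volume

daily_mwh = chatgpt_wh_per_query * queries_per_day / 1_000_000  # Wh -> MWh
print(f"Daily energy:  {daily_mwh:,.0f} MWh")                # ~600 MWh/day
print(f"Yearly energy: {daily_mwh * 365 / 1000:,.0f} GWh")   # ~219 GWh/year
```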

AI processes (training and inference) have different power profiles. Training large models like GPT requires immense computational power over weeks or months, consuming millions of kilowatt-hours. The process of model inference, though less energy-intensive per task, scales dramatically with user demand. Unlike the one-time cost of training, inference dictates the sustained energy consumption of deployed AI systems.
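A back-of-envelope comparison makes this concrete. The training figure below is a widely cited external estimate for a GPT-3-class model; the inference figures are the same illustrative assumptions as above:

```python
# When does cumulative inference energy overtake one-time training energy?
# Training figure: widely cited estimate for a GPT-3-class model.
# Inference figures: illustrative assumptions.

training_mwh = 1_300              # one-time training cost, ~1.3 GWh
wh_per_query = 3                  # assumed energy per inference query
queries_per_day = 200_000_000     # assumed sustained query volume

inference_mwh_per_day = wh_per_query * queries_per_day / 1_000_000
days_to_match = training_mwh / inference_mwh_per_day

print(f"Inference: {inference_mwh_per_day:.0f} MWh/day")        # ~600 MWh
print(f"Matches training cost in ~{days_to_match:.1f} days")    # ~2.2 days
```

At that scale, sustained inference overtakes the entire one-time training cost within days, which is why deployment, not training, dominates an AI system's lifetime energy use.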

The global chip wars center on high-demand GPUs, which are vital for AI. Shortages, driven by surging AI workloads, have collided with power limitations, as the global data center market strains to supply both chips and electricity. Top tech firms compete for limited GPU stock, while regions with scarce energy infrastructure struggle to host or expand AI operations, making performance and energy efficiency strategic priorities.

AI-powered data center energy demand is projected to nearly triple by 2030, driven by surging AI compute loads. Epoch AI estimates that top-tier AI supercomputers could need around 9 gigawatts, comparable to the power demand of a large city. In response, major tech firms are investing heavily in nuclear power, especially small modular reactors, to secure reliable, low-carbon baseload electricity and reduce dependence on a strained power grid.

The new geography of power

The geography of power is shifting with the digital economy. Iceland has become a crypto mining haven due to its abundant geothermal and hydroelectric energy, cool climate, and political stability, ideal for energy-hungry mining rigs (computer systems specifically designed for cryptocurrency mining). 

Meanwhile, the US state of Texas is experiencing a data center boom driven by cheap wind and natural gas power, a deregulated energy market, and supportive policies. Its vast land, tax incentives, and growing infrastructure make it attractive to AI companies. As energy demand surges, regions offering reliable, affordable energy are emerging as key global hubs for data processing, marking a new era where energy access shapes the future of technology.

The proximity paradox forces a brutal trade-off: locate data centers near users for lightning-fast response times, or chase cheap, clean power in remote locations. Urban sites deliver millisecond advantages but drain budgets with premium energy costs, while distant facilities slash expenses but sacrifice speed. As data center capacity explodes globally, companies must choose between performance and profitability.
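The physics behind this trade-off is easy to quantify: light in optical fiber covers roughly 200 km per millisecond, so distance converts directly into round-trip latency. The site distances and electricity prices below are illustrative assumptions:

```python
# Proximity paradox in numbers: latency cost of distance vs. energy savings.
# Distances and electricity prices are illustrative assumptions.

FIBER_KM_PER_MS = 200   # light in fiber covers ~200 km per millisecond

def round_trip_ms(distance_km: float) -> float:
    """Best-case round-trip propagation delay over fiber, in milliseconds."""
    return 2 * distance_km / FIBER_KM_PER_MS

sites = [
    ("urban, near users",     50, 0.15),  # premium metro power price
    ("remote, cheap power", 1500, 0.05),  # e.g., hydro-rich region
]

for name, km, price_kwh in sites:
    print(f"{name}: ~{round_trip_ms(km):.1f} ms RTT at ${price_kwh}/kWh")
# urban, near users:   ~0.5 ms RTT at $0.15/kWh
# remote, cheap power: ~15.0 ms RTT at $0.05/kWh
```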

As AI and cloud demand soar, data centers are straining local power grids. In some regions, they’ve delayed housing developments or forced major grid upgrades. Their massive, constant energy draw can exceed local capacity, pushing utilities to prioritize data center load over community needs and raising concerns about infrastructure fairness and resilience.

Infrastructure reality check

The power grid is failing. Aging infrastructure buckles under surging power demand, renewable integration challenges, and extreme weather events. The U.S. faces a $2 trillion price tag by 2030 just to achieve basic reliability and resilience. While the upfront investment is staggering, it’s a bargain compared to the catastrophic costs of widespread blackouts, climate disasters, and derailed decarbonization efforts.

Battery storage is crucial for integrating renewables like solar and wind, which produce power intermittently. However, current storage capacity lags far behind demand, creating a bottleneck. High costs, limited materials (like lithium), and slow infrastructure rollout hinder progress. This storage deficit becomes even more critical as explosive data center growth amplifies energy demands, requiring massive battery banks to ensure uninterrupted power during renewable energy fluctuations. Without large-scale storage, excess renewable energy goes unused, limiting grid reliability and large-scale clean energy adoption.
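A minimal sizing sketch shows the scale involved; the facility size, lull duration, and usable-capacity fraction are all assumptions chosen for illustration:

```python
# How much battery storage does a data center need to ride through
# a lull in renewable generation? All inputs are illustrative assumptions.

facility_mw = 100         # continuous draw of a large data center
lull_hours = 4            # assumed gap in wind/solar output to bridge
depth_of_discharge = 0.9  # usable fraction of nameplate battery capacity

required_mwh = facility_mw * lull_hours / depth_of_discharge
print(f"Required storage: ~{required_mwh:.0f} MWh")   # ~444 MWh

# For scale: that is more than many utility-scale grid batteries
# in service today, and it covers just one facility for four hours.
```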

Fusion power is edging from fantasy toward reality. ITER, the international mega-project built to demonstrate the feasibility of fusion energy, now targets the start of research operations in the mid-2030s, with sustained burning plasma toward the end of that decade. Private firms like Tokamak Energy target pilot reactors by 2034 and grid-connected plants in the late 2030s. While technical challenges remain, this perennial 20-year promise could finally deliver carbon-free energy.

How can we do better? Renewables and emerging technologies

Renewable and green initiatives in data center powering aim to reduce environmental impact, lower carbon emissions, and improve energy efficiency. With data centers consuming vast amounts of electricity, integrating renewable energy sources like solar, wind, hydro, and geothermal helps reduce dependence on fossil fuels. Many data centers now source part or all of their energy from on-site or off-site renewable power plants.

Underwater and Arctic data centers offer natural cooling and energy efficiency, reducing reliance on power-hungry air conditioning. Microsoft’s underwater Project Natick reported lower server failure rates and sustainability gains. Arctic sites, like those in Norway, exploit cold climates and renewable hydropower. These extreme locations help meet growing data demands with greener, more cost-effective operations.

Edge computing is reshaping digital infrastructure by deploying thousands of micro data centers closer to users and devices. This reduces latency, boosts real-time processing, and eases strain on central data hubs. From smart cities to autonomous vehicles, these compact centers enable faster, localized AI and IoT applications, driving the next wave of connected innovation.

Satellite data processing shifts computation to space, reducing reliance on Earth-based data centers and their massive power needs. Because data is processed directly onboard the satellite, only essential insights are transmitted back to Earth, saving bandwidth and energy. This approach supports real-time applications like climate monitoring and defense while easing Earth’s growing digital power demands.

Finally, your smartphone is increasingly able to contribute to distributed computing. When not actively in use, these devices can contribute spare processing power to blockchain networks, AI computations, and decentralized applications. This distributed approach reduces the burden on traditional data centers while creating more resilient digital infrastructure. By harnessing the collective power of millions of consumer devices, we’re building a more efficient and democratized computing ecosystem.

Zoran Gacovski

Dr. Zoran Gacovski is a full professor at Mother Teresa University in Skopje, Macedonia. His areas of research are information systems, intelligent control, machine learning, and human-computer interaction.

Prof. Gacovski served as a Fulbright postdoctoral fellow in 2002 at Rutgers University.

He has published more than 300 technical IT articles, as well as books available on Amazon. His publication record can be found on Google Scholar, ResearchGate, and Academia.edu.
