As enterprises of all sizes integrate artificial intelligence (AI) into their operations, the sheer weight of powering this rapidly advancing technology is placing a heavy burden on data centres. Data centre power consumption has risen sharply as a result of widespread AI adoption.
In fact, according to James Schneider, a senior equity research analyst at Goldman Sachs Research, global power demand from data centres is forecast to increase 50 per cent by 2027 and by as much as 165 per cent by the end of the decade (compared with 2023).
Consequently, companies are expanding their data centre footprints with facilities specifically designed to support AI-driven computational workloads.
This article explains what an AI data centre is, how much power it consumes, how it can be made more energy-efficient, and which are the best AI data centres in the world.
What is an AI Data Centre?
An AI data centre is a facility purpose-built to handle the immense computational demands of artificial intelligence (AI) workloads.

Typical data centres support general-purpose computing; AI data centres, by contrast, are designed to power the intensive processing needed to train and deploy complex machine learning (ML) models.
Such a facility houses high-performance hardware like GPUs (graphics processing units) and TPUs (tensor processing units), which are essential for accelerating AI computations. AI data centres also deploy advanced cooling solutions, such as liquid cooling and immersion cooling, to maintain stable and optimal operating temperatures.
How Much Power is Needed for AI Data Centres?
AI data centres require an enormous amount of power, which has become a growing concern. AI workloads demand intense computing, and that computing draws very large amounts of electricity; as a result, AI data centres are expected to consume vast amounts of energy, measured in terawatt-hours, annually.
Reports indicate that computing power and server resources, along with cooling systems, are the primary drivers of electricity consumption in data centres, each accounting for roughly 40 per cent of total power usage.
According to RAND, unprecedented demand for AI data centres is straining US power grids. Recent trends indicate that AI data centres could require 68 gigawatts of power globally by 2027, almost equivalent to California's total power capacity in 2022.
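To put a sustained gigawatt figure like that in context, a rough back-of-envelope conversion turns power draw into annual energy consumption. The sketch below is purely illustrative arithmetic based on the 68 GW projection cited above; the utilisation factor is an assumption, not a figure from RAND:

```python
HOURS_PER_YEAR = 24 * 365  # 8,760 hours

def annual_energy_twh(power_gw: float, utilisation: float = 1.0) -> float:
    """Annual energy (TWh) for a given sustained power draw (GW)."""
    return power_gw * HOURS_PER_YEAR * utilisation / 1000  # GWh -> TWh

# If AI data centres drew 68 GW around the clock:
print(f"{annual_energy_twh(68):.0f} TWh/year")        # ~596 TWh/year
# At an assumed 60% average utilisation:
print(f"{annual_energy_twh(68, 0.6):.0f} TWh/year")   # ~357 TWh/year
```

Even at partial utilisation, the result lands in the hundreds of terawatt-hours per year, which is why grid operators treat these projections so seriously.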
Goldman Sachs Research likewise forecasts that global power demand from data centres will increase by 50 per cent by 2027, and by as much as 165 per cent by the end of the decade (compared with 2023). This spotlights the rapid acceleration of power consumption driven by AI.
This growth is fuelled in particular by the proliferation of generative AI, which requires significant processing power and, in turn, sophisticated cooling systems to keep operating temperatures in check.
This is also why scientists, engineers and, increasingly, enterprises are directing their resources toward developing energy-efficient AI hardware and software, as well as exploring alternative energy sources to mitigate the environmental impact.
Read: What is Data Centre Automation? Why is it important?
How Can AI Data Centres Become More Energy-Efficient?
To enhance the energy efficiency of AI data centres, businesses should pursue a multifaceted strategy that addresses both hardware and software. This includes designing hardware for energy efficiency, such as specialised AI chips and advanced cooling systems.
Furthermore, software optimisations, like refining AI algorithms and utilising edge computing, are crucial for reducing computational load. Critically, businesses should also prioritise the integration of renewable energy sources, customising their approach to the specific location and available resources of each data centre.
Techniques like model pruning (removing unnecessary connections), quantisation (reducing the precision of numerical values) and knowledge distillation (transferring knowledge from a large model to a smaller one) can substantially reduce computational load.
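As a minimal, framework-agnostic sketch of the quantisation idea, the NumPy snippet below maps float32 weights to 8-bit integers using a single symmetric scale factor. This is illustrative only; production toolchains such as PyTorch or TensorFlow Lite add calibration, per-channel scales and fused integer kernels:

```python
import numpy as np

def quantise_int8(weights: np.ndarray):
    """Map float weights to int8 with one symmetric scale factor."""
    scale = np.abs(weights).max() / 127.0  # largest magnitude maps to +/-127
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantise(q: np.ndarray, scale: float) -> np.ndarray:
    """Approximate the original floats from the int8 representation."""
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.standard_normal(1000).astype(np.float32)

q, scale = quantise_int8(w)
error = np.abs(w - dequantise(q, scale)).max()

print(f"storage: {w.nbytes} B -> {q.nbytes} B")  # 4000 B -> 1000 B
print(f"max reconstruction error: {error:.4f}")
```

The 4x storage reduction translates directly into less memory traffic per inference, which is one of the main levers quantisation offers for cutting a model's energy footprint.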
AI itself is also being used to optimise AI infrastructure: machine learning systems analyse data centre workflows and adjust them to make processes more efficient. It does, however, raise the question of whether using energy-hungry AI to tackle climate change risks contributing to the very problem it is meant to solve.
Rolf Bienert, Technical & Managing Director of the OpenADR Alliance told Shubhangi Dua, Podcast Host at EM360Tech that data centres should be adopting innovative solutions such as smart grids, microgrids, and virtual power plants. These hold immense potential for managing energy distribution efficiently and sustainably.
Also Read: The Top 10 Data Centre Infrastructure Management Solutions for 2020