
Few people had heard of AI-powered consumer tools five years ago. Last year, however, saw a boom in artificial intelligence (AI) tools such as OpenAI's ChatGPT, Google’s Gemini and Meta’s Llama, while YouTube launched a series of generative AI tools of its own.

While we’ve seen a boost in consumer usage of AI-backed tools, have you considered where the computational power to support such rapid expansion comes from? This is where data centres come into play.

According to Cisco, a data centre is a physical facility that organisations use to house their critical applications and data. The facility is designed to harbour “a network of computing and storage resources that enable the delivery of shared applications and data.”

A data centre houses the infrastructure needed to support information flow and storage, including routers, switches, firewalls, storage systems, servers, and application-delivery controllers.

However, the International Energy Agency (IEA) reported earlier this year that electricity demand from global data centres is likely to double between 2022 and 2026, largely owing to the growth of AI.

AI's computational power drives surging energy demands

Cory Lopes-Warfield, editor-in-chief at Tech For Good and co-founder and CXO of eight startups, told EM360Tech that AI is trained on data, and needs data that’s not shared or common to differentiate itself from other models, reduce hallucinations, and more. Having proprietary data is what makes models unique, and better.

“However, AI is machine learning (ML) that just predicts the next token - that’s the compute,” he added. “It’s a series of complex mathematical equations, and they all need to be calculated using compute. That said, chips are getting way better and compute faster using less energy.”
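To see why next-token prediction is so compute-hungry, consider a deliberately tiny, hypothetical sketch (not any real model): a language model's final step multiplies a hidden-state vector against a weight row for every token in its vocabulary, then applies softmax to pick the most likely next token. The vocabulary, vectors and weights below are invented for illustration; real models repeat this kind of multiply-add arithmetic billions of times per generated token, which is exactly the compute Lopes-Warfield describes.

```python
import math

# Toy, hypothetical illustration: next-token prediction reduces to
# matrix arithmetic. Each vocabulary token gets a score (logit) from
# a dot product, and softmax turns the scores into probabilities.
vocab = ["data", "centre", "energy", "AI"]   # invented 4-token vocabulary
hidden = [0.2, -0.5, 0.9]                    # model's current hidden state
weights = [                                  # one invented weight row per token
    [0.1, 0.3, -0.2],
    [0.4, -0.1, 0.5],
    [-0.3, 0.2, 0.8],
    [0.7, 0.6, -0.4],
]

# One dot product per vocabulary token: this multiply-add work is the "compute"
logits = [sum(h * w for h, w in zip(hidden, row)) for row in weights]

# Softmax converts logits into a probability distribution over tokens
exps = [math.exp(x) for x in logits]
total = sum(exps)
probs = [e / total for e in exps]

# The predicted next token is the most probable one
next_token = vocab[probs.index(max(probs))]
print(next_token)
```

In a production model the vocabulary runs to tens of thousands of tokens and the hidden state to thousands of dimensions, and this arithmetic is stacked across dozens of layers — which is why faster, more efficient chips matter so much for energy use.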

The power needed to run data centres, driven by AI growth, is producing emissions that contribute to climate change.

Google’s greenhouse gas (GHG) emissions have soared by a whopping 48% in just five years as a consequence of building data centres for artificial intelligence.

The tech giant stated in its annual environmental report that emissions rose by 13% in 2023 alone, reaching 14.3 million metric tons. This could be a setback to the firm’s goal of reaching net zero across all operations and value chains by 2030.

Google says, “Our net-zero goal is supported by an ambitious clean energy goal to operate our offices and data centres on 24/7 carbon-free energy, such as solar and wind.”

Hindering clean energy goals

Circling back to the role of AI in data centres, Bloomberg reported in June 2024 that the dramatic increase in power demands from Silicon Valley’s growth-at-all-costs approach to AI also threatens to upend the energy transition plans of entire nations and the clean energy goals of trillion-dollar tech companies.


John Ketchum, chief executive officer of NextEra Energy Inc., told Bloomberg that power demand in the US is projected to increase by 40% over the next 20 years, mainly because of booming demand for data centres.

He says AI is behind the boom, owing to the energy required both to train models and to run inference, the process by which AI draws conclusions from data it hasn’t seen before. “It’s 10 to 15 times the amount of electricity.”

Warfield told EM360Tech that the most resource-intensive AI technology is quantum AI, owing to the complexity of quantum computers and the need to implement them correctly. Text-to-video is another resource-intensive AI-powered technology.

However, the entrepreneur added that AI’s carbon footprint is continually being reduced. “It can be used to forecast and otherwise avert climate risks and the risks technology poses to the environment.”

When asked about the increasing demand for data centres due to AI impacting carbon emissions and overall energy consumption, Warfield said that it depends wholly on the data centres.

“TermoBuild is an example of buildings that generate their own energy and heating and cooling and cloud storage - data centres built by them don’t impact carbon outputs and so on. Therefore, it’s incumbent upon the data centres to ‘do it right,’” Warfield emphasised.

TermoBuild is an engineering services company based in the US and Canada that delivers sustainable, high-efficiency buildings. It is known for low-energy building solutions that aim to decarbonise construction and rethink conventional building design, with projects in Canada, India, and the USA.

The company achieves this by integrating sustainable solutions into building design, such as Thermal Storage Ventilation, which is estimated to save between 35% and 50% of operational energy compared with standard building systems.

AI to the rescue: Supporting sustainability efforts

“Groq LPU chips (made in the USA), small language models, UBC (universal basic compute), and autonomous agents that can be deployed locally,” Warfield says, are some of the strategies he has observed mitigating the environmental impact of AI-driven data centres.

The AI expert also emphasised that the technology can be used to our advantage. “AI can run the centres and be programmed to run them sustainably and with Sustainable Development Goals, and much more.”

Image Credit: Beenish | Adobe Stock

When asked about businesses' ethical responsibility to manage the environmental impact caused by AI technologies, Warfield, who has co-founded several tech startups, said that, absolutely, firms are “100%” responsible.

“Companies will ruin humanity and this planet with their new innovation without oversight, accountability, guardrails, and ethics prioritized,” he said. “AI can actually be net-positive (it can).”

He added that, based on his experience, there is a need to implement blockchain. “Tie the AI tokens to the blockchain using smart contracts.”

To balance the benefits of AI with sustainability goals, Warfield advises leaning into the technology, having an intention and mindful roadmap, and hiring experts like him “to get it right” rather than risk getting it wrong or underleveraging it.

Earlier this week Google announced plans to build nuclear reactors in collaboration with Kairos Power to power the energy consumption of its data centres and drive AI innovation, EM360Tech reported.

The company says it is undertaking the world’s first corporate agreement to purchase nuclear energy from multiple small modular reactors (SMRs), which will be developed by Kairos Power.

Michael Terrell, senior director for energy and climate at Google, stated that the initial phase of work is intended to bring Kairos Power’s first SMR online quickly and safely by 2030. This would be followed by additional reactor deployments through 2035.

He further noted the need for new electricity sources to support AI technologies that are powering major scientific advances, improving services for businesses and customers, and driving national competitiveness and economic growth.

“This agreement helps accelerate a new technology to meet energy needs cleanly and reliably, and unlock the full potential of AI for everyone,” he said.

In a new study, scientists have developed an AI model called Perseus to cut the excessive energy consumption that goes into training large language models (LLMs). It aims to reduce the “energy bloating” in LLMs.

Upon testing, the study found that the Perseus training model reduced energy consumption by up to 30% when training large models, including GPT-3 and BLOOM, “without any throughput loss or hardware modification.”