Is there a silver bullet to help data centers cope with the energy demands of AI?

There’s no denying the impact AI is already having on data centers and energy consumption, and if nothing is done, the situation will only get worse. A recent IDC report shows that as AI adoption explodes, electricity consumption in data centers is expected to more than double between 2023 and 2028. AI-powered workloads alone are projected to grow at a staggering compound annual growth rate (CAGR) of 44.7% through 2027, with energy demand reaching a massive 146.2 TWh. The implications are stark: data centers, which already account for 46% of corporate energy expenditure, could soon become unsustainable.

That obviously can’t happen. But with AI workloads rapidly increasing, data centers must evolve quickly while managing the prospect of a new energy crisis, as electricity prices rise amid geopolitical instability in the Middle East. The growing influence of AI tools across industries – from healthcare to financial services – is undeniable. However, an AI-powered search consumes 100 times more energy than a conventional search, while training a basic AI model can consume enough energy to power 20,000 homes for six months.

James Sturrock

Director of Systems Engineering at Nutanix.

A solution?

A report from Atlantic Ventures, Improving Sustainability in Data Centers 2024, suggests a solution, showing how next-generation data center architectures such as hyperconverged infrastructure (HCI) can reduce energy consumption, lower carbon emissions and cut costs across the EMEA region. The report finds that modernizing data centers with HCI could save up to 19 million tCO2e in the region in just six years – equivalent to the emissions of almost 4.1 million cars. It could also save €25 billion by 2030 through improved energy and operational efficiency.

As organizations integrate AI into their operations and come to terms with the scale of its energy consumption, HCI can reduce the risk of rising costs and keep sustainability goals within reach. But it’s not just about HCI; it’s about how organizations work with AI. The focus must shift to optimizing where and how AI workloads are processed, using modernization to manage workloads more intelligently – an approach that makes far more sense than simply continuing to build more energy-efficient data centers.

This is important because we need to account for how AI works and where demand for power will grow. For example, while many organizations fixate on the energy required to train foundation AI models, it is inference – the real-time decision making that AI performs – where most of the energy is spent.

Foundation model training happens once, but inference is a continuous process that runs millions of times, especially in AI-driven applications such as fraud detection or predictive maintenance. Optimizing inference, especially at the edge, could be the silver bullet data centers need to manage AI power needs more efficiently.
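As a rough illustration of why inference dominates over time, the sketch below compares a one-off training cost with cumulative per-query inference energy. All figures are hypothetical assumptions for the sake of the arithmetic, not numbers from the IDC or Atlantic Ventures reports:

```python
# Hypothetical break-even sketch: one-off training energy vs. cumulative
# inference energy. Both constants are illustrative assumptions.

TRAINING_ENERGY_KWH = 1_000_000   # assumed one-off cost of a training run
INFERENCE_ENERGY_KWH = 0.003      # assumed energy per inference query


def cumulative_inference_kwh(queries: int) -> float:
    """Total inference energy consumed after a given number of queries."""
    return queries * INFERENCE_ENERGY_KWH


def breakeven_queries() -> int:
    """Number of queries after which inference has consumed as much
    energy as the original training run."""
    return int(TRAINING_ENERGY_KWH / INFERENCE_ENERGY_KWH)


if __name__ == "__main__":
    q = breakeven_queries()
    print(f"Inference matches training energy after {q:,} queries")
```

Under these assumed figures, a few hundred million queries – easily reached by a fraud-detection service scoring every transaction – already matches the entire training bill, and every query after that only widens the gap.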

Switch to sustainable energy

As the IDC report suggests, more data center providers should turn to renewable energy sources, but they should also rethink their infrastructure. Hybrid cloud, edge computing, and on-premise systems provide a way to balance AI’s energy demands by more intelligently distributing workloads.

For example, processing data closer to the source with edge computing reduces the energy required to send large data sets back and forth to centralized servers. Meanwhile, hybrid cloud environments can handle compute-intensive AI training tasks, allowing real-time inference to occur on-premises or at the edge.

Edge computing also plays a crucial role by processing data closer to where it is generated, such as in stores or IoT devices. This not only improves response times, but also significantly reduces the energy required for inference.

Modern infrastructure is critical to managing AI power demands, and a containerized platform, designed to handle both CPUs and GPUs, is necessary to efficiently run AI workloads. Storage also becomes critical, as AI typically deals with unstructured data such as files and objects. By investing in high-performance storage systems and optimized compute stacks, companies can significantly reduce the energy required to run AI applications.

Furthermore, the ability to measure and manage energy consumption is critical. Platforms that provide real-time visibility into energy consumption enable data centers to optimize every stage of AI processing – from training to inference. Even a 10% improvement in energy efficiency can lead to significant savings, according to the IDC report.
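To put that 10% figure in perspective, a simple calculation shows what such an efficiency gain means at data-center scale. The baseline consumption and electricity price below are hypothetical assumptions chosen for illustration, not IDC data:

```python
# Illustrative annual savings from a 10% energy-efficiency improvement.
# Baseline consumption and price are hypothetical assumptions.

ANNUAL_CONSUMPTION_MWH = 50_000   # assumed annual draw of a mid-size facility
PRICE_PER_MWH_EUR = 150           # assumed average electricity price
EFFICIENCY_GAIN = 0.10            # the 10% improvement cited above


def annual_savings_eur(consumption_mwh: float,
                       price_eur: float,
                       gain: float) -> float:
    """Money saved per year by cutting consumption by `gain` (a fraction)."""
    return consumption_mwh * gain * price_eur


if __name__ == "__main__":
    saved = annual_savings_eur(ANNUAL_CONSUMPTION_MWH,
                               PRICE_PER_MWH_EUR,
                               EFFICIENCY_GAIN)
    print(f"Estimated annual savings: €{saved:,.0f}")
```

Even under these modest assumptions, a single facility saves on the order of €750,000 a year – multiplied across an operator’s whole estate, the case for real-time energy visibility makes itself.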

Real-time decision making

Rather than focusing solely on the enormous energy costs of training foundation models, companies should pay attention to how often these models are used in real-time decision making. Compressing models, refining their structure, and running them on platforms designed for efficiency will be critical to reducing AI’s overall energy footprint. For example, we’ve developed container platforms and high-performance storage solutions specifically tailored for AI inference, giving companies a way to optimize their AI workloads and moderate their energy needs.

The true cost of AI is no longer just about performance and innovation, but about the energy required to sustain it. As organizations ramp up their AI initiatives, the question is not whether they can afford to invest in AI, but whether they can afford the energy it consumes. With hybrid infrastructure and a focus on efficient inference, companies have a way to moderate this energy wave. Otherwise, those who ignore this reality will soon find their data centers at the mercy of an AI energy crisis.


This article was produced as part of Ny BreakingPro’s Expert Insights channel, where we profile the best and brightest minds in today’s technology industry. The views expressed here are those of the author and are not necessarily those of Ny BreakingPro or Future plc. If you are interested in contributing, you can read more here: https://www.techradar.com/news/submit-your-story-to-techradar-pro
