AI power consumption now rivals that of some small countries
New figures from the French energy management company Schneider Electric suggest that artificial intelligence (AI) now consumes an estimated 4.3 GW of power globally, almost as much as some small countries.
As technology adoption increases, so will energy consumption. Schneider Electric estimates that AI will be responsible for between 13.5 and 20 GW by 2028, representing a compound annual growth rate of 26 to 36%.
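The growth figures above can be sanity-checked with a quick compound-growth calculation. The sketch below assumes a five-year horizon (2023 to 2028), which the article does not state explicitly but which makes the numbers line up.

```python
# Back-of-envelope check: does 4.3 GW growing to 13.5-20 GW by 2028
# correspond to the quoted 26-36% compound annual growth rate?
# Assumes a 5-year horizon (2023-2028), which is not stated in the article.

def cagr(start_gw: float, end_gw: float, years: int) -> float:
    """Compound annual growth rate, as a percentage."""
    return ((end_gw / start_gw) ** (1 / years) - 1) * 100

low = cagr(4.3, 13.5, 5)   # lower-bound projection, roughly 26%
high = cagr(4.3, 20.0, 5)  # upper-bound projection, roughly 36%

print(f"CAGR range: {low:.0f}% to {high:.0f}%")
```

Both bounds land on the quoted 26 to 36% range, so the projection and the growth rate are internally consistent.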
The research also reveals the power intensity of data centers in general, presenting an eye-opening reality that we need to prepare for in two important ways: upgrading infrastructure and improving efficiency.
AI contributes to data center power consumption
Currently, artificial intelligence accounts for only about 8% of total data center power consumption, which stands at 54 GW. By 2028, the company expects data center demand to reach 90 GW, with artificial intelligence accounting for approximately 15-20% of that total.
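The shares quoted above can be verified with simple arithmetic:

```python
# Rough check of AI's share of data center power, per the article's figures.
# Today: 4.3 GW of a 54 GW total; 2028: 15-20% of a projected 90 GW.

current_share = 4.3 / 54 * 100   # just under 8%
ai_2028_low = 0.15 * 90          # 13.5 GW
ai_2028_high = 0.20 * 90         # 18 GW

print(f"AI share today: {current_share:.0f}%")
print(f"Projected AI draw in 2028: {ai_2028_low:.1f} to {ai_2028_high:.1f} GW")
```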
The company also noted that AI power demand is currently split roughly 20:80 between training and inference, a balance expected to tilt further toward inference in the coming years.
The article also highlights the cooling requirement: excess heat poses a safety risk and can lead to premature component failure. Cooling not only requires additional electricity to power the process, but is often also associated with high water consumption. Data centers have long been criticized for their use of natural resources, which in some cases requires them to divert or otherwise modify waterways. This is because air cooling alone is not sufficient for large clusters, which would otherwise become dangerously hot.
Looking ahead, Schneider says that accurately predicting power consumption will become more challenging as high-energy training gives way to inference workloads, which can have much more variable power demands.
The company also offers advice to data center operators looking to take advantage of the latest AI hardware: moving from conventional 120/208 V distribution to 240/415 V should allow them to accommodate the high power density of AI workloads.
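The reasoning behind that recommendation is that, at a fixed breaker current, power delivered over a circuit scales with voltage, so a 208 V to 415 V move roughly doubles capacity per circuit. The sketch below illustrates this with a 30 A breaker and an 80% continuous-load derating, both assumed for illustration rather than taken from the article.

```python
# Illustrative only: why higher distribution voltage raises per-circuit capacity.
# The 30 A breaker and 0.8 derating factor are assumptions for this example.

import math

def three_phase_kw(line_voltage: float, amps: float, derate: float = 0.8) -> float:
    """Power (kW) of a three-phase circuit at unity power factor: sqrt(3) * V * I."""
    return math.sqrt(3) * line_voltage * amps * derate / 1000

print(f"208 V, 30 A circuit: {three_phase_kw(208, 30):.1f} kW")  # about 8.6 kW
print(f"415 V, 30 A circuit: {three_phase_kw(415, 30):.1f} kW")  # about 17.3 kW
```

Doubling the voltage delivers roughly twice the power over the same conductors, which is what makes dense AI racks feasible without rewiring for much higher currents.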
It is clear that upgrading infrastructure must go hand in hand with a change in the current trajectory: managing energy consumption while making cloud computing and AI workloads ever more efficient.