This new method could reduce the energy needs of AI applications by 95%, but could also require entirely new forms of hardware
- Engineers unveil alternative to floating-point multiplication
- New method can reduce AI energy consumption by up to 95%
- But the new calculation method would also require hardware other than existing GPUs
As artificial intelligence (AI) technologies evolve, demand for computing power – and therefore electricity – has skyrocketed, as have concerns about the technology's energy consumption.
Now engineers at BitEnergy AI are offering a potential solution: a new calculation method that could reduce the energy needs of AI applications by as much as 95%.
The technique, called linear-complexity multiplication, works by changing how AI calculations are performed: instead of relying on the traditional floating-point multiplication (FPM), it approximates those multiplications using integer addition.
From floating point multiplication to linear complexity multiplication
FPM is typically used in AI calculations because it allows systems to process very large or very small numbers with high precision. However, it is also one of the most energy-intensive operations in AI processing. The precision that FPM provides is necessary for many AI applications, especially in areas such as deep learning, where models depend on fine-grained numerical calculations.
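The core idea of trading multiplication for addition can be illustrated in a few lines. The sketch below is not BitEnergy AI's published algorithm; it is a classic demonstration of the same principle. Because an IEEE-754 float stores its exponent and mantissa side by side, adding two floats' bit patterns as plain integers (and subtracting the exponent bias, 0x3F800000 for 32-bit floats) approximately multiplies the values, with a small error from the dropped mantissa cross-term:

```python
import struct

def float_to_bits(x: float) -> int:
    """Reinterpret a 32-bit float's IEEE-754 bit pattern as an unsigned integer."""
    return struct.unpack("<I", struct.pack("<f", x))[0]

def bits_to_float(b: int) -> float:
    """Reinterpret an unsigned integer as a 32-bit float's bit pattern."""
    return struct.unpack("<f", struct.pack("<I", b))[0]

def approx_mul(a: float, b: float) -> float:
    """Approximate a * b for positive normal floats with one integer addition.

    Adding the bit patterns adds the exponents and (approximately) the
    mantissas; subtracting the exponent bias 0x3F800000 corrects for the
    bias being counted twice.
    """
    return bits_to_float(float_to_bits(a) + float_to_bits(b) - 0x3F800000)

if __name__ == "__main__":
    for a, b in [(3.0, 4.0), (1.5, 2.25), (0.1, 0.7)]:
        print(f"{a} * {b}: exact={a * b:.6f}, approx={approx_mul(a, b):.6f}")
```

Running this shows exact results when a mantissa cross-term is zero (3.0 × 4.0 = 12.0) and small relative errors otherwise (1.5 × 2.25 yields 3.25 instead of 3.375). The appeal is that an integer add costs far less energy in silicon than a floating-point multiply.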
The researchers claim that the reduced energy consumption comes with no loss of performance in AI applications. Still, although the linear-complexity multiplication method is promising, its adoption faces certain challenges.
An important disadvantage is that the new technique requires different hardware from what is currently used. Most AI applications today run on hardware optimized for floating-point arithmetic, such as GPUs from companies like Nvidia, so the new method would need redesigned hardware to work effectively.
The team notes that the hardware the method requires has already been designed, built, and tested. However, that hardware will need to be licensed, and it is not yet clear how it will be made available to the wider market.
Estimates suggest that ChatGPT alone currently uses approximately 564 MWh of electricity daily, enough to power 18,000 US homes. Some critics predict that within a few years, AI applications could consume around 100 TWh of electricity annually, putting them on par with the energy-hungry Bitcoin mining industry.
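A quick back-of-the-envelope check shows those two figures are consistent. The roughly 10,500 kWh per year average for a US home used below is an assumption based on commonly cited EIA figures, not a number from the article:

```python
# Sanity check on the consumption figures cited above.
chatgpt_daily_mwh = 564
homes = 18_000

per_home_daily_kwh = chatgpt_daily_mwh * 1_000 / homes   # ~31.3 kWh/day
per_home_yearly_mwh = per_home_daily_kwh * 365 / 1_000   # ~11.4 MWh/year

print(f"Implied usage per home: {per_home_daily_kwh:.1f} kWh/day, "
      f"{per_home_yearly_mwh:.1f} MWh/year")
```

The implied ~11.4 MWh per home per year sits close to the assumed US household average, so the two cited figures line up.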
Via TechXplore