
New method significantly reduces AI energy consumption


The SuperMUC-NG at the Leibniz Supercomputing Centre is the eighth-fastest computer in the world. Credit: Veronika Hohenegger, LRZ

AI applications such as large language models (LLMs) have become an integral part of our everyday lives. The computing, storage and transmission capacities they require are provided by data centers that consume enormous amounts of energy. In Germany alone, this amounted to about 16 billion kWh in 2020, or around 1% of the country's total energy consumption. For 2025, this figure is expected to rise to 22 billion kWh.

The arrival of more complex AI applications in the coming years will substantially increase the demands on data center capacity. These applications will use up enormous amounts of energy for the training of neural networks. To counteract this trend, researchers at the Technical University of Munich (TUM) have developed a training method that is 100 times faster while achieving accuracy comparable to existing procedures. This will significantly reduce the energy consumed by training.

They presented their research at the Conference on Neural Information Processing Systems (NeurIPS 2024), held in Vancouver Dec. 10–15.

The functioning of neural networks, which are used in AI for tasks such as image recognition or language processing, is inspired by the way the human brain works. These networks consist of interconnected nodes called artificial neurons. The input signals are weighted with certain parameters and then summed up. If a defined threshold is exceeded, the signal is passed on to the next node.
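The weighted-sum-and-threshold behavior of a single artificial neuron can be sketched in a few lines of Python (the function name and the specific numbers below are illustrative, not from the paper):

```python
import numpy as np

def neuron(inputs, weights, bias, threshold=0.0):
    """Weight each input signal, sum them up, and pass the signal
    on only if the defined threshold is exceeded."""
    total = np.dot(inputs, weights) + bias
    return total if total > threshold else 0.0

# Three input signals, weighted and summed:
# 1.0*0.5 + 2.0*(-0.2) + 0.5*0.8 + 0.1 = 0.6 > 0, so the neuron fires.
signal = neuron(np.array([1.0, 2.0, 0.5]), np.array([0.5, -0.2, 0.8]), 0.1)
```

Real networks stack many such neurons into layers and replace the hard threshold with smooth activation functions, but the weighted-sum principle is the same.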

To train the network, the initial selection of parameter values is usually randomized, for example, using a normal distribution. The values are then incrementally adjusted to gradually improve the network's predictions. Because of the many iterations required, this training is extremely demanding and consumes a great deal of electricity.
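This iterative scheme (random initialization followed by many small corrective steps) is what gradient-based training looks like in miniature; the toy problem below is purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression problem: learn y = 3x with a single weight.
x = rng.normal(size=100)
y = 3.0 * x

w = rng.normal()    # parameters start as random draws from a normal distribution
lr = 0.1
for step in range(500):                    # many iterations -- the costly part
    grad = np.mean(2 * (w * x - y) * x)    # gradient of the mean squared error
    w -= lr * grad                         # small incremental adjustment
```

Each pass nudges the weight slightly toward better predictions; at the scale of a large language model, billions of such updates over billions of parameters are what drive the energy bill.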

Parameters selected according to probabilities

Felix Dietrich, a professor of Physics-enhanced Machine Learning, and his team have developed a new method. Instead of iteratively determining the parameters between the nodes, their approach uses probabilities. Their probabilistic method is based on the targeted use of values at critical locations in the training data where large and rapid changes in values are taking place.
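The article does not spell out the construction, so the sketch below is only an illustration of the general idea under stated assumptions: hidden-layer weights are sampled from pairs of training points, with pairs in steep regions of the data (large, rapid changes) drawn more often, and the output layer is then fitted in a single least-squares solve instead of by gradient descent.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy 1-D target with a steep region around x = 0.
x = np.linspace(-2, 2, 200)
y = np.tanh(5 * x)

# Draw candidate pairs of data points and weight each pair by how
# rapidly the target changes between them.
n_neurons = 50
i = rng.integers(0, len(x), size=2000)
j = rng.integers(0, len(x), size=2000)
keep = np.abs(x[j] - x[i]) > 1e-3          # discard degenerate pairs
i, j = i[keep], j[keep]
slope = np.abs((y[j] - y[i]) / (x[j] - x[i]))
p = slope / slope.sum()                    # steeper pairs are more probable
pick = rng.choice(len(i), size=n_neurons, replace=False, p=p)

w = 1.0 / (x[j[pick]] - x[i[pick]])        # weight scale from the pair's spacing
b = -w * x[i[pick]]                        # bias centers the neuron at one point

# No iterative training: the output layer is one linear solve.
H = np.tanh(np.outer(x, w) + b)            # hidden activations, shape (200, 50)
coef, *_ = np.linalg.lstsq(H, y, rcond=None)
mse = np.mean((H @ coef - y) ** 2)
```

Because the expensive inner loop of gradient descent disappears entirely, the cost collapses to one sampling pass plus one linear solve, which is the kind of saving the 100-fold speedup claim refers to.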

The objective of the current study is to use this approach to learn energy-conserving dynamical systems from data. Such systems change over the course of time in accordance with certain rules and are found in climate models and in financial markets, for example.
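As a concrete example of what "energy-conserving dynamical system" means (this example is ours, not from the study): a frictionless harmonic oscillator evolves by fixed rules while its total energy stays constant over time.

```python
import numpy as np

def step(q, p, dt=1e-3):
    """One symplectic-Euler step for a frictionless harmonic oscillator
    with state (position q, momentum p) and energy H = (p**2 + q**2) / 2."""
    p = p - dt * q
    q = q + dt * p
    return q, p

q, p = 1.0, 0.0
e0 = (p**2 + q**2) / 2
for _ in range(10_000):       # simulate 10 time units
    q, p = step(q, p)
e1 = (p**2 + q**2) / 2        # energy drifts only negligibly
```

A network trained on trajectories of such a system should respect the same conservation rule, which is what makes these systems a natural testbed for physics-enhanced machine learning.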

“Our method makes it possible to determine the required parameters with minimal computing power. This can make the training of neural networks much faster and, as a result, more energy efficient,” says Dietrich. “In addition, we have seen that the accuracy of the new method is comparable to that of iteratively trained networks.”

Citation:
New method significantly reduces AI energy consumption (2025, March 6)
retrieved 6 March 2025
from https://techxplore.com/news/2025-03-method-significantly-ai-energy-consumption.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no
part may be reproduced without the written permission. The content is provided for information purposes only.


