95% Less Energy Consumption in Neural Networks Can Be Achieved. Here's How [Video]

AI is booming, and so is energy consumption. According to reports, ChatGPT likely uses more than half a million kilowatt-hours of electricity to respond to some 200 million requests a day. In other words, ChatGPT consumes as much energy daily as roughly 17,000 households in the USA.

A research paper titled 'Addition is All You Need for Energy-efficient Language Models' notes that multiplying floating-point numbers consumes significantly more energy than integer operations. The paper states that multiplying two 32-bit floating-point numbers (fp32) costs four times more energy than adding two fp32 numbers, and 37 times more energy than adding two 32-bit integers.

The researchers have proposed a new technique called linear-complexity multiplication (L-Mul), which addresses the energy cost of floating-point multiplication in large neural networks by approximating it with integer additions.

Before L-Mul, neural networks typically performed computations using standard floating-point multiplication, which is computationally expensive and energy-intensive, especially for LLMs, which typically operate over billions …
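The core idea of L-Mul can be illustrated with a minimal Python sketch. This is not the paper's implementation: the function name `lmul` is invented here, the decomposition uses high-level float operations rather than the bit-level integer arithmetic an accelerator would perform, and the `offset` constant `2**-4` is an assumption corresponding to a 4-bit correction term. The sketch only shows how the mantissa product can be replaced by a mantissa sum plus a fixed offset.

```python
import math

def lmul(x, y, offset=2**-4):
    """Sketch of linear-complexity multiplication: approximate x * y
    by adding mantissas and exponents instead of multiplying mantissas."""
    if x == 0.0 or y == 0.0:
        return 0.0
    sign = -1.0 if (x < 0) != (y < 0) else 1.0
    # math.frexp gives v = m * 2**e with m in [0.5, 1); rewrite this as
    # v = (1 + f) * 2**(e - 1) with fraction f in [0, 1), IEEE-754 style.
    xm, xe = math.frexp(abs(x))
    ym, ye = math.frexp(abs(y))
    xf, xe = 2 * xm - 1, xe - 1
    yf, ye = 2 * ym - 1, ye - 1
    # Exact mantissa product: (1 + xf) * (1 + yf) = 1 + xf + yf + xf * yf.
    # L-Mul drops the xf * yf term and adds a small fixed offset instead,
    # so the mantissa path needs only additions.
    return sign * (1 + xf + yf + offset) * 2.0 ** (xe + ye)
```

For example, `lmul(3.0, 5.0)` returns 14.5 against a true product of 15.0, a relative error of about 3%; the paper's argument is that, at the precision LLM inference actually needs, this small approximation error is an acceptable trade for replacing energy-hungry multiplications with additions.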
