Researchers develop low-energy AI that runs LLMs on a lightbulb’s worth of power

Researchers at the University of California, Santa Cruz have developed a large language model (LLM) that is 50 times more efficient than traditional models, running on custom hardware that consumes only 13 watts of power — roughly what a lightbulb draws. The model performs competitively with established models such as Meta’s Llama. The team cut energy consumption by eliminating matrix multiplication: weights are restricted to ternary values (−1, 0, 1), so each layer can be computed with additions and subtractions rather than multiplications.
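To illustrate the idea, here is a minimal sketch of how a ternary-weight layer avoids multiplication. This is a plain-Python illustration of the general technique, not the researchers’ implementation (their kernels run on custom FPGA hardware); the function name and shapes are hypothetical.

```python
def ternary_matvec(W, x):
    """Compute W @ x where every weight is -1, 0, or 1, using only adds.

    Because the weights are ternary, a 'multiply' by a weight is either
    an addition (w == 1), a subtraction (w == -1), or a no-op (w == 0).
    """
    y = []
    for row in W:
        acc = 0.0
        for w, xi in zip(row, x):
            if w == 1:
                acc += xi      # add instead of multiply
            elif w == -1:
                acc -= xi      # subtract instead of multiply
            # w == 0: the weight is pruned; contributes nothing
        y.append(acc)
    return y

# Example: a 2x3 ternary weight matrix applied to a small input vector.
W = [[ 1, 0, -1],
     [-1, 1,  1]]
x = [2.0, 3.0, 4.0]
print(ternary_matvec(W, x))  # [2.0 - 4.0, -2.0 + 3.0 + 4.0] = [-2.0, 5.0]
```

On general-purpose CPUs this saves little, but in dedicated hardware an adder is far cheaper in silicon area and energy than a multiplier, which is why the approach pairs naturally with FPGAs.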

By running the LLM on custom hardware built around field-programmable gate arrays (FPGAs), the researchers achieved significant efficiency gains, and they believe further optimizations are possible. The innovation matters in a field where AI’s energy demand continues to rise.

The researchers hope that big players in the industry will take note of their work and consider implementing similar strategies to improve efficiency in AI models in the future. This breakthrough represents a step forward in creating more energy-efficient and sustainable AI technologies.

Source link: https://hothardware.com/news/llms-on-a-lightbulbs-power
