Researchers Develop Energy Efficient AI Model
June 28, 2024
Researchers at the University of California, Santa Cruz have developed a way to run a large language model on just 13 watts of power, a significant improvement over traditional methods that draw around 700 watts. The team achieved this efficiency by representing the model's weights as ternary values (-1, 0, or +1), which replaces costly multiplication with simple addition and subtraction and makes the computation far less hardware-intensive.
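The core idea can be sketched in a few lines. This is a minimal illustration, not the researchers' implementation: when every weight in a matrix is restricted to -1, 0, or +1, a matrix-vector product needs no multiplications at all, since each weight either adds the input, subtracts it, or skips it. The function name `ternary_matvec` is a hypothetical example.

```python
import numpy as np

def ternary_matvec(W, x):
    """Compute W @ x where every entry of W is -1, 0, or +1,
    using only additions and subtractions (no multiplications)."""
    out = np.zeros(W.shape[0], dtype=float)
    for i in range(W.shape[0]):
        for j in range(W.shape[1]):
            if W[i, j] == 1:
                out[i] += x[j]   # +1 weight: add the input
            elif W[i, j] == -1:
                out[i] -= x[j]   # -1 weight: subtract the input
            # 0 weights are skipped entirely, saving work
    return out

# Example: the result matches an ordinary matrix multiply,
# but no multiply instructions were needed.
W = np.array([[1, 0, -1],
              [-1, 1, 0]])
x = np.array([2.0, 3.0, 5.0])
print(ternary_matvec(W, x))
```

In hardware, the same property means multiplier circuits (a major source of power draw) can be replaced by much cheaper adders, which is why the approach translates into large energy savings.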