Hugging Face: 5 ways enterprises can slash AI costs without sacrificing performance
Summary
Sasha Luccioni, AI and climate lead at Hugging Face, says data scientists should focus on making AI smarter, not simply on increasing computational power, according to a VentureBeat report.
Task-specific or distilled models can require as little as 20%-30% of the energy of a general-purpose model, making them the more efficient choice when the workload is a single, well-defined task.
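The report does not include code, but a minimal sketch of what swapping in a distilled, task-specific model can look like with the transformers library (the model name is one published distilled checkpoint, not one cited in the report):

```python
from transformers import pipeline

# A small, distilled model fine-tuned for a single task (sentiment analysis).
# It handles this one job in a fraction of the memory and energy a large
# general-purpose model would need.
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

print(classifier("The quarterly results exceeded expectations."))
# [{'label': 'POSITIVE', 'score': 0.99...}]
```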
Open-source models are more efficient because they do not have to be trained from scratch; Hugging Face has also launched an "AI Energy Score" to rate the efficiency of different models, positioned as an "Energy Star" equivalent for AI.
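The reuse pattern behind that claim is starting from openly released pretrained weights and fine-tuning only a small task head, rather than pretraining a model yourself. A minimal sketch, again with transformers (the checkpoint name is illustrative):

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Reuse open pretrained weights instead of training from scratch: only the
# small classification head is freshly initialized, so fine-tuning needs a
# tiny fraction of the compute of full pretraining.
model_name = "distilbert-base-uncased"  # any open checkpoint on the Hub works
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)
```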
Default settings increase costs and compute requirements because models end up doing more work than the task requires; adjusting batch sizes and memory usage can yield significant efficiency gains.
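A minimal sketch of the kind of adjustment meant, using the transformers pipeline API: loading weights in half precision to cut memory roughly in half, and batching inputs instead of processing them one at a time. The model name and batch size are illustrative, and actual gains depend on the hardware and workload:

```python
import torch
from transformers import pipeline

# Two simple levers beyond the defaults: half-precision weights (memory) and
# batched inference (throughput).
pipe = pipeline(
    "text-classification",
    model="distilbert-base-uncased-finetuned-sst-2-english",
    torch_dtype=torch.float16,  # fp16 instead of the default fp32
    device=0,                   # assumes a GPU is available
)

texts = ["example input"] * 256
results = pipe(texts, batch_size=32)  # batched rather than one-by-one
```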
Shifting away from the mindset that "more compute is better" can lead to better results through smarter architectures and better-curated data, rather than simply bigger GPU clusters.