That ‘cheap’ open-source AI model is actually burning through your compute budget
Summary
Open-source AI models consume significantly more computational resources than their closed counterparts when performing identical tasks, according to research by AI firm Nous Research.
The study, which examined 19 AI models, found that open models use 1.5 to 4 times more tokens (the basic units of AI computation) than closed models, with the inefficiency especially pronounced in large reasoning models.
This extra token usage by open models "can undermine their cost advantages," the report said, casting doubt on the assumption that open-source models are more economical.
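The cost argument comes down to simple arithmetic: the total bill is the per-token price multiplied by tokens consumed, so a model that is cheaper per token can still cost more overall if it uses several times as many tokens. A minimal sketch, using made-up prices (only the 1.5x–4x token multipliers come from the study):

```python
# Hypothetical illustration: a lower per-token price can still produce a
# higher total bill when token usage is 1.5-4x greater, as the study found.
# The dollar figures below are invented examples, not real vendor pricing.

def total_cost(price_per_million: float, tokens: int) -> float:
    """Cost of one task, given a per-million-token price and tokens used."""
    return price_per_million * tokens / 1_000_000

# Closed model: pricier per token, but frugal with tokens.
closed = total_cost(price_per_million=10.0, tokens=1_000_000)

# Open model at 2.5x lower price, but at the study's 4x token multiplier...
open_4x = total_cost(price_per_million=4.0, tokens=4_000_000)

# ...versus the same open model at the study's low-end 1.5x multiplier.
open_1_5x = total_cost(price_per_million=4.0, tokens=1_500_000)

print(f"closed: ${closed:.2f}, open (4x): ${open_4x:.2f}, "
      f"open (1.5x): ${open_1_5x:.2f}")
```

At the high end of the observed range, the nominally cheaper open model ends up more expensive per task; at the low end, it remains cheaper, which is why the study's finding complicates rather than overturns the cost comparison.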
The finding follows warnings from OpenAI CEO Sam Altman that the cost of running AI models is unsustainable.
The study also noted a diverging trend: closed-source models have been optimised to use fewer tokens and so reduce inference costs, while newer versions of open models have increased their token usage in pursuit of better reasoning performance.