Turns out, you don’t need a trillion-parameter model to get things done. Smaller, focused AIs are stealing the spotlight for real business impact.

When it comes to artificial intelligence, bigger often looks better. Tech giants tout models with billions, or even trillions, of parameters, promising that these digital behemoths can do everything from solving equations to writing code to producing near-scientific research. The idea behind the promise is clear: if you want cutting-edge results, reach for the cannon.

But bigger is not always better (no pun intended), because bigger usually means more complexity and less flexibility. Companies are slowly realizing that a trillion-parameter model isn’t always the right solution for their business; not every AI problem needs a giant LLM. A more focused approach promises better results. Small, specialized models, tuned for specific tasks on relevant data, are gaining traction: fewer resources, better customization, tighter control. What’s not to love?

Still, there is a misalignment between what actually benefits businesses and what the giants promise. Since the release of ChatGPT in November 2022, the models have only gotten bigger and bigger.

So argues Ilia Badeev, head of data science at Trevolution Group.