links, etc
Smaller, Cheaper, Faster, Sober | Drew Breunig
i’ve been intrigued by small models that run fast and are increasingly useful for mission-specific tasks. this seems like the thing that will prove durably useful amid all the AI flotsam.
Enterprises are forgoing general models for open ones trained for single tasks.
And it’s not just foundation training, but fine-tuning too. Teams are learning that small models trained for a single purpose can outperform the best general models out-of-the-box. Who cares how good GPT-4o is at answering random questions when all we need a model to do is one specific task in a pipeline? By training smaller models, enterprises get to differentiate themselves and run cheaper, more accurate pipelines.
Unintentionally troubleshooting a new way to filter traffic
i won’t spoil it for you. it’s a great read on some of the non-obvious behaviors that lurk in configurations.
one of the better reads on the topic of media toolery