Thank you for that explanation. My regex-impaired ass thought he wanted to hurt generation[x|y|z].
I’m like “what’d we ever do to you?”
But I thought they were “very fine people™”
Awesome. I’d heard that Pat was one of Redd’s old friends from the “Chitlin’ Circuit” era of comedy, but I’ve never actually seen him do standup.
Probably better to ask on [email protected]. Ollama should be able to give you a decent LLM, and RAG (Retrieval Augmented Generation) will let it reference your dataset.
The only issue is that you asked for a smart model, which usually means a larger one, and the RAG portion consumes additional memory on top of that, which may be more than a typical laptop can handle. Smaller models also have a higher tendency to hallucinate (produce plausible but incorrect answers).
Short answer: yes, you can do it. It’s just a matter of how much RAM you have available and how long you’re willing to wait for an answer.
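To make the RAG idea concrete, here’s a minimal sketch of the retrieval step: find the documents most similar to the question, then stuff them into the prompt as context. This uses a toy bag-of-words similarity so it runs standalone; a real setup would get vectors from an embedding model (e.g. via Ollama) instead, and all names here are illustrative.

```python
from collections import Counter
import math

def embed(text):
    # Toy bag-of-words "embedding": word counts. A real RAG pipeline
    # would call an embedding model here instead.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs, k=2):
    # Rank documents by similarity to the query; keep the top k.
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def build_prompt(query, docs):
    # Prepend the retrieved context so the LLM answers from your data,
    # not just its training set.
    context = "\n".join(retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

docs = [
    "Ollama runs large language models locally.",
    "RAG retrieves relevant documents before generation.",
    "Quantized models trade accuracy for lower RAM use.",
]
print(build_prompt("What does RAG do?", docs))
```

The resulting prompt would then be sent to the local model; grounding the answer in retrieved text is what cuts down on hallucination.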
Fun fact: The US Gov’t never broke even on that bailout https://www.nbcnews.com/businessmain/u-s-exits-gm-stake-taxpayers-lose-10-5-billion-2D11716261