Thank you for that explanation. My regex-impaired ass thought he wanted to hurt generation[x|y|z].
I'm like "what'd we ever do to you?"
Switched from Kubuntu to Mint + KDE last week. Very happy indeed.
At least the article points out that this is a Wall Street valuation, meaning it's meaningless in reality, the company doesn't have that much money, nor is it actually worth that much. In reality, Nvidia's tangible book value (plant, equipment, brands, logos, patents, etc.) is $37,436,000,000.
$37,436,000,000 / 29,600 employees = $1,264,729.73 per employee
Which isn't bad considering the median salary at Nvidia is $266,939 (up 17% from last year).
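The per-employee figure checks out, for anyone who wants to verify the division (a quick sanity check, using the numbers from the comment above):

```python
# Sanity check: tangible book value divided by headcount.
book_value = 37_436_000_000  # Nvidia tangible book value, USD
employees = 29_600

per_employee = book_value / employees
print(f"${per_employee:,.2f}")  # → $1,264,729.73
```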
It sounds like the processor is the real limitation. Plenty of stuff from Windows XP era and before ran in less than 512MB.
"Sir, they've given us a list of their demands, but I can't read this ... this chicken-scratch."
"As God is my witness..."
When I was in Korea, I learned that chickens can (sort of) fly. They can flap their wings hard enough to get from the ground to a tree branch maybe 8 feet or so off the ground, and safely back down.
And I've heard chickens tasted better back in the old days. A bird that eats grubs, worms, grasshoppers, frogs, snakes, etc. tastes different than one that just eats chicken feed all day.
Renegade Cut had a pretty good video about her. https://piped.video/watch?v=HC4K1mx0SPQ. I was mostly wrong about her. I didn't realize they became friends later. That's a pretty big arc for only one season.
Maybe we don't live in the worst possible universe. Madonna and Will Smith in the Matrix, everybody using the Hulk Hogan Grill, Stallone as Axel Foley, OJ as the Terminator. I guess I'm ok with where we are now.
Awesome. I'd heard that Pat was one of Redd's old friends from the "Chitlin' Circuit" era of comedy, but I've never actually seen him do standup.
Probably better to ask on !localllama@sh.itjust.works. Ollama should be able to give you a decent LLM, and RAG (Retrieval Augmented Generation) will let it reference your dataset.
The only issue is that you asked for a smart model, which usually means a larger one, and the RAG portion consumes even more memory, which may be more than a typical laptop can handle. Smaller models also have a higher tendency to hallucinate, i.e. produce incorrect answers.
Short answer - yes, you can do it. It's just a matter of how much RAM you have available and how long you're willing to wait for an answer.
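If it helps, the core idea behind RAG is simple: embed your documents, retrieve the chunks most similar to the question, and paste them into the prompt before sending it to the model. Here's a toy sketch of just the retrieval step, using word-overlap cosine similarity instead of a real embedding model (the function names and scoring here are illustrative, not Ollama's actual API):

```python
import math
from collections import Counter

def embed(text):
    """Toy 'embedding': bag-of-words counts (real RAG uses an embedding model)."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(question, chunks, k=1):
    """Return the k chunks most similar to the question."""
    q = embed(question)
    return sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)[:k]

chunks = [
    "Chickens can flap up to a low tree branch.",
    "Ollama serves local LLMs over a simple HTTP API.",
    "Nvidia's tangible book value is about $37 billion.",
]
context = retrieve("how do I run a local LLM?", chunks)[0]
prompt = f"Answer using this context:\n{context}\n\nQuestion: how do I run a local LLM?"
# That prompt would then go to your local model, e.g. via Ollama.
print(context)
```

In a real setup you'd swap the bag-of-words scoring for embeddings from an actual model and store them in a vector database, but the retrieve-then-prompt flow is the same.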
He was a great comedian, but his "I don't vote" shtick really fucked this country over. Moderate, reasonable people stayed home, thinking "it doesn't matter," and conservative dickheads took over at the polls in 1996.