You would think that a "language model" would have "connotation" high on its list of priorities, since connotation is a huge part of the form and function of language.
Not how it works.
It’s just a fancy version of that “predict the next word” feature smartphones have. Like if you just kept tapping the suggested next word.
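For the curious, here’s a minimal sketch of that loop in Python, assuming the Hugging Face transformers library and the public gpt2 checkpoint (both my choice for illustration, not anything the comment specifies). The model only ever scores “which token comes next,” and you get text by taking the top suggestion over and over:

```python
# A toy "keep tapping the suggested next word" loop.
# Assumes: pip install transformers torch, and the public gpt2 checkpoint.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

text = "The quick brown fox"
for _ in range(10):
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits  # scores for every possible next token
    # Greedily take the single most likely token, like always accepting
    # the keyboard's top suggestion.
    next_id = int(logits[0, -1].argmax())
    text += tokenizer.decode([next_id])
print(text)
```

Real chat products sample from that distribution instead of always taking the top token, but the underlying operation is the same next-token prediction.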
They don’t even have interpretable parameters, just billions of black-box hidden weights.
I know, I was pointing out the irony.
I'm convinced its only purpose is actually to give tech C-levels and VPs some bullshit to say for roughly the next 18-36 months, now that "blockchain" and "pandemic disruption" are dead.
Exactly correct, I agree. LLMs will change the world, but 90% of purported use cases are nothing but hot air.
But when you can tell your phone “go find a picture of an eggplant, put a smiley face on it, and send it to Bill”, that’s going to be pretty neat. And it’s coming in the next decade. Of course that requires a different model than we have now (text to instruction, not text to text). But it’s coming.
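To make the “text to instruction” idea concrete, here’s a hypothetical sketch of what such a model’s output could look like: a structured plan of actions rather than more prose. Every action name and field in this schema is invented for illustration; none of it is a real API.

```python
import json

request = "go find a picture of an eggplant, put a smiley face on it, and send it to Bill"

# What a text-to-instruction model might emit for the request above.
# Hypothetical schema: "$0" and "$1" refer to the outputs of earlier steps.
plan = [
    {"action": "search_photos", "query": "eggplant"},
    {"action": "add_sticker", "photo": "$0", "sticker": "smiley_face"},
    {"action": "send_message", "recipient": "Bill", "attachment": "$1"},
]
print(json.dumps(plan, indent=2))
```

The point of the different model type is exactly that shift: the output is something a phone can execute, not text for a human to read.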
LLMs don't know what words mean, much less the connotations they have.