[–] wholookshere@lemmy.blahaj.zone 3 points 2 months ago* (last edited 2 months ago) (1 children)

Literally anything that requires knowing facts to inform writing. This is something LLMs are incapable of doing right now.

Just ask how many R's are in "strawberry" and watch ChatGPT get it wrong.
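
For comparison, plain string counting gets it right instantly (a trivial sketch, just to ground the claim):

```python
# "strawberry" = s-t-r-a-w-b-e-r-r-y, so there are 3 R's.
print("strawberry".count("r"))  # -> 3
```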

[–] boonhet@lemm.ee 1 points 2 months ago (1 children)

Okay what the hell is wrong with it

It took me three tries to convince it that there are 3 R's in strawberry...

[–] wholookshere@lemmy.blahaj.zone 1 points 2 months ago* (last edited 2 months ago)

Because that's not how LLMs work.

When you form a sentence, you start with an intent.

An LLM starts from the meaning of what you gave it and tries to express something similar back to you.

Notice that intent and meaning aren't the same thing. Fact-checking has nothing to do with what a word means, so how can it understand what is true?

All it did was take the meaning of "looking for a number" and "strawberries" and run its best guess from that.
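
To make that concrete: a minimal sketch of the token view, assuming OpenAI's tiktoken library (the encoding name and the exact splits below are illustrative assumptions, not something from this thread). The model receives subword token IDs, never individual letters, so "how many R's" isn't a question it can read off its input.

```python
# Minimal sketch, assuming OpenAI's tiktoken library (pip install tiktoken).
# An LLM sees token IDs for subword chunks, never the letters inside them.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")  # a GPT-4-era encoding

token_ids = enc.encode("strawberry")
pieces = [enc.decode([t]) for t in token_ids]

print(token_ids)  # a few integer IDs, not 10 letters
print(pieces)     # subword chunks, e.g. ['str', 'aw', 'berry']
```

Counting the R's hidden inside those chunks is exactly the kind of fact the token stream doesn't spell out, so the model can only guess.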