[–] asuka@sh.itjust.works 0 points 8 months ago (2 children)

I think it's a big mistake to conclude that, because the most basic LLMs are just autocompletes, or because LLMs can hallucinate, what big LLMs do doesn't constitute "thinking". No, GPT-4 isn't conscious, but it very clearly "thinks".

It's started to feel to me like current AIs are reasonable recreations of parts of our minds. It's as if they're our ability to visualize, to verbalize, and to an extent, to reason (at least the way we intuitively reason, not formally), but separated from the "rest" of our thought processes.

[–] fidodo@lemmy.world 3 points 8 months ago

Depends on how you define thinking. I agree that LLMs could be a component of thinking, specifically knowledge and recall.

[–] erwan@lemmy.ml 2 points 8 months ago

Yes, as Linus Torvalds said, humans also think like autocomplete systems.