[–] nednobbins@lemmy.zip 0 points 1 day ago (1 children)

> Human learning requires understanding, which AI is not capable of.

How could anyone know this?

Is there some test of understanding that humans can pass and AIs can't? And if there are humans who can't pass it, do we consider them unintelligent?

We don't even need to set the bar that high. Is there some definition of "understanding" that humans meet and AIs don't?

[–] gaja@lemm.ee 1 points 1 day ago (1 children)

It's literally in the phrase "statistically optimized." This is like arguing for your preferred deity: it'll never be proven, but we have evidence to draw our own conclusions. As it is now, AI doesn't learn or understand the same way humans do.

[–] nednobbins@lemmy.zip 1 points 1 day ago (1 children)

So you’re confident that human learning involves “understanding” which is distinct from “statistical optimization”. Is this something you feel in your soul or can you define the difference?

[–] gaja@lemm.ee 0 points 1 day ago* (last edited 1 day ago) (1 children)

Yes. You learned not to touch a hot stove either from experience or a warning. That fear was cemented by your understanding that it would hurt. An AI will tell you not to touch a hot stove (most of the time) because the words "hot," "stove," "pain," etc. pop up in its dataset together millions of times. As things are, they're barely comparable. The only reason people keep arguing is because the output is very convincing. Go and download pytorch and read some stuff, or Google it. I've even asked deepseek for you:

Can AI learn and understand like people?

AI can learn and perform many tasks similarly to humans, but its understanding is fundamentally different. Here’s how AI compares to human learning and understanding:

1. Learning: Similar in Some Ways, Different in Others

  • AI Learns from Data: AI (especially deep learning models) improves by processing vast amounts of data, identifying patterns, and adjusting its internal parameters.
  • Humans Learn More Efficiently: Humans can generalize from few examples, use reasoning, and apply knowledge across different contexts—something AI struggles with unless trained extensively.

2. Understanding: AI vs. Human Cognition

  • AI "Understands" Statistically: AI recognizes patterns and makes predictions based on probabilities, but it lacks true comprehension, consciousness, or awareness.
  • Humans Understand Semantically: Humans grasp meaning, context, emotions, and abstract concepts in a way AI cannot (yet).

3. Strengths & Weaknesses

AI Excels At:

  • Processing huge datasets quickly.
  • Recognizing patterns (e.g., images, speech).
  • Automating repetitive tasks.

AI Falls Short At:

  • Common-sense reasoning (e.g., knowing ice melts when heated without being explicitly told).
  • Emotional intelligence (e.g., empathy, humor).
  • Creativity and abstract thinking (though AI can mimic it).

4. Current AI (Like ChatGPT) is a "Stochastic Parrot"

  • It generates plausible responses based on training but doesn’t truly "know" what it’s saying.
  • Unlike humans, it doesn’t have beliefs, desires, or self-awareness.

5. Future Possibilities (AGI)

  • Artificial General Intelligence (AGI)—a hypothetical AI with human-like reasoning—could bridge this gap, but we’re not there yet.

Conclusion:

AI can simulate learning and understanding impressively, but it doesn’t experience them like humans do. It’s a powerful tool, not a mind.

Would you like examples of where AI mimics vs. truly understands?
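
For a concrete sense of what "statistical optimization" means here, the sketch below is a toy PyTorch next-word predictor. It is purely illustrative (the corpus, names, and model are invented for this example, not anything either commenter or DeepSeek actually ran): it shows a "hot stove → harm" association emerging from nothing but co-occurrence counts and gradient descent.

```python
# A minimal sketch of "statistical optimization": a toy next-word model,
# not anyone's real system. The corpus and all names are illustrative.
import torch
import torch.nn as nn

corpus = (
    "the hot stove caused pain . "
    "touching the hot stove hurts . "
    "the hot stove will burn you ."
).split()

vocab = sorted(set(corpus))
stoi = {w: i for i, w in enumerate(vocab)}

# Training pairs: each word is asked to predict the word that follows it.
xs = torch.tensor([stoi[w] for w in corpus[:-1]])
ys = torch.tensor([stoi[w] for w in corpus[1:]])

# One lookup table of logits per word -- the simplest possible "language model".
model = nn.Embedding(len(vocab), len(vocab))
opt = torch.optim.SGD(model.parameters(), lr=1.0)

for _ in range(300):  # "learning" here is nothing but loss minimization
    loss = nn.functional.cross_entropy(model(xs), ys)
    opt.zero_grad()
    loss.backward()
    opt.step()

# What does the trained model expect after "stove"? Words about harm,
# purely because that's what co-occurred in the training text.
probs = model(torch.tensor([stoi["stove"]])).softmax(dim=-1)
top = probs.topk(3)
print([(vocab[int(i)], round(float(p), 2)) for p, i in zip(top.values[0], top.indices[0])])
```

The model never represents heat or pain; it only adjusts its internal parameters until the training sentences become likely, which is exactly the "adjusting its internal parameters" step described above.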

[–] nednobbins@lemmy.zip 0 points 22 hours ago

That’s a very emphatic restatement of your initial claim.

I can’t help but notice that, for all the fancy formatting, that wall of text doesn’t contain a single line which actually defines the difference between “learning” and “statistical optimization”. It just repeats the claim that they are different without supporting that claim in any way.

Nothing in there precludes the alternative hypothesis: that human learning is entirely (or almost entirely) an emergent property of "statistical optimization". Without some definition of what the difference would be, we can't even theorize a test.