[–] huginn@feddit.it 1 points 11 months ago (3 children)

Unless you want to call the predictive text on your keyboard a mind, you really can't call an LLM a mind. It is nothing more than a linear progression from that, and it has been mathematically proven not to show any form of emergent behavior.

[–] kogasa@programming.dev 3 points 11 months ago (1 children)

No such thing has been "mathematically proven." The emergent behavior of ML models is their notable characteristic. The whole point is that their ability to do anything is emergent behavior.

[–] huginn@feddit.it 1 points 11 months ago* (last edited 11 months ago) (1 children)

Here's a white paper explicitly proving:

  1. No emergent properties (illusory due to bad measures)
  2. Predictable linear progress with model size

https://arxiv.org/abs/2304.15004
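
In miniature, the paper's point is that a harsh, all-or-nothing metric can make smooth improvement look like a sudden jump. A toy sketch of that effect (the scaling curve and numbers are invented purely for illustration; they are not from the paper):

```python
# Toy illustration of the "emergence is a metric artifact" argument from
# arXiv:2304.15004; the scaling curve below is made up for illustration only.

def per_token_accuracy(n_params: float) -> float:
    """Hypothetical per-token accuracy that improves smoothly with model size."""
    return 1.0 - 0.2 * (1e9 / n_params) ** 0.25

def exact_match_score(n_params: float, answer_len: int = 50) -> float:
    """Chance the whole answer is exactly right: per-token accuracy ** length."""
    return per_token_accuracy(n_params) ** answer_len

for n in (1e9, 1e10, 1e11, 1e12, 1e13):
    print(f"{n:.0e} params: per-token {per_token_accuracy(n):.3f}, "
          f"exact match {exact_match_score(n):.3f}")

# Per-token accuracy creeps up smoothly (about 0.80 -> 0.98), while the
# exact-match score sits near zero and then climbs steeply: an apparent
# "emergent ability" produced entirely by the choice of metric.
```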

The field changes fast; I understand it is hard to keep up.

[–] kogasa@programming.dev 0 points 11 months ago (1 children)

Sure, if you define "emergent abilities" just so. It's obvious from context that this is not what I described.

[–] huginn@feddit.it 0 points 11 months ago (1 children)

Their paper uses industry-standard definitions.

[–] kogasa@programming.dev 2 points 11 months ago

Their paper uses terminology that makes sense in context. It's not a definition of "emergent behavior."

[–] MxM111@kbin.social 2 points 11 months ago* (last edited 11 months ago) (1 children)

I do not think that it is a “linear” progression. An ANN is by definition nonlinear. Nor do I think anything has been “mathematically proven”. If I am wrong, please provide a link.
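
To be concrete: without a nonlinear activation between layers, any stack of layers collapses into a single matrix, so the nonlinearity is the whole point. A minimal sketch with made-up weights, just to illustrate:

```python
import numpy as np

# Two tiny layers with fixed, made-up weights.
W1 = np.array([[1.0, -1.0],
               [-1.0, 1.0]])
W2 = np.array([[1.0, 1.0]])
x = np.array([2.0, 0.5])

def relu(v):
    return np.maximum(v, 0.0)

# Without an activation, stacking layers collapses to a single linear map:
print(np.allclose(W2 @ (W1 @ x), (W2 @ W1) @ x))   # True

# With a ReLU between the layers, the composition is no longer linear in x:
print(W2 @ relu(W1 @ x))    # [1.5]
print((W2 @ W1) @ x)        # [0.]
```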

[–] huginn@feddit.it 2 points 11 months ago (1 children)

Sure thing: here's a white paper explicitly proving:

  1. No emergent properties (illusory due to bad measures)
  2. Predictable linear progress with model size

https://arxiv.org/abs/2304.15004

[–] MxM111@kbin.social 2 points 11 months ago* (last edited 11 months ago)

Thank you. This paper, though, does not state that there are no emergent abilities. It only states that one can introduce a metric with respect to which the emergent ability behaves smoothly rather than threshold-like. While interesting, it only suggests that things like intelligence are smooth functions, but so what? Other metrics show exponential or threshold-like dependence, and whether a metric is the right one depends only on how one will use it. And there is no law that emergent properties have to be threshold-like. Quite the opposite: in nearly all examples in physics that I know of, the emergence appears gradually.

[–] General_Effort@lemmy.world 0 points 11 months ago (1 children)

It is obvious that you do not know what either "mathematical proof" or "emergence" means. Unfortunately, you are misrepresenting the facts.

I don't mean to criticize your religious (or philosophical) convictions. There is a reason people mostly try to keep faith and science separate.

[–] huginn@feddit.it 1 points 11 months ago

Here's a white paper explicitly proving:

  1. No emergent properties (illusory due to bad measures)
  2. Predictable linear progress with model size

https://arxiv.org/abs/2304.15004

The field changes fast; I understand it is hard to keep up.