stochastic parrots. all of them. just upgraded “soundex” models.
this should be no surprise, of course!
This has been known for years, this is the default assumption of how these models work.
You would have to prove that some kind of actual reasoning capacity has arisen as... some kind of emergent complexity phenomenon... not the other way around.
Corpos have just marketed/gaslit us/themselves so hard that they apparently forgot this.
Thank you Captain Obvious! Only those who think LLMs are like "little people in the computer" didn't know this already.
Employers who are foaming at the mouth at the thought of replacing their workers with cheap AI:
🫢
This sort of thing has been published a lot for a while now, but why is it assumed that this isn't what human reasoning consists of? Isn't all our reasoning ultimately a form of pattern memorization? I sure feel like it is. So to me all these studies that prove they're "just" memorizing patterns don't prove anything other than that, unless coupled with research on the human brain showing we do something different.
You assume humans do the opposite? We literally institutionalize humans who don't follow set patterns.
We also reward people who can memorize and regurgitate even if they don't understand what they are doing.