this post was submitted on 16 Oct 2023
[–] SirGolan@lemmy.sdf.org 1 points 1 year ago* (last edited 1 year ago) (1 children)

Care to provide some proof of that? They did update its system prompt to include a few things, like the fact that it is now GPT-4 (it used to always say GPT-3). Other than that, I don't think it knows anything new. But in general, I was talking more about developments in AI since it was trained, which it certainly does not know about.

Edit: hmm, I just reviewed our discussion and noticed you only provided one link, which was to the psychological definition of intelligence. Otherwise you have provided no sources to back up your claims, while my responses are full of them. Please start backing up your assertions, or provide some evidence that you are an expert in the field.

[–] Veraticus@lib.lgbt 1 points 1 year ago (1 children)
[–] SirGolan@lemmy.sdf.org 1 points 1 year ago (1 children)

I'm aware of that date.

The OpenAI GPT-4 video literally states that GPT-4 finished training in August 2022.

Either way, to clarify and reiterate: you're refuting a different point than the one I made. I said:

Its understanding of AI is from before it was trained, so it is wildly out of date at this point given how much has happened in the space since.

I'm not talking about whether it knows about its own training (I doubt that it does). I'm talking about whether it knows what's happened in the broader AI landscape since then.

[–] Veraticus@lib.lgbt 1 points 1 year ago (1 children)

I mean, your argument is still basically that it's thinking inside there; everything I've said is germane to that point, including what GPT4 itself has said.

[–] SirGolan@lemmy.sdf.org 1 points 1 year ago

I mean, your argument is still basically that it’s thinking inside there; everything I’ve said is germane to that point, including what GPT4 itself has said.

My argument?

That doesn’t mean they’re having thoughts in there I mean. Not in the way we do, and not with any agency, but I hadn’t argued either way on thoughts because I don’t know the answer to that.

Are you assuming I’m saying that LLMs are sentient, conscious, have thoughts or similar? I’m not. Jury’s out on the thought thing, but I certainly don’t believe the other two things.

I'm not saying it's thinking or has thoughts. I'm saying I don't know the answer to that, but if it is, it definitely isn't anything like human thought.