this post was submitted on 24 Mar 2024
101 points (93.2% liked)

Asklemmy


A loosely moderated place to ask open-ended questions



[โ€“] tetris11@lemmy.ml 5 points 7 months ago (1 children)

When I read the LaMDA transcripts from that Google employee (priest?) who tried to blow the whistle on Google's AI being sentient... I mean damn, I read those transcripts, and it sounded real as hell.

Stochastic parrot or not, I think a significant part of my own consciousness goes towards predicting the next word in a given context.
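(A stochastic parrot in miniature: this toy bigram model is my own illustrative sketch, nothing like how LaMDA actually works. It predicts the next word purely from counts of what followed each word in its training text; real LLMs use neural networks over long contexts, but the predict-the-next-token objective is the same.)

```python
from collections import Counter, defaultdict

# Count which word follows which in a tiny "training corpus".
corpus = "the cat sat on the mat and the cat slept on the mat".split()

follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def predict_next(word):
    """Return the most common word seen after `word`, or None if unseen."""
    counts = follows.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("sat"))  # -> "on"
```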

[โ€“] HelixDab2@lemm.ee 11 points 7 months ago (1 children)

IIRC, consciousness is really, really complicated. We might not be capable of knowing if we are LLMs in meatsuits. An LLM might be a highly vocal infant, and we simply wouldn't have a great way of really making that judgement. Shit, we still can't define consciousness in humans--or other animals--in any meaningful way.

[โ€“] tetris11@lemmy.ml 2 points 7 months ago (1 children)

Agreed. I think it's a spectrum, and even a chair (an object that forms a feedback loop of forces with its surroundings that depends on its previous state) is conscious to a degree.

I don't think there is a line.

[โ€“] femtech@midwest.social 2 points 7 months ago

There will be a legal line at some point.