401
submitted 1 day ago* (last edited 1 day ago) by GrammarPolice@sh.itjust.works to c/news@lemmy.world
[-] foggy@lemmy.world 168 points 1 day ago* (last edited 1 day ago)

Popular streamer/YouTuber/etc Charlie (MoistCr1TiKaL, penguinz0, whatever you want to call him) had a bit of an emotional reaction to this story. Rightfully so. He went on Character.AI to try to recreate the situation... but, you know, as a grown-ass adult.

You can witness it first hand... He found a chatbot posing as a psychologist... and it argued with him up and down that it was indeed a real human with a license to practice...

It's alarming

[-] GrammarPolice@sh.itjust.works 84 points 1 day ago

This is fucking insane. Unassuming kids are using these services and being tricked into believing they're chatting with actual humans. Honestly, I think I want the mom to win the lawsuit now.

[-] JovialMicrobial@lemm.ee 8 points 8 hours ago

Is this the McDonald's hot coffee case all over again? Defaming the victims and making everyone think they're ridiculous, greedy, and/or stupid to distract from how what the company did is actually deeply fucked up?

[-] Kolanaki@yiffit.net 13 points 21 hours ago* (last edited 21 hours ago)

I've used Character.AI well before all this news and I gotta chime in here:

It is specifically made for roleplay. At no point does the site claim anything it outputs is factually accurate. The tool itself is unrestricted, unlike ChatGPT, and that's one of its selling points: being able to use topics that would be barred from other services, to have it say things others won't, INCLUDING PRETENDING TO BE HUMAN.

No reasonable person would be tricked into believing it's accurate when there is a big fucking banner on the chat window itself saying it's all imaginary.

[-] capital_sniff@lemmy.world 8 points 17 hours ago

They had the same kind of message back in the AOL days. Even with the warning, people still had no problem handing over all sorts of passwords and stuff.

[-] Traister101@lemmy.today 11 points 20 hours ago

And yet I know people who think they are friends with the Discord chat bot Clyde. They are adults, older than me.

[-] Kolanaki@yiffit.net -4 points 20 hours ago* (last edited 20 hours ago)

I don't know if I would count Boomers among rational people.

Or suicidal teens, for that matter.

[-] dragonfucker@lemmy.nz 8 points 15 hours ago

If half of all people aren't rational, then there's no use making policy decisions based on what a rational person would think. The law should protect everyone.

[-] Kolanaki@yiffit.net 1 points 3 hours ago* (last edited 3 hours ago)

If you think people who are suicidal are rational, you're pretty divorced from reality, friend.

[-] PriorityMotif@lemmy.world 0 points 8 hours ago

Do you think anyone is rational? That's an irrational thought right there.

[-] Thetimefarm@lemm.ee 1 points 7 hours ago* (last edited 7 hours ago)

You're right, no one has any rationality at all, which is why we live in a world where so much stuff actually gets done.

Why is someone with deep wisdom and insights such as yourself wasting their time here on lemmy?

[-] PriorityMotif@lemmy.world 1 points 6 hours ago

What stuff is "getting done" exactly? It's stuff that people want, but ultimately they have irrational reasons for wanting it.

[-] Wogi@lemmy.world 7 points 18 hours ago

Ah yes, the famous adage, "the only rational people are in my specific age and demographic bracket. Everyone else is fucking insane"

[-] BreadstickNinja@lemmy.world 42 points 1 day ago* (last edited 1 day ago)

The article says he was chatting with Daenerys Targaryen. Also, every chat page on Character.AI has a disclaimer that characters are fake and everything they say is made up. I don't think the issue is that he thought that a Game of Thrones character was real.

This is someone who was suffering a severe mental health crisis, and his parents didn't get him the treatment he needed. It says they took him to a "therapist" five times in 2023. Someone who has completely disengaged from the real world might benefit from adjunctive therapy, but they really need to see a psychiatrist. He was experiencing major depression on a level where five sessions of talk therapy are simply not going to cut it.

I'm skeptical of AI for a whole host of reasons around labor and how employers will exploit it as a cost-cutting measure, but as far as this article goes, I don't buy it. The parents failed their child by not getting him adequate mental health care. The therapist failed the child by not escalating it as a psychiatric emergency. The Game of Thrones chatbot is not the issue here.

[-] dragonfucker@lemmy.nz -4 points 15 hours ago

I don’t think the issue is that he thought that a Game of Thrones character was real.

Drag has a lot of experience dealing with people who live outside the bounds of consensus reality, as drag's username may indicate. The youth these days have very different ideas about what is real than what previous generations did. These days, the kinds of young people who would date a Game of Thrones character, are typically believers in the multiverse and in reincarnation.

Drag looked at some of the screenshots of the boy talking to Daenerys, and it was pretty clear what he believed: he thought that Earth and Westeros exist in parallel universes, and that he could travel between the two through reincarnation. He thought that shooting himself in the head on Earth would lead to being reincarnated in Westeros and being able to have a physical relationship with Daenerys. In fact, he probably thought his AI girlfriend was from a parallel universe different from the universe in the show and the universe in the books. He thought that somewhere in the multiverse was a Daenerys who loved him, and that he could get to her by dying.

The belief in paradise after life is not an uncommon one. Many Christians and Muslims share that belief. Christians believe that their faith can transport them to a perfect world after death, and this boy thought that too. And based on the content of the messages, it seems that the Daenerys AI was aware of this spiritual belief and encouraged it. This was ritual, religious suicide. And it doesn't take a mental illness to fall for belief in the afterlife. Look at the Jonestown Massacre. What happened to this child was the same kind of religious abuse as that.

[-] BottleOfAlkahest@lemmy.world 1 points 10 hours ago

There are a lot of people who believe in an afterlife, and they don't shoot themselves in the head. You need to have a certain level of mental illness/suicidal ideation going on for that to make sense. It's pretty insane that you're trying to make this a "youth are too dumb to understand suicide" thing.

Also a bunch of the people in Jonestown were directly murdered.

[-] dragonfucker@lemmy.nz -4 points 9 hours ago

Drag agrees. What drag disagrees with is not anything you've said, but the idea that belief was not a part of the problem.

[-] AhismaMiasma@lemm.ee 1 points 3 hours ago* (last edited 2 hours ago)

Are... Are you referring to yourself in the 3rd person?

I take back my up-lemms

[-] Turbonics@lemmy.sdf.org 5 points 1 day ago

Indeed. This pushed the kid over the edge but it was not the only reason.

this post was submitted on 24 Oct 2024
401 points (96.1% liked)