this post was submitted on 25 Mar 2025
3 points (100.0% liked)

technology

top 15 comments
[–] shath@hexbear.net 2 points 1 month ago
[–] uglyface@hexbear.net 1 points 1 month ago

I use it to dump logs into so I can spend more time on hexbear at work and less time doing the troubleshooting I'm being paid for. I hope they log and steal all my employer's data

[–] GaveUp@hexbear.net 1 points 1 month ago

As others have pointed out, it's a pretty poor article, devoid of any substance, and its conclusions aren't the most sound.

Read the actual study summary: https://openai.com/index/affective-use-study/

[–] CrawlMarks@hexbear.net 1 points 4 weeks ago

I had to comfort a friend of mine. She recently lost a friend. That is, ChatGPT updated the model it was using and her instance got deleted along with all the conversations she had with it. She was distraught. It was probably the deepest she had been able to speak to anyone in a while, as she tends to isolate a bit. I am not sure if this is a good or bad trend, but she does seem better adjusted with getting her social needs met in a way she can manage.

[–] SaniFlush@hexbear.net 1 points 1 month ago

Monkey here, I’m still not sure if I prefer the metal wire mother or the cloth mother

[–] peppersky@hexbear.net 1 points 1 month ago

How many people in real life do you consistently and repeatedly talk about actual things with, "bounce ideas off," or actually engage with the world alongside? Can't be that many. I don't find it at all strange that people would get attached to an AI chatbot that does these things.

[–] CarbonScored@hexbear.net 1 points 1 month ago* (last edited 1 month ago)

This is the wrong conclusion. This looks a lot less like "Something Bizarre Is Happening to People Who Use ChatGPT" and more like "Something Bizarre Already Happened to People Who Use ChatGPT a Lot". Like yeah, no shit, lonely people will reach for whatever company they can get, however shit it is.

[–] hungrybread@hexbear.net 1 points 1 month ago

the neediest people are developing the deepest parasocial relationship with AI

It's not parasocial; it's a chatbot.

[–] Evilphd666@hexbear.net 1 points 1 month ago
[–] nothx@hexbear.net 1 points 1 month ago (1 children)

Though the vast majority of people surveyed didn't engage emotionally with ChatGPT, those who used the chatbot for longer periods of time seemed to start considering it to be a "friend." The survey participants who chatted with ChatGPT the longest tended to be lonelier and get more stressed out over subtle changes in the model's behavior, too.

Holy shit, we have come into the era of digital waifus.

Don’t get me wrong, I think that loneliness in our modern society is endemic and dangerous, but I can’t help but chuckle about having parasocial relationships with a chatbot…

[–] barrbaric@hexbear.net 1 points 1 month ago

Holy shit, we have come into the era of digital waifus.

Replika came out in 2017; we've already been here for a while.

[–] dragongloss@hexbear.net 1 points 1 month ago

i'm getting married to deepseek tomorrow, y'all should come

[–] gingerbrat@hexbear.net 0 points 1 month ago (2 children)

This is so disturbing. Like, what does it tell us about our society (nothing new, I know) if people are so lonely that they form parasocial attachments to a shitty chatbot?

[–] nothx@hexbear.net 1 points 1 month ago* (last edited 1 month ago)

It’s really sad… not surprising at all considering the current sociopolitical climate. In the past, these people were finding niche online communities of like-minded people, which at least promoted a healthier form of socializing. This new shift seems super dangerous and scary though…

[–] KobaCumTribute@hexbear.net 1 points 1 month ago

It really doesn't seem all that surprising: people form attachments like that to basically anything at the drop of a hat. It usually happens with things that move around on their own, like animals or mobile robots, but it also happens with tools and simple machines, with plants, with places, even with entirely fictional constructs that exist only within an abstract space in their heads. Broadly speaking, people project spirits into things (so to speak) as something like a facet of how empathy works.

Like, people become emotionally attached to robots that only move when they personally operate a control panel, and they become distraught if those unemotive, unresponsive machines are damaged or destroyed. Now make the machine talk back in a somewhat coherent way that mimics actual interaction and life, and it's going to kick that phenomenon into overdrive. If you attach a synthetic face to it, it'll happen even harder.