this post was submitted on 29 Aug 2024
34 points (100.0% liked)
Technology
I'm gonna take the side that TikTok is potentially liable on the algorithm argument, but these parents also failed their children. Teaching your kids not to replicate unsafe internet content should be as fundamental as teaching them to look both ways before crossing the road.
"If your friend told you to jump off a bridge, would you?"
-Any decent parent
"Bridge jumping challenge"
As a society, we're responsible for all our children. The point of child protection laws, and of protecting the population in general, is to support and protect them, because oftentimes parents are incapable of doing so, or the social dynamics at play are ones most parents can't really understand, follow, or teach about.
Yes, parents should teach and protect their children. But we should also create an environment where that is possible, and where the children of less fortunate and less able parents are not victims of their environment.
I don't think requiring big social platforms to moderate, at least to the degree that children are not regularly exposed to life-threatening trends, is a bad idea.
That stuff can still exist elsewhere if you want it. But social platforms have a social dynamic, more so than an informative one.