this post was submitted on 01 Apr 2025
197 points (99.0% liked)

Technology
[–] dumbass@leminal.space 137 points 3 days ago (2 children)

AI CP. They found AI-generated CP that had been generated on their service...

"Explicit fakes" makes it sound less bad.

They were allowing AI CP to be made.

[–] venusaur@lemmy.world 13 points 3 days ago (1 children)

Is “CP” so you don’t get flagged, or is it for sensitivity?

[–] dumbass@leminal.space 35 points 3 days ago (3 children)

I don't like saying the full phrase, it's a disgusting merger of words that shouldn't exist.

[–] fushuan@lemm.ee 47 points 3 days ago* (last edited 3 days ago)

FYI, the currently accepted term is CSAM: child sexual abuse material. The reason "CP" is wrong is that "porn" implies, or should imply, that there's consent to the sexual act, and children cannot consent.

You are right, it's a disgusting merger of words exactly because it implies something that's absolutely incorrect and wrong.

[–] kippinitreal@lemmy.world 19 points 3 days ago (1 children)

Very true, thanks for your sensitivity @dumbass

[–] Flocklesscrow@lemm.ee 9 points 3 days ago (4 children)

It's pronounced "doo mah."

[–] kippinitreal@lemmy.world 5 points 3 days ago* (last edited 3 days ago)

Wow, so it's from the Duh region of France. Here I thought it was just sparkling dumbass.

[–] carrion0409@lemm.ee 10 points 3 days ago (1 children)

This is the type of shit that radicalizes me against generative AI. It's done so much more harm than good.

[–] BaseModelHuman@lemmy.world 8 points 3 days ago (4 children)

The craziest thing to me is that there were movements advocating the creation of CP through AI to help those addicted to it, since it "wasn't real" and there were no victims involved. But no comments regarding how the model gained the ability to generate those images, or the damage that will come when such things get normalized.

It just should never be normalized or exist.

[–] Chozo@fedia.io 27 points 3 days ago (3 children)

Just for what it's worth, you don't need CSAM in the training material for a generative AI to produce CSAM. The models know what children look like, and what naked adults look like, so they can readily extrapolate from there.

The fact that you don't need to actually supply any real CSAM to the training material is the reasoning being offered for supporting AI CSAM. It's gross, but it's also hard to argue with. We allow for all types of illegal subjects to be presented in porn: incest, rape, murder, etc. While most mainstream sites won't allow those types of material, none of them are technically outlawed. Partly that's because of freedom of speech and artistic expression and yadda yadda, but also partly because it all comes with the understanding that it's a fake, made-for-film production and that nobody involved had their consent violated, so it's okay because none of it was actually rape, incest, or murder. And if AI CSAM can be made without actually violating the consent of any real people, then what makes it different?

I don't know how I feel about it, myself. The idea of "ethically-sourced" CSAM doesn't exactly sit right with me, but if it's possible to make it in a truly victimless manner, then I find it hard to argue outright banning something just because I don't like it.

[–] Womble@lemmy.world 15 points 3 days ago

It's a very difficult subject; both sides have merit. I can see "CSAM created without abuse could be used in the treatment/management of people with these horrible urges," but I can also see "allowing people to create CSAM could normalise it and lead to more actual abuse."

Sadly it's incredibly difficult for academics to study this subject and see which of those two effects is more prevalent.

[–] CptBread@lemmy.world 5 points 3 days ago (2 children)

There is also the angle that generated CSAM looks real, adding difficulty to prosecuting real CSAM producers.

[–] MagicShel@lemmy.zip 5 points 3 days ago

This, above any other reason, is why I'm most troubled by AI CSAM. I don't care what anyone gets off to if no one is harmed, but the fact that real CSAM could be created and be indistinguishable from AI-generated material is a real harm.

And I instinctively ask, who would bother producing it for real when AI is cheap and harmless? But people produce it for reasons other than money and there are places in this world where a child's life is probably less valuable than the electricity used to create images.

I fundamentally think AI should be completely uncensored. Because I think censorship limits and harms uses for it that might otherwise be good. I think if 12 year old me could've had an AI show me where the clitoris is on a girl or what the fuck a hymen looks like, or answer questions about my own body, I think I would've had a lot less confusion and uncertainty in my burgeoning sexuality. Maybe I'd have had less curiosity about what my classmates looked like under their clothes, leading to questionable decisions on my part.

I can find a million arguments for why AI shouldn't be censored. Like, did you know ChatGPT can be convinced that describing vaginal and oral sex in romantic fiction is fine, but if it's anal sex, it has a much higher refusal rate? Is that subtle anti-gay encoding in the training data? It also struggles with polyamory when it's two men and a woman, but less when it's two women and a man. What's the long-term impact when these biases are built into everyday tools? These are concerns I consider all the time.

But at the end of the day, the idea that there are children out there being abused and consumed and no one will even look for them because "it's probably just AI" isn't something I can bear no matter how firm my convictions are about uncensored AI. It's something I struggle to reconcile.

[–] Brainsploosh@lemmy.world 12 points 3 days ago* (last edited 3 days ago) (2 children)

Nuanced take coming, take a breath:

I agree that child sexual abuse is a horrible practice, along with all other violence and oppression, sexual or not. But the attraction de facto exists and has for thousands of years, even through intense taboos. It seems our current strategy of shaming and ignoring it has been ineffective. The definition of insanity being repeating the same thing expecting different results, and all that.

Short of eugenics (and from previous trials maybe not even then) we might not be able to get rid of it.

So when do we try other ways of dealing with it?

I'm not saying generative AI is the solution, but I'm pretty sure denying harder isn't it.

[–] carrion0409@lemm.ee 6 points 3 days ago* (last edited 3 days ago)

Anything like that involving children or child-like individuals is a hard fucking no from me. It's like those mfs who have art of a little anime girl and go, "actually she's a 5000-year-old vampire." They know exactly what the fuck they're doing. I also hate the argument of "it's not real"; like, mf, the sentiment is still there.

[–] rice@lemmy.org 5 points 3 days ago (1 children)

Probably got all the data to train for it from the Pentagon. They're known for having tons of it, and a lot of their staff (more than 25%) are used to seeing it frequently.

It's easily searchable, though I don't like to search for that shit. If you literally add "pentagon" to "c____ p___" in a search, a million articles on DIFFERENT subjects (other than this House bill) come up. Here's one: https://thehill.com/policy/cybersecurity/451383-house-bill-aims-to-stop-use-of-pentagon-networks-for-sharing-child/

[–] swelter_spark@reddthat.com 2 points 3 days ago

When my dad worked for the DoD, he was assigned a laptop for work that had explicit photos of children on it.

[–] Mohamed@lemmy.ca 27 points 3 days ago (4 children)

Generated AI CP should be made illegal even if its creation did not technically harm anyone. The reason is that, presumably, it looks too close to real CP, so close that it: 1) normalizes consumption of CP, 2) grows a market for CP, and 3) lets real CP producers off the hook by claiming it is AI.

While there are similar reasons to be against clearly-not-real CP (e.g. hentai), that type at least does not have problem #3. For example, there doesn't need to be an investigation into whether a picture is real or not.

[–] jacksilver@lemmy.world 10 points 3 days ago

The biggest issue with this line of thinking is: how do you prove it's CP without a victim? I suppose at a certain threshold it becomes obvious, but that can be a very blurry line (there was a famous case where a porn star had to be flown to a court case to prove a video wasn't CP, but I can't find the link right now).

So you're left with a crime that was committed with no victim and no proof, which can be really easy to abuse.

[–] swelter_spark@reddthat.com 5 points 3 days ago (1 children)

AI CP seems like a promising way to destroy demand for the real thing. How many people would risk a prison sentence making or viewing the real thing when they could push a button and have a convincing likeness for free with no children harmed? Flood the market with cheap fakes and makers of the real thing may not find it profitable enough to take the risk.

[–] AnonomousWolf@lemm.ee 1 points 2 days ago

I think it would boost the market for the real thing more.

It's possible that there are people who would get into AI-generated CP if it were allowed to be advertised on NSFW websites.

And that would lead some to seek out the real thing. I think it's best to condemn it entirely.

[–] Ilovethebomb@lemm.ee 4 points 2 days ago (1 children)

What the fuck is AI being trained on to produce the stuff?

[–] JohnEdwa@sopuli.xyz 14 points 2 days ago (1 children)

Pictures of clothed children and naked adults.

Nobody trained them on what things made out of spaghetti look like, but they can generate them because smushing multiple things together is precisely what they do.

[–] Ilovethebomb@lemm.ee 6 points 2 days ago

Well, that's somewhat reassuring.

Still reprehensible that it's being used that way, of course.
