AI CP. They found AI-generated CP that had been generated on their service...
"Explicit fakes" makes it sound less bad.
They were allowing AI CP to be made.
This is the type of shit that radicalizes me against generative AI. It's done so much more harm than good.
The craziest thing to me is that there were movements advocating the creation of CP through AI to help those addicted to it, since it "wasn't real" and no victims were involved. But no comment on how the model obtained the training data to generate those images, or on the damage that will come when such things get normalized.
It just should never be normalized or exist.
Probably got all the data to train for it from the Pentagon. They're known for having tons of it, and a lot of their staff (more than 25%) are used to seeing it frequently.
It's easily searchable, though I don't like searching for that shit. Here's one post: if you literally add "Pentagon" to c____ p___ in a search, a million articles on DIFFERENT subjects (than this House bill) come up. https://thehill.com/policy/cybersecurity/451383-house-bill-aims-to-stop-use-of-pentagon-networks-for-sharing-child/
Anything like that involving children or child-like individuals is a hard fucking no from me. It's like those mfs who have art of a little anime girl and go "actually she's a 5000-year-old vampire." They know exactly what the fuck they're doing. I also hate the argument of "it's not real" because, mf, the sentiment is still there.
Just for what it's worth, you don't need CSAM in the training material for a generative AI to produce CSAM. The models know what children look like, and what naked adults look like, so they can readily extrapolate from there.
The fact that you don't need to actually supply any real CSAM to the training material is the reasoning being offered for supporting AI CSAM. It's gross, but it's also hard to argue with. We allow for all types of illegal subjects to be presented in porn; incest, rape, murder, etc. While most mainstream sites won't allow those types of material, none of them are technically outlawed - partly because of freedom of speech and artistic expression and yadda yadda, but also partly because it all comes with the understanding that it's a fake, made-for-film production and that nobody involved had their consent violated, so it's okay because none of it was actually rape, incest, or murder. And if AI CSAM can be made without actually violating the consent of any real people, then what makes it different?
I don't know how I feel about it, myself. The idea of "ethically-sourced" CSAM doesn't exactly sit right with me, but if it's possible to make it in a truly victimless manner, then I find it hard to argue outright banning something just because I don't like it.
It's a very difficult subject; both sides have merit. I can see "CSAM created without abuse could be used in the treatment/management of people with these horrible urges," but I can also see "allowing people to create CSAM could normalise it and lead to more actual abuse."
Sadly, it's incredibly difficult for academics to study this subject and determine which of those two effects is more prevalent.
There is also the angle of generated CSAM looking real adding difficulty in prosecuting real CSAM producers.
I'd say that's more of an AI industry issue than anything else. All AI art needs to be easily identifiable and sourced as such, though I doubt AI producers will want to embed hidden tags in all their AI-generated work.
Nuanced take coming, take a breath:
I agree that child sexual abuse is a horrible practice, along with all other violence and oppression, sexual or not. But the attraction de facto exists, and has for thousands of years, even through intense taboos. It seems our current strategy of shaming and ignoring it has been ineffective. The definition of insanity being repeating the same thing while expecting different results, and all that.
Short of eugenics (and from previous trials maybe not even then) we might not be able to get rid of it.
So when do we try other ways of dealing with it?
I'm not saying generative AI is the solution, but I'm pretty sure denying harder isn't it.
I've been kind of on the fence about this, but then research found that people who physically or verbally express their anger tend to get angrier or take longer to calm down, contrary to conventional wisdom. I wonder if there could be a similar pattern here, so now I'm hesitant.
Is "CP" so you don't get flagged, or is it for sensitivity?
I don't like saying the full phrase, it's a disgusting merger of words that shouldn't exist.
FYI, the currently accepted term is CSAM: child sexual abuse material. The reason "CP" is wrong is that "porn" implies, or should imply, consent to the sexual act, and children cannot consent.
You're right, it's a disgusting merger precisely because it implies something that's absolutely incorrect and wrong.
Very true, thanks for your sensitivity @dumbass
And it's wrong, too. It's not pornography; it's rape.