this post was submitted on 04 Apr 2025
227 points (94.5% liked)

Nucleo's investigation identified accounts with thousands of followers engaging in illegal behavior that Meta's security systems failed to detect; after being contacted, the company acknowledged the problem and removed the accounts.

[–] FauxLiving@lemmy.world 8 points 3 months ago (16 children)

Child Sexual Abuse Material is abhorrent because children were literally abused to create it.

AI generated content, though disgusting, is not even remotely on the same level.

The moral panic around AI that leads people to imply these two things are equivalent is absurd.

Go after the people filming themselves literally gang raping toddlers, not the people typing forbidden words into an image generator.

Don't dilute the horror of the production of CSAM by equating it to fake pictures.

[–] Grimy@lemmy.world -2 points 3 months ago* (last edited 3 months ago) (11 children)

Although that's true, such material can easily be used to groom children, which is where I think the real danger lies.

I really wish they had excluded children from the datasets.

You can't really put a stop to it anymore, but I don't think it should be normalized and accepted just because there isn't a direct victim. We are also talking about distribution here, not something done in private at home.

[–] Cryophilia@lemmy.world 1 points 3 months ago (10 children)

such material can easily be used to groom children

This literally makes no sense.
