A Telegram user who advertises their services on Twitter will create an AI-generated pornographic image of anyone in the world for as little as $10 if users send them pictures of that person. Like many other Telegram communities and users producing nonconsensual AI-generated sexual imagery, this user creates fake nude images of celebrities, including images of minors in swimsuits, but is particularly notable for plainly and openly demonstrating one of the most severe harms of generative AI tools: how easily they can be used to create nonconsensual pornography of ordinary people.

[-] noxy@yiffit.net 2 points 7 months ago

I wonder how holodecks handle this...

[-] shasta@lemm.ee 2 points 7 months ago

They send you to therapy because "it's not healthy to live in a fantasy."

[-] antlion@lemmy.dbzer0.com 3 points 7 months ago

Don’t be like Lt Reg Barclay

[-] 9488fcea02a9@sh.itjust.works 1 point 7 months ago* (last edited 7 months ago)

Probably the same types of guardrails chatGPT has when you ask it to tell you how to cook meth or build a dirty bomb

And maybe Data was distributing jailbroken holodeck programs for pervs on the ship

this post was submitted on 29 Mar 2024
341 points (93.4% liked)

Technology