this post was submitted on 27 Jan 2024
117 points (84.6% liked)


Microsoft CEO calls for tech industry to 'act' after AI photos of Taylor Swift circulate X: Satya Nadella spoke to Lester Holt about artificial intelligence and its ability to create deepfake images of others. After pictures of Taylor Swift circulated, he called for action.

[–] andrewrgross@slrpnk.net 20 points 9 months ago* (last edited 9 months ago) (1 children)

I think everyone's still trying to get some handle on this, but it seems like it's mostly an issue with scale.

People have been making and sharing photo manips of celebrities naked for about 20 years. This is just noteworthy because there's so much of it, so quickly produced, so publicly present.

I think we should all have the right to create images -- including sexual ones -- freely. And I think the subjects of those images (especially Emma Watson and Natalie Portman; I don't think anyone has been photoshopped into porn as much as those two) deserve to live their lives without having to see it or think about it.

I don't think it's necessarily a problem with an easy legal solution, but a good start might be just recognizing the social contract: if you want to fantasize about anyone -- whether it's a celebrity or someone you take classes with -- understand how gross it is for them to have to know about it, and keep it discreet.

[–] fidodo@lemmy.world 4 points 9 months ago

Not just scale but also accessibility. Now anyone can make these without having a specialized skill.

I don't think any laws targeting deepfakes should treat them as prohibited material; making something like mere possession illegal would infringe on freedom of speech and privacy rights.

Instead, it should be treated as harassment. At a bare minimum, it should be illegal to target someone you know and distribute fake pictures of them to other people you both know. For example, creating deepfake porn of a classmate and distributing it to other classmates is a situation that should not be allowed to happen.

When it comes to a person of note, where it's happening more in the background, I feel that's more of a grey area: it's not necessarily a direct attack, since the creator knows neither the person nor their social circle. But I think there's a big difference between it being distributed on a social media platform where the celebrity is present or discussed, versus a porn site.