this post was submitted on 15 May 2024
513 points (97.4% liked)

cross-posted from: https://sopuli.xyz/post/12670977

iPhone owners say the latest iOS update is resurfacing deleted nudes

rimjob_rainer@discuss.tchncs.de 41 points 7 months ago

The former would be hilarious; it would mean that iOS explicitly classified those images as nudes.

StaySquared@lemmy.world 11 points 7 months ago

Indeed. But Apple does have the tech to analyze images/videos:

"Apple's CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups."

answersplease77@lemmy.world 6 points 7 months ago

Which means they outsourced this task to some workers in India overseas... fuck, which is just worse.

Ok, so probably not. CSAM detection, specifically modern detection of the kind that MS does, is based on image hashes. Law enforcement collects and creates the hash sets for these images and distributes them to tech companies, who can then compare them against hashes of users' existing photos, and if a match returns, ladies and gentlemen, we got 'em.
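
To make that flow concrete, here's a minimal Python sketch of hash-set matching. It is not Apple's or Microsoft's actual implementation: production systems use perceptual hashes (PhotoDNA, Apple's NeuralHash) that tolerate resizing and re-encoding, whereas this sketch uses a plain SHA-256 over file bytes, and the hash values and the `photos` directory are made up for illustration.

```python
# Minimal sketch of hash-set matching as described above. Real systems
# use perceptual hashes (PhotoDNA, NeuralHash); SHA-256 here is a
# simplified stand-in that only matches byte-identical files.
import hashlib
from pathlib import Path

# Hash set distributed to providers by clearinghouses such as NCMEC.
# These digests are hypothetical, for illustration only.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
    "b5d4045c3f466fa91fe2cc6abe79232a1a57cdf104f7a26e716e0a1e2789df78",
}

def file_hash(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        # Read in chunks so large photos don't load into memory at once.
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def scan_library(photo_dir: Path) -> list[Path]:
    """Hash each photo and collect the ones that match the known set."""
    return [p for p in photo_dir.glob("*.jpg") if file_hash(p) in KNOWN_HASHES]

if __name__ == "__main__":
    for match in scan_library(Path("photos")):
        print(f"match: {match}")  # ladies and gentlemen, we got 'em
```

The point of the hash-set design is that providers never receive the original images, only the digests, and no human has to look at users' photos unless a hash match is flagged.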

dojan@lemmy.world 2 points 7 months ago

It's using hashes, no?