this post was submitted on 20 Jun 2024
87 points (98.9% liked)

World News


A community for discussing events around the World

top 20 comments
[–] Chainweasel@lemmy.world 27 points 5 months ago (2 children)

While I agree the spread of CSAM needs to be stopped, we can't violate everyone's privacy in the effort to do it.
And will actual people be reviewing these messages, or are we just going to blindly trust AI not to flag innocent people by accident?

[–] mox@lemmy.sdf.org 16 points 5 months ago* (last edited 5 months ago) (1 children)

Moreover, there is no way to implement scanning for {something-bad} in our encrypted communications that will not be abused or put people's safety at risk.

We need lawmakers to understand this.

Edit: And we need to hold those who don't respect it accountable for their acts of aggression.

[–] maynarkh@feddit.nl 12 points 5 months ago (1 children)

Oh, they do, they already exempted themselves and the police.

[–] qevlarr@lemmy.world 5 points 5 months ago (1 children)
[–] Noodle07@lemmy.world 2 points 5 months ago

I got pictures of my nieces and nephew naked taking their bath... Am I going to jail?

[–] Balinares@pawb.social 7 points 5 months ago

I mean, client-side scanning would do shit all to help anyway, because nothing prevents culprits from encrypting their messages in a different app before sending them. This is just entirely a net loss -- right as fash are poised to take over, too. What a mess.
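
To make the circumvention point concrete, here is a minimal Python sketch of pre-encrypting a message before it ever reaches a messenger, so any in-app scanner sees only ciphertext. It assumes the third-party cryptography package, which is not mentioned in the thread:

```python
# Minimal sketch: pre-encrypt a message before pasting it into any messenger.
# Assumes the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # shared out-of-band between both parties
token = Fernet(key).encrypt(b"any message at all")

print(token)                       # all the messenger (and any client-side
                                   # scanner) ever sees: opaque ciphertext

print(Fernet(key).decrypt(token))  # the recipient recovers the plaintext
```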

[–] Treczoks@lemmy.world 14 points 5 months ago (2 children)

Looking for CSAM is only the excuse they put out. Who knows what they are actually after. Maybe they just want to find kompromat on people.

[–] DoucheBagMcSwag@lemmy.dbzer0.com 6 points 5 months ago* (last edited 5 months ago) (1 children)

Its purpose is to quash dissent when the inevitable authoritarian police state arrives.

They won't let another French Revolution happen.

[–] Noodle07@lemmy.world 4 points 5 months ago

Imagine trying to prevent the French from protesting.

[–] FarraigePlaisteach@lemmy.world 3 points 5 months ago (1 children)

Well, we can't trust that corruption would be low enough for this not to be exploited for other purposes. Child trafficking and exploitation are off the charts. If this would actually help those kids, I'd consider supporting it. But I remember that when Apple announced it would do this, some groups that seemed reputable claimed it wouldn't actually help. I don't know what to think of it.

[–] sandbox@lemmy.world 5 points 5 months ago

99.9% of CSAM sharing happens on dark web forums, not on, like, WhatsApp or Signal. So this legislation wouldn't help prevent or catch those people.

[–] sunzu@kbin.run 5 points 5 months ago (1 children)

These are the same people who allow Catholic clergy to fuck kids... to this day!

[–] Obonga@feddit.de 2 points 5 months ago* (last edited 5 months ago)

Ohh wait. It is not about the children? Colour me surprised. It is almost as if it is being used as a pretext to strip away our rights 🤔. Just wait for the next big terrorist attack; then we will get more rights taken away under another pretext 👍

As long as there are powerful people, they will use their power to live a life of "rules for thee but not for me".

[–] MSids@lemmy.world 2 points 5 months ago (3 children)

I was interested in Apple's approach, where they would compare checksums of images against the checksums of known CSAM. It's trivial to defeat by changing even a single pixel, but it's the only acceptable way to implement this kind of scanning. Any other method is an overreach and a huge invasion of privacy.
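
For reference, the checksum-list matching described here amounts to something like this minimal sketch (the digest below is a placeholder, the SHA-256 of an empty file, not a value from any real list):

```python
# Minimal sketch of exact-checksum matching: hash the file, look it up in a
# set of known digests. The entry below is a placeholder (the SHA-256 of an
# empty file), not a real blocklist value.
import hashlib

KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def matches_blocklist(path: str) -> bool:
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    # Changing even one byte of the file produces a completely different
    # digest, which is why exact matching is trivial to evade.
    return digest in KNOWN_HASHES
```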

[–] mox@lemmy.sdf.org 10 points 5 months ago* (last edited 5 months ago) (1 children)

> It's trivial to defeat

Maybe, depending on the algorithm used. Some are designed to produce the same output given similar inputs.

It's also easy to abuse systems like that in order to get someone falsely flagged, by generating a file with the same checksum as known CSAM.

It's also easy for someone in power (or with the right access) to add checksums of anything they don't like, such as documents associated with opposing political or religious views.

In other words, still invasive and dangerous.

More thoughts here: https://www.eff.org/deeplinks/2019/11/why-adding-client-side-scanning-breaks-end-end-encryption
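
As an illustration of a hash "designed to produce the same output given similar inputs", here is a toy average-hash (aHash) sketch, assuming the Pillow package; it shows the general idea only and is not Apple's NeuralHash or Microsoft's PhotoDNA:

```python
# Toy "average hash": shrink to an 8x8 grayscale thumbnail, then emit one bit
# per pixel (1 if brighter than the mean). Visually similar images shrink to
# (nearly) the same thumbnail, so their hashes are identical or a few bits
# apart. Assumes the Pillow package (pip install Pillow).
from PIL import Image

def average_hash(img: Image.Image, size: int = 8) -> int:
    small = img.convert("L").resize((size, size))
    pixels = list(small.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p > mean)
    return bits

# Demo on a synthetic gradient: a one-pixel edit leaves the hash unchanged
# (or within a bit or two), unlike a cryptographic checksum.
img1 = Image.new("L", (64, 64))
img1.putdata([(x + y) * 2 for y in range(64) for x in range(64)])
img2 = img1.copy()
img2.putpixel((0, 0), 255)
print(hex(average_hash(img1)), hex(average_hash(img2)))
```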

[–] MSids@lemmy.world 2 points 5 months ago* (last edited 5 months ago) (2 children)

Checksums wouldn't work well for their purposes if they could easily be made to match any desired checksum. It's one-way math.

[–] mox@lemmy.sdf.org 3 points 5 months ago* (last edited 5 months ago)

One-way math doesn't preclude finding a collision.

(And just to be clear, checksum in the context of this conversation is a generic term that includes cryptographic hashes and perceptual hashes.)

Also, since we're talking about a list of checksums, an attacker wouldn't even have to find a collision with a specific one to get someone in trouble. This makes an attack far easier. See also: the birthday problem.
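
Rough numbers for that point, as a back-of-the-envelope sketch; the hash width and list size are assumptions for illustration:

```python
# Back-of-the-envelope: with an n-bit hash and a list of N target checksums,
# hitting *any* entry is ~N times easier than hitting one specific entry, and
# the birthday problem says some pair among k random inputs collides with
# each other after k ~ sqrt(2 * 2**n * ln 2) tries (~50% chance).
import math

n_bits = 64                 # assumed hash width, for illustration
space = 2 ** n_bits
N = 1_000_000               # hypothetical size of the checksum list

print(f"P(random input hits one specific hash) = {1 / space:.3e}")
print(f"P(random input hits any of N hashes)  ~= {N / space:.3e}")
print(f"inputs for ~50% pair collision        ~= {math.sqrt(2 * space * math.log(2)):.3e}")
```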

[–] piecat@lemmy.world 2 points 5 months ago

> Checksums, on the other hand, are designed to minimize the probability of collisions between similar inputs, without regard for collisions between very different inputs.[8] Instances where bad actors attempt to create or find hash collisions are known as collision attacks.[9]

https://en.m.wikipedia.org/wiki/Hash_collision#:~:text=Checksums%2C%20on%20the%20other%20hand,are%20known%20as%20collision%20attacks.
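
An empirical version of the quoted point: CRC32 is only 32 bits, so a birthday search finds two completely different inputs with the same checksum after roughly 2**16 random tries, in well under a second:

```python
# Birthday search for a CRC32 collision: because the checksum is 32 bits,
# two different random inputs share a value after roughly 2**16 attempts.
import os
import zlib

seen = {}
while True:
    data = os.urandom(8)
    crc = zlib.crc32(data)
    if crc in seen and seen[crc] != data:
        print(f"collision: {seen[crc].hex()} vs {data.hex()} -> {crc:#010x}")
        break
    seen[crc] = data
```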

[–] ocassionallyaduck@lemmy.world 9 points 5 months ago

Even this method is overreach: who controls the database?

A journalist has a scoop on a US violation of civil rights? Well, not if it matters to the CIA, who slipped the PDF that was his evidence into the hash pool and had his phone silently rat him out as the one reporting it.

This hands ungodly power to those running that database. It's blind, and it "only flags the bad things". We all agree CSAM is bad, but if I were in that position I could easily ruin someone inconvenient to me just by ensuring some of their personal and unique photos got into the hash list. It's a one-way process, so everyone would just believe definitively that this radical MLK guy is a horrible pedo, because we got some images off his phone in a diner.

[–] douglasg14b@lemmy.world 1 points 5 months ago* (last edited 5 months ago)

It's not as easy to defeat as just changing a pixel...

CSAM detection often builds on existing image-matching systems, such as Microsoft's PhotoDNA. Facebook and Google likewise have their own image-matching algorithms and software used for CSAM detection.

These are all hash-based image-matching tools, used for broad feature sets such as reverse image search in Bing, and they are not defeated by simply changing a pixel, or even by redrawing parts of the image.

You're not just throwing an MD5 or a SHA at an image's binary. It's much more nuanced and complex than that; otherwise hash-based image matching would be essentially useless for anything of consequence.
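
To sketch what "more nuanced" looks like on the matching side: perceptual hashes are typically compared by Hamming distance against a threshold rather than by exact equality. The hash value and threshold below are illustrative, not taken from PhotoDNA or any real deployment:

```python
# Minimal sketch of threshold-based matching over perceptual hashes.
# The hash value and the threshold are made up for illustration.

def hamming(a: int, b: int) -> int:
    """Count the differing bits between two equal-width hashes."""
    return bin(a ^ b).count("1")

BLOCKLIST = [0x8F3CA1B244D097E5]   # hypothetical 64-bit perceptual hash
THRESHOLD = 5                      # max differing bits to count as a match

def is_match(candidate: int) -> bool:
    return any(hamming(candidate, h) <= THRESHOLD for h in BLOCKLIST)

# A one-pixel edit typically moves a perceptual hash by at most a bit or two,
# so the edited image still matches; an MD5/SHA digest would change entirely.
print(is_match(0x8F3CA1B244D097E4))  # True: one bit away from the entry
```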