361
submitted 9 months ago* (last edited 9 months ago) by 7heo@lemmy.ml to c/privacy@lemmy.ml
[-] LWD@lemm.ee 92 points 9 months ago

Based on that one Senate hearing, it looks like big companies like Facebook, Discord and Twitter are aiming for the maximum percent of false positives and false negatives when it comes to CSAM.

The only thing I know about that screenshot is that it used to say "show results anyway", which is probably worse in most cases

[-] taanegl@lemmy.world 38 points 9 months ago

"LOOK!! WE'RE ACTUALLY DOING SOMETHING!!! ALL OUR USERS ARE NOT PDF FILES!"

[-] Uranium3006@kbin.social 14 points 9 months ago

with any luck this will destroy them and funnel disgruntled users our way, where the servers are too numerous to ever fully take down and many aren't even US based anyways

[-] LWD@lemm.ee 17 points 9 months ago

Unfortunately, I don't think so. Most of the politicians were virtue signaling, asking impossible questions and demanding timetables they were never going to get. One woman actually had some half-decent data prepared, but I don't think anybody else was really taking it seriously.

Now if there was some legislation passed, specifically stuff that wasn't KOSA, that would be something else. KOSA seems prepped to simply destroy free speech on the internet, and it would mostly harm smaller social media networks that don't have lawyers and around-the-clock moderators to police every single comment and post.

[-] bionicjoey@lemmy.ca 67 points 9 months ago
[-] pipariturbiini@sopuli.xyz 51 points 9 months ago

16% is pretty good. the ones at three to one percent are the weirdos.

[-] bionicjoey@lemmy.ca 23 points 9 months ago

I really hate and avoid when my phone switches into battery saver at 15%, so in my mind 16% is like 1%

[-] Omega_Haxors@lemmy.ml 13 points 9 months ago

My skin crawls if it goes below 30%.

[-] where_am_i@sh.itjust.works 9 points 9 months ago

To prolong your battery's lifespan you shouldn't let it drain below 20%.

[-] VindictiveJudge@lemmy.world 7 points 9 months ago

Don't phone battery indicators lie to you now so that 0% displayed is actually about 20% specifically because of this?

[-] SheeEttin@programming.dev 7 points 9 months ago

You ever seen a phone at 0%?

[-] Hildegarde@lemmy.world 29 points 9 months ago

Here's a hot tip. If you're on android, open the developer settings and turn on "demo mode" before taking screenshots. It makes the battery and signal display as 100% so you don't get judged by internet commenters who don't go outside.

[-] stratosfear@lemmy.sdf.org 8 points 9 months ago

Or you click that little edit button and crop the top of the image completely off.

[-] AtariDump@lemmy.world 11 points 9 months ago
[-] bionicjoey@lemmy.ca 17 points 9 months ago

I literally used the embed link from XKCD's website

[-] aldalire@lemmy.dbzer0.com 6 points 9 months ago

Ignorance is bliss

[-] Squizzy@lemmy.world 53 points 9 months ago

I reported loads of content on Instagram, genuinely creepy accounts of "athletic teens" and they all got rejected.

I got caught in a horrible recommendations loop because I'd liked family photos of running and gymnastics for my nieces and cousins.

[-] Trainguyrom@reddthat.com 5 points 9 months ago

got caught in a horrible recommendations loop because I'd liked family photos of running and gymnastics for my nieces and cousins.

I never reach that point on Facebook. I scroll for about 5 posts to see what my family and friends might be up to and get too frustrated with unmoderated spam and report it as spam and close the tab and move on

[-] forgotmylastusername@lemmy.ml 46 points 9 months ago

One of the biggest problems with the internet today is that bad actors know how to manipulate or dodge content moderation to avoid punitive consequences. The big social platforms are moderated by the most naive people in the world. It's either that or willful negligence. Has to be. There's just no way these tech bros who spent their lives deep in internet culture are so clueless about how to moderate content.

[-] blazeknave@lemmy.world 29 points 9 months ago

I know them. I worked in this industry. They're not naive. What basis do you have for these comments?

I think you're conflating with business executives running said social and gaming companies. Stop calling them techbros. Meta is not a tech startup. They're a transnational corporation. They have capitalist execs running the companies.

[-] KeenFlame@feddit.nu 11 points 9 months ago

Indie megacorp buying their first nation just a startup

[-] Fudoshin@feddit.uk 16 points 9 months ago

bad actors know how to manipulate or dodge the content moderation to avoid punitive consequences.

People have been doing that since the dawn of the internet. People on my old forum in the 90s tried to circumvent profanity filters on phpBB.

Even now you can get round Lemmy.World filters against "fag-got" by adding a hyphen in it.

Nothing new under the sun.
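
Filter evasion like this is usually countered with input normalization before matching. A minimal sketch (the blocklist word and function names are placeholders for illustration, not any platform's actual implementation): strip the separator characters people wedge into words, then check the blocklist.

```python
import re

# Hypothetical blocklist; real systems use far larger lists plus classifiers
BLOCKLIST = {"example"}

def normalize(text: str) -> str:
    """Drop hyphens, dots, underscores and whitespace wedged into words."""
    return re.sub(r"[-._\s]+", "", text.lower())

def is_blocked(word: str) -> bool:
    return normalize(word) in BLOCKLIST

print(is_blocked("ex-ample"))  # True: the hyphen trick no longer works
print(is_blocked("EX AMPLE"))  # True
print(is_blocked("harmless"))  # False
```

Of course the cat-and-mouse just moves on: once separators stop working, people switch to lookalike characters or new slang, which simple normalization can't catch.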

[-] Jknaraa@lemmy.ml 8 points 9 months ago

The thing is that words can have a very broad range of meaning depending on who uses them and how (among many other factors), but you can't accurately code all of that into a form that computers can understand. Even ignoring bad actors it makes certain things very difficult, like if you ever want to search for something that just happens to share words with something completely different which is very popular.

[-] dRLY@lemmy.ml 4 points 9 months ago

Auto-moderation is both lazy and only going to get worse. Not saying there isn't some value in things being hard-banned (like very specific spam that just keeps responding to everything with the same thing non-stop). But these mega outlets/sites want to use full automation to ban shit without any human interaction. At least unless you or another corp has connections on the inside to get a person or people to fix it. Just like how they make it so fucking hard to ever reach a person when calling (or even finding) a support line.

This automated shit just blacklists more and more shit and can completely fuck over people that use those sites for income (and they even can't reach a person when their income is cut off for false reasons and don't get back-pay for the period of a strike/ban). The bad guys will always just keep moving to a new word or phrase as the old ones get banned. So we as users are actually losing words and phrases and the actual shit is just on to the next one without issues.

[-] phx@lemmy.ca 30 points 9 months ago

It's dumb, but it's also possible that a combination of those terms has been adopted by some group distributing CSAM.

At one point, "cheese pizza" was a term they apparently used on YouTube videos etc due to it having the same abbreviation as CP (Child Pornography).

Sick fucks ruining everything for everyone

[-] dRLY@lemmy.ml 16 points 9 months ago

I agree with you is the TL;DR, and the rest is just my mad ranting opinions about companies being allowed to just auto-censor us. So feel free to completely ignore the rest. lol.

Banning words and phrases just because bad people use them has become the norm. I really can't stand the way that channels on YT constantly have to self-censor basically everything (even if the video is just reporting on or trying to explain bad shit that has happened). And it never seems to actually stop the real issues from happening. It just means the bad people move on to a new word or phrase, which then gets banned in turn. It isn't about actually stopping fucked-up shit from happening. It is just about making sure advertisers and other sources of money don't throw a fit.

We always hear about how places like China are bad in part for censoring words and speech. But in the US and other western nations we pretend we are allowed to speak freely and uncensored. We have always had censoring of speech; it is just that the real rulers of the country are allowed to do it instead. It keeps the government's hands clean, since it isn't legally the one enforcing it on us. Shit like CP is fucked, and it should be handled for what it is, but allowing for-profit companies, and especially their algorithms/AI, to decide what we can and can't say or search for, without any level of human interaction, very much leads to false bans, and that is also fucked.

It is waaaay too easy for all the mega corps to completely take down channels and block creators from revenue of their own work just completely automated. But the accused channel can't ever get a real person to both get clear understanding of what and who is attacking them, and to explain why their strike/bans aren't valid. I have heard that even channels that have gotten written/legal permission from a big studio to use a clip of music or segment from video (music being the worst) will STILL catch automated strikes for copyright violations.

We don't need actual government censors, because the mega corps with all the money are allowed to do it for them. Our rights don't really matter if a private company, or an org made up of people from various mega corps, is allowed to do the censoring on the government's behalf.

[-] Lath@kbin.social 30 points 9 months ago

That's what you get for all the teabagging you've been doing..

[-] IzzyScissor@lemmy.world 29 points 9 months ago

Remember, searching for "halo" is banned because it could potentially be linked to pedophilia, but editing a video of the president to look like a pedophile is fine because "it wasn't done with AI."

[-] AVincentInSpace@pawb.social 11 points 9 months ago

OOTL -- what happened there?

[-] DAMunzy@lemmy.dbzer0.com 13 points 9 months ago

Biden was edited to look like he was groping his granddaughter for an extended amount of time instead of quickly putting a pin above her breast. It was posted to Facebook/Instagram/Meta. AI wasn't used.

[-] Darkassassin07@lemmy.ca 19 points 9 months ago

You missed the part where Meta reviewed it and didn't remove it because it wasn't done with AI. Created manually so it's fine.

[-] DAMunzy@lemmy.dbzer0.com 4 points 9 months ago

True enough! Even worse

[-] rawrthundercats@lemmy.ml 27 points 9 months ago

How do we know they didn't type something more explicit to get the result and just change what's in the search bar? Has anyone verified this?

[-] 7heo@lemmy.ml 38 points 9 months ago

I actually don't know. I'm not sure it's possible (I've never used Instagram; the search might be auto-submitting for all I know), but intentionally flagging yourself as a potential child abuser, for clout, is a bit extreme...

[-] yamanii@lemmy.world 13 points 9 months ago
[-] rawrthundercats@lemmy.ml 5 points 9 months ago

Nutty. Thank you for putting yourself on the list for science.

[-] baatliwala@lemmy.world 26 points 9 months ago

Barely 2 years ago I noticed that people were posting porn on Insta, and it was publicly visible just because they tagged #cum as #cΓΌm. I don't think this is possible now, but basically corporations are dumb and people posting disallowed content can be creative as hell.
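
The "#cüm" trick works because exact hashtag matching treats "ü" and "u" as different strings. Unicode normalization defeats the simple diacritic variants; a sketch (the blocked tag is the one from this comment, the function name is made up for illustration):

```python
import unicodedata

BLOCKED_TAGS = {"cum"}  # the tag from the comment above

def fold(tag: str) -> str:
    """NFKD splits 'ü' into 'u' + a combining diaeresis; drop the combining marks."""
    decomposed = unicodedata.normalize("NFKD", tag.lower())
    return "".join(ch for ch in decomposed if not unicodedata.combining(ch))

print(fold("cüm"))                  # "cum"
print(fold("cüm") in BLOCKED_TAGS)  # True
```

This only catches diacritic variants; lookalike letters from other scripts (e.g. Cyrillic) survive NFKD and need a separate confusables mapping, which is partly why this cat-and-mouse never ends.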

[-] archonet@lemy.lol 49 points 9 months ago

ah yes, cumlauts

[-] nicetriangle@kbin.social 25 points 9 months ago* (last edited 9 months ago)

I had a post of mine flagged for multiple days on there because it had an illustration of a woman in a full length wool coat completely covering her and not in any way sexual. Shit is so stupid

[-] AbouBenAdhem@lemmy.world 20 points 9 months ago

Well, the Spartans were pederasts...

[-] makeasnek@lemmy.ml 12 points 9 months ago* (last edited 9 months ago)

We beat KOSA before, we can beat it again. Contacting your reps matters. Voting matters, especially in primaries and locals. So does being active politically in other ways.

https://www.fightforthefuture.org/

[-] LucidBoi@lemmy.world 6 points 9 months ago

I'm not familiar with American stuff, what is KOSA?

[-] Astronautical@sh.itjust.works 7 points 9 months ago

It's the "Kids Online Safety Act". Basically it's using the old "think of the children!" move, but in reality conservatives are trying to push anything queer back into the dark.

[-] LucidBoi@lemmy.world 4 points 9 months ago

Think of the children! Let us scan all of your images, files and messages! For the sake of children of course! Nothing suspicious here...

[-] KingThrillgore@lemmy.ml 8 points 9 months ago

The spartans were children at one point

this post was submitted on 05 Feb 2024
361 points (96.4% liked)
