this post was submitted on 13 Sep 2024
39 points (93.3% liked)

Technology


1984 meets Minority Report.

top 10 comments
[–] originalfrozenbanana@lemm.ee 33 points 1 month ago (1 children)

Guarantee it’s just racist

[–] UlyssesT@hexbear.net 8 points 1 month ago

Racists who coded their biases into it can say "it's just the algorithm!" while the racism machine does a racism.

[–] UlyssesT@hexbear.net 22 points 1 month ago* (last edited 1 month ago)

Oh boy, just like in the Minority Report treats! Bazinga!

[–] sharkfucker420@lemmy.ml 9 points 1 month ago

Psycho-Pass vibes

[–] possiblylinux127@lemmy.zip 7 points 1 month ago (2 children)

We will have to see how this plays out in court. Somehow I think the court system will reject the idea of pre-crime justice. I expect them to get sued before too long.

[–] SpaceNoodle@lemmy.world 11 points 1 month ago

If all this ends up doing is flag dark-skinned people as guilty, the courts will love it.

[–] laurelraven@lemmy.zip 1 points 1 month ago

I mean, if someone gets arrested because a machine predicted they would shoplift but no actual shoplifting had been attempted, I'd like to believe the case would be thrown out for lack of any crime having been committed. But I don't know if I have that much faith left in humanity anymore.

[–] shalafi@lemmy.world 5 points 1 month ago* (last edited 1 month ago)

By giving law enforcement a heat map of the area they cover, they could see potential crime hotspots in real time through the predictive crime map (PCM). This allows for better allocation of police officers and for more effective coverage.

Oh Jesus Christ. As if beat cops need a map. I assure you, after a month or two on the streets, they know exactly where the bad spots are, down to the blocks and homes. Hell, I've seen cops on a first name basis with offenders and constant callers.

Imagine cops asking for this tech, admitting they're failing at one of the most basic parts of the job. "Really need this chief, we have no idea what's going on out there!" Or, "This could be an important tool Mr. Mayor. Our street cops are clueless." If the cops know anything at all, it's where they're going to face hassle.

The recidivism prediction is the evil shit here. Aren't judges already using such systems when sentencing and calculating parole? Some algo assigns numbers to X, Y, and Z factors and pops out a score?
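
As an aside on the quoted "predictive crime map": at its simplest, a crime heat map is just past incident reports binned onto a coordinate grid and ranked by count. A minimal sketch of that idea in Python (the function name, grid size, and toy coordinates are my own illustration, not anything from the article):

```python
from collections import Counter

def hotspot_grid(incidents, cell_size=0.01):
    """incidents: iterable of (lat, lon) pairs; returns grid cells ranked by incident count."""
    counts = Counter(
        (round(lat / cell_size), round(lon / cell_size)) for lat, lon in incidents
    )
    return counts.most_common()

# Toy data: three reports clustered in one cell, one report elsewhere.
reports = [
    (40.7128, -74.0060),
    (40.7130, -74.0058),
    (40.7127, -74.0061),
    (40.7580, -73.9855),
]
print(hotspot_grid(reports))  # the clustered cell comes out on top with count 3
```

The inputs are past reports and arrests, so the map reflects past policing as much as it reflects crime, which is the bias concern raised upthread.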
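
On the recidivism question: risk-assessment tools like COMPAS are already used at sentencing and parole, and they do roughly that, reducing questionnaire answers to a single score (COMPAS reports decile scores from 1 to 10). A minimal sketch of a weighted-factor score of that kind; the factor names and weights below are invented for illustration, not taken from any real tool:

```python
def risk_score(factors, weights):
    """Weighted sum of factor values, clipped to a 1-10 band."""
    raw = sum(weights[name] * value for name, value in factors.items())
    return max(1, min(10, round(raw)))

# Hypothetical questionnaire answers and weights -- purely illustrative.
weights = {"prior_arrests": 0.8, "age_under_25": 1.5, "unstable_housing": 1.2}
defendant = {"prior_arrests": 3, "age_under_25": 1, "unstable_housing": 1}
print(risk_score(defendant, weights))  # -> 5
```

Whether the weights are hand-set or fit to historical outcomes, the score is only as neutral as the factors feeding it, which is the point made upthread about biases being coded in.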

[–] nolannice@lemmy.world 5 points 1 month ago

They built the Minority Report from the hit film, "Don't build the Minority Report"!!

[–] Maeve@kbin.earth 3 points 1 month ago

Are we doing this again? 😲 😡