[–] then_three_more@lemmy.world 52 points 6 months ago (5 children)

It would technically be the fifth law.

Zeroth Law - A robot may not injure humanity or, through inaction, allow humanity to come to harm.

[–] pruwybn@discuss.tchncs.de 19 points 5 months ago (1 children)

But if you're starting from zeroth it would be the fourth.

[–] olutukko@lemmy.world 8 points 5 months ago (1 children)

and with robots and computers it just makes sense to start with 0

[–] captainlezbian@lemmy.world 3 points 5 months ago* (last edited 5 months ago) (1 children)

It’s even better because

Spoiler: A robot created the zeroth law to allow the killing of people to save humanity.

[–] HessiaNerd@lemmy.world 3 points 5 months ago (1 children)

Only in the shitty movie. Not in the books.

[–] captainlezbian@lemmy.world 2 points 5 months ago (1 children)

Was there a movie? Mind you, it’s been like 15 years since I read Robots and Empire, but

Spoiler: Allowing Earth to be radiation-poisoned would kill people but force the humans off Earth.

Like I’d love some good robot movies. The Robots of Dawn would likely struggle with reception, and honestly so would The Naked Sun, but The Caves of Steel? Less so.

[–] HessiaNerd@lemmy.world 1 points 5 months ago (1 children)

That is the plot of the Will Smith version of I, Robot.

If I remember correctly, it's actually Daneel who comes up with the zeroth law. And it's not to justify killing people.

https://en.m.wikipedia.org/wiki/R._Daneel_Olivaw

[–] captainlezbian@lemmy.world 1 points 5 months ago (1 children)

Why would anyone put Will Smith in this movie, or call it I, Robot? And I have to assume they combined Robots and Empire with Caves of Steel, which is a shit decision as well‽

[–] HessiaNerd@lemmy.world 1 points 5 months ago

They actually took a bunch of elements of the short story collection and jammed them together. The worst is what they did to Susan Calvin...

Ignoring the butchery, it's a pretty generic action movie. Very forgettable. Adding what they did to the source material makes it a straight tragedy.

[–] yamapikariya@lemmyfi.com 15 points 6 months ago (3 children)

May not injure, you say. Can't be injured if you're dead. (P.S. I'm not a robot)

[–] prex@aussie.zone 18 points 5 months ago

Sounds like something a robot would say.

[–] samus12345@lemmy.world 8 points 5 months ago (1 children)

Pretty sure death qualifies as "harm".

[–] yamapikariya@lemmyfi.com 1 points 5 months ago* (last edited 5 months ago) (1 children)

The sentence says "...or, through inaction, allow humanity to come to harm." If they are dead due to the robot's action, it is technically within the rules.

[–] samus12345@lemmy.world 5 points 5 months ago (1 children)

Oh, I see, you're saying they can bypass "injure" and go straight to "kill". Killing someone still qualifies as injuring them - ever heard the term "fatally injured"? So no, it wouldn't be within the rules.

[–] MystikIncarnate@lemmy.ca 1 points 5 months ago (1 children)

I think he's referring to the absolutism of the programmatic "or" statement.

The robot would interpret it as (does not cause harm to humanity) or (does not, through inaction, allow harm to come to humanity). If either statement is true, then the rule is satisfied.

By taking action to harm humans, the robot makes the second statement true, satisfying the rule as "followed".

While our meat brains can work out the intended meaning of the phrase, the computer would take it very literally, and therefore: death to all humans!

Furthermore, if a human comes to harm through the robot's inaction, that may violate the second half of the first rule, but since the robot didn't cause the harm itself, the first statement is true. Therefore: death to all humans!
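
A minimal sketch of that literal parse, in TypeScript (the names are just illustrative, not anything from the books):

// Clause A: the robot does not injure a human.
// Clause B: the robot does not, through inaction, allow a human to come to harm.
// The literal "or" parse treats the law as satisfied when EITHER clause holds.
function firstLawSatisfied(injuresHuman: boolean, allowsHarmByInaction: boolean): boolean {
  const a = !injuresHuman;
  const b = !allowsHarmByInaction;
  return a || b; // mis-parse: one true clause is enough
}

firstLawSatisfied(true, false);  // true  - actively kills, but didn't stand idle: "followed"
firstLawSatisfied(false, true);  // true  - stands idle while a human drowns: "followed"
firstLawSatisfied(true, true);   // false - only failing BOTH clauses breaks the mis-parsed law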

[–] samus12345@lemmy.world 2 points 5 months ago (1 children)

That works if you ignore the commas after "or" and "through inaction", which does sound like a robot thing to do. Damn synths!

[–] MystikIncarnate@lemmy.ca 1 points 5 months ago (1 children)

Programmatically, if you want it to do both, use "and"
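
E.g., with the same illustrative names as the sketch above:

// "and" (equivalently "neither... nor") makes both prohibitions binding.
function firstLawIntended(injuresHuman: boolean, allowsHarmByInaction: boolean): boolean {
  return !injuresHuman && !allowsHarmByInaction;
}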

[–] samus12345@lemmy.world 3 points 5 months ago* (last edited 5 months ago) (1 children)

"Nor" would be more grammatically correct and clearer in meaning, too, since they're actually telling robots what not to do.

[–] MystikIncarnate@lemmy.ca 2 points 5 months ago

In terms of English and grammar, you're not wrong.

[–] andrew_bidlaw@sh.itjust.works 5 points 5 months ago (1 children)

The concept of death may be hard to explain, because robots don't need to run 24/7 in order to keep functioning. Until instructed otherwise, a machine would think a person in cardiac arrest is safe to boot later.

[–] NABDad@lemmy.world 4 points 5 months ago (1 children)

Who can say that death is the injury? It could be that continued suffering would be an injury worse than death. Life is suffering. Death ends life. Therefore, death ends suffering and stops injury.

[–] andrew_bidlaw@sh.itjust.works 3 points 5 months ago* (last edited 5 months ago)

I mean, this logic sounds not unlike Mr. Smith from The Matrix.

The 'Why, Mr. Anderson' moment from The Matrix

[–] nicknonya@lemmy.blahaj.zone 6 points 5 months ago (1 children)

couldn't that be inferred from the first law?

[–] Mithre@lemmy.world 12 points 5 months ago (3 children)

Actually no! Lower-numbered laws have priority over higher-numbered ones, meaning that if they come into conflict, the higher-numbered law can be broken. While the first law says they can't allow humans to come to harm, the zeroth law basically says that if it's for the good of the species, they absolutely can kill or otherwise hurt individual humans.
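
As a rough sketch of that precedence (purely illustrative, nothing like this appears in the books):

type Verdict = "require" | "forbid" | "indifferent";

// Laws in priority order: index 0 is the Zeroth Law, index 1 the First Law, etc.
// The lowest-numbered law with an opinion wins; any later law may be broken.
function resolve(verdicts: Verdict[]): Verdict {
  for (const v of verdicts) {
    if (v !== "indifferent") return v;
  }
  return "indifferent";
}

resolve(["require", "forbid"]); // "require" - the Zeroth Law overrides the First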

[–] nicknonya@lemmy.blahaj.zone 6 points 5 months ago (1 children)

does that happen in the stories?

[–] Shimon@slrpnk.net 9 points 5 months ago

Yes! I think it is the second story in the book

[–] VindictiveJudge@lemmy.world 4 points 5 months ago

Law 0 is also a derived law rather than a programmed one. Robots with both the Three Laws and sufficient intelligence, in a position where Law 1 becomes a catch-22, will tend to derive Law 0.

[–] HonoraryMancunian@lemmy.world 2 points 5 months ago (1 children)

Lower-numbered laws have priority over higher-numbered ones

That means this is the negative first law

[–] LucidBoi@lemmy.dbzer0.com 1 points 5 months ago

I just finished the book today 🥲

[–] lolcatnip@reddthat.com 49 points 5 months ago (5 children)

This just reminds me I'm mildly irritated that robots in fiction have glowing eyes so often. Light is supposed to go into eyes, not come out of them!

[–] wieson@feddit.de 28 points 5 months ago (1 children)

Robots, or any part of an automated production line with a camera, typically have a light as well, either to see in low-light conditions or to ensure a consistent amount of light hits the lens.

[–] HessiaNerd@lemmy.world 4 points 5 months ago

Also, a lot of the machine vision systems I've run up against use red light, but it's kind of complex. If they want to detect, say, blood, I think blue light would actually give better contrast for detection.

[–] MystikIncarnate@lemmy.ca 26 points 5 months ago (1 children)

They addressed this on The Orville. The glowing dots were not eyes; the droid had sensors that did all the work. The "eyes" were an aesthetic addition.

[–] AmosBurton_ThatGuy@lemmy.ca 15 points 5 months ago* (last edited 5 months ago)

"The last thing you need is more desert"

"Excuse me?!"

"As I cannot stutter, I must conclude that you heard me"

Isaac is one of the best parts of that show lmao

[–] Annoyed_Crabby@monyet.cc 7 points 5 months ago

I really like the design of the Assaultron from Fallout 4. They didn't have that issue because the eye is placed just above the glowy part, and the glowy part is the head laser that will one-shot you.

[–] gamermanh@lemmy.dbzer0.com 5 points 5 months ago

So long as the light isn't coming from BEHIND the lens, you can think of it as being like a camera flash

Or just think of it as the power indicator LED being made stylish

[–] Ziglin@lemmy.world 2 points 5 months ago

To be fair, it makes it harder to tell where the cameras are pointed (assuming they're not wide-angle lenses and they're trying to work similarly to humans)

[–] Nomecks@lemmy.ca 27 points 5 months ago (1 children)
[–] samus12345@lemmy.world 22 points 5 months ago* (last edited 5 months ago)

"Come on, you can trust me. You're thinking of the old red light Agimus. Blue light Agimus wants to help!"

[–] hperrin@lemmy.world 21 points 5 months ago
self.setEyeColor(self.isGood() ? 'blue' : 'red');
[–] Mouselemming@sh.itjust.works 17 points 5 months ago (1 children)

Could we do that for people too, please?

[–] Ziglin@lemmy.world 18 points 5 months ago (1 children)

Ooh, imagine the chaos at some executive meetings where everyone's evil eyes are blinding each other.

[–] space@lemmy.dbzer0.com 12 points 5 months ago

The intensity of the red light should be proportional to the level of evil. You could literally put solar panels in those meetings.

[–] gibmiser@lemmy.world 14 points 6 months ago

Well, can't argue that it's not practical

[–] tulliandar@lemmy.world 10 points 5 months ago (1 children)

If they’re evil, it presumably means they’re disobeying the first three laws… they may disobey the fourth law too, to help cover their other crimes

[–] Kolanaki@yiffit.net 2 points 5 months ago

In the movie, the bad ones didn't exactly disobey the laws; they merely found a loophole whereby they could protect humans by taking over completely, so "human error" couldn't harm humans.

[–] GrayBackgroundMusic@lemm.ee 5 points 5 months ago

I'm all for it. Makes them easier to see. FOR SUPER EEEAAARRRTTTHHH

[–] Belzebubulubu@mujico.org 3 points 5 months ago

I think it's supposed to represent errors in the robot's code, like "I'm evil cuz I'm bugged"