Because AIs don’t share common human values like fairness or justice (they’re focused solely on the goal they’re given), they might pursue that goal in ways humans would find horrifying.
Sooooo…the next update to AI is sociopathy:
Are you familiar with the paperclip problem?
The idea that if you give a sufficiently advanced AI the single goal of making paperclips, it will inevitably turn the universe into paperclips.
https://www.decisionproblem.com/paperclips/
Give me more paperclips or at least my time back