No AI apocalypse.
(hexbear.net)
There's a certain level of risk aversion with these decisions, though. One of the justifications for the salaries of managers who generally don't do shit is that they take "responsibility". Honestly, even if AI were performing at or above human level, a lot of briefs would still have to be done by someone you could fire anyway.
And as much as next-quarter performance is all they care about, there are still some survival instincts left. My last company put a ban on using genAI for all client-facing activities because a sales guy almost presented a deck with client-is-going-to-instantly-walk-out levels of wrong information in it.
Yeah, that's something I was thinking about. With human employees, you can always blame the workers when anything goes wrong, fire some people, and call it a day. AI can't take responsibility the same way.