this post was submitted on 24 May 2025
88 points (82.8% liked)
Technology
I always said this would be the problem. Self-driving cars will never be perfect; they'll always have different failure modes than human drivers. So at what point is the increased safety worth the trade-off of new ways to die? Are we there yet?
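The trade-off the comment raises can be framed as a simple expected-value comparison: accept the new failure modes once the overall expected harm drops below the human baseline. A minimal sketch, where every rate below is a made-up placeholder for illustration, not a real crash statistic:

```python
# Illustrative only: the rates here are hypothetical placeholders,
# not measured fatality statistics for any real system.

def expected_fatalities(rate_per_million_miles: float, miles: float) -> float:
    """Expected fatalities over `miles` driven, given a per-million-mile rate."""
    return rate_per_million_miles * miles / 1_000_000

human_rate = 0.012  # hypothetical human fatality rate per million miles
av_rate = 0.009     # hypothetical AV rate: lower overall, but new failure modes

miles = 100_000_000
human = expected_fatalities(human_rate, miles)
av = expected_fatalities(av_rate, miles)

# Under these made-up inputs, the AV comes out ahead in aggregate,
# even though the individual failures look very different.
print(av < human)
```

The point of the framing is that "ahead in aggregate" and "acceptable to the public" are separate questions, which is exactly the tension the thread is about.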
At what point is it acceptable to the rest of us? Humans will always prefer the risk they know over the one they don’t, even when it’s objectively wrong
https://fuelarc.com/tech/can-teslas-self-driving-software-detect-bus-only-lanes-not-reliably-no/
edit: it's trivial to find examples of these systems utterly failing at basic driving. This isn't close to human performance, and that's obvious.