Ask Lemmy
A Fediverse community for open-ended, thought-provoking questions
Please don't post about US Politics. If you need to do this, try !politicaldiscussion@lemmy.world
Rules:
1) Be nice and have fun
Doxxing, trolling, sealioning, racism, and toxicity are not welcome in AskLemmy. Remember what your mother said: if you can't say something nice, don't say anything at all. In addition, the site-wide Lemmy.world terms of service also apply here. Please familiarize yourself with them.
2) All posts must end with a '?'
This is sort of like Jeopardy. Please phrase all post titles in the form of a proper question ending with a '?'.
3) No spam
Please do not flood the community with nonsense. Actual suspected spammers will be banned on sight. No astroturfing.
4) NSFW is okay, within reason
Just remember to tag posts with either a content warning or a [NSFW] tag. Overtly sexual posts are not allowed; please direct them to either !asklemmyafterdark@lemmy.world or !asklemmynsfw@lemmynsfw.com.
NSFW comments should be restricted to posts tagged [NSFW].
5) This is not a support community.
It is not a place for 'how do I?' type questions.
If you have any questions regarding the site itself or would like to report a community, please direct them to Lemmy.world Support or email info@lemmy.world. For other questions, check our partnered communities list, or use the search function.
Fully self-driving cars. It turns out it's a lot harder than we thought to build a system that doesn't get confused by edge cases.
By the time they are widely legal, most people will probably (hopefully) have realized how stupid car dependency is.
It doesn't have to be perfect. It just has to be better than humans.
And it is.
Who's liable when it crashes? And it's "better" than human drivers only in very limited situations, with a human driver behind the wheel ready to take control.
I'd say if the human is supposed to observe and take control, then the human is liable, unless something about the autopilot made it impossible to intervene (e.g. no time to react). If it's a completely autonomous autopilot, then of course the manufacturer is liable; who else could it be? But autopilots would probably have to pass safety tests before being allowed on the road, and you'd have to prove negligence or malicious intent by the manufacturer (e.g. faking test results). This would be similar to medicine, where the manufacturer simply can't guarantee 100% safety.
Regarding "better", afaik it's on average. So if you let 1000 humans and 1000 autopilots drive 1000 miles each the autopilots will produce less accidents overall. Idk if autopilots get better or worse by allowing human intervention, a human could also take control at the wrong moment after all.
Tesla has allegedly played with that by disengaging Autopilot something like half a second before a crash, so the crash doesn't add to the statistics of crashes that occurred while the system was on.
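Whether that trick actually skews the numbers depends entirely on the counting rule. Here's a quick sketch (with invented crash events) contrasting an "engaged at the moment of impact" rule with a lookback-window rule; reportedly this is why US regulators count a crash as autopilot-involved if the system was engaged within 30 seconds of impact:

```python
# Two ways to attribute a crash to the autopilot. The events are invented;
# each records how many seconds before impact the autopilot disengaged
# (None = still engaged at impact).
crashes = [
    {"id": 1, "disengaged_s_before_impact": None},  # engaged at impact
    {"id": 2, "disengaged_s_before_impact": 0.5},   # the alleged trick
    {"id": 3, "disengaged_s_before_impact": 45.0},  # human had been driving a while
]

def attributed(crash, window_s):
    """Count a crash against the autopilot if it was engaged at impact
    or disengaged less than `window_s` seconds before impact."""
    t = crash["disengaged_s_before_impact"]
    return t is None or t < window_s

strict = sum(attributed(c, 0.0) for c in crashes)    # engaged-at-impact only
windowed = sum(attributed(c, 30.0) for c in crashes) # 30-second lookback

print(f"engaged-at-impact rule: {strict} of {len(crashes)} crashes counted")
print(f"30-second-window rule:  {windowed} of {len(crashes)} crashes counted")
```

Under the strict rule the half-second disengagement drops the crash from the statistics; under the windowed rule it still counts.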
Waymo and Cruise are already on some roads without any human drivers.
This is potentially the killer app of self-driving. If it gets safe enough, the company offering self-driving cars can take responsibility for insurance (so long as you use the self-driving feature).