this post was submitted on 28 Aug 2023
432 points (97.8% liked)
Technology
The driver is always responsible for using the tools within the car correctly and maintaining control of the vehicle at all times.
Either way, the driver would be at fault. The driver might be able to make a (completely separate) case that the car's defects made control impossible, but since the driver always had the option to disable self-driving, I doubt that would go anywhere.
Just like you don’t get off the hook if your cruise control causes an accident. And it doesn’t matter how much Tesla lied about what the system may or may not be capable of, because at the end of the day it’s always the driver’s responsibility to know the limitations of the vehicle, disable the feature, and take control when necessary.
Which is exactly what this case is claiming, that the software is defective.
And what happens when we progress beyond Level 2 or 3 automation? Then the car is making choices for the driver, choices the driver may have no say in and may not realistically be able to react to in an emergency.
Deferring responsibility to the driver under any scenario is a cop-out. We have a long history of engineering qualifications and regulations to ensure the safety of the populace: engineers and architects design structures to be safe, plumbers have to plumb to code, and heck, even cars themselves have a mile-long list of compliance requirements. All to ensure the things companies build aren’t killing people, and that when they do, someone is responsible.
Yet as soon as we start talking about software, it’s “not my problem, dawg.”
This is a guy who was using a glorified cruise control (which is all AP is) at high speed whilst watching a DVD instead of looking at the road.
The software can only help so much. There’s a reason laws now require attentiveness checks: people are reckless.
People are only reckless because they believe Tesla’s false marketing claims.
The car doesn’t “just drive itself”; it isn’t even close to “just driving itself.” The advertising claiming it does is much more at fault than the driver watching a movie.