this post was submitted on 17 Sep 2023
438 points (72.2% liked)
Technology
So he demanded that the driver-assistance software be as safe as possible before public release, paving the way for full self-driving 6-7 years later? Is this a bad thing?
If he had demanded it be as safe as possible, he wouldn't have refused to add lidar or radar capabilities.
I thought the “needs lidar” debate was settled years ago? Lidar cannot read signs, and it is prohibitively expensive to put in vehicles. If you're going to drive with a neural network, you need as much training data as possible, which means as many sensors in as many vehicles as possible.
If your cameras detect something the lidar does not, you trust the cameras, every time. Lidar can very easily misinterpret the world. It works great for simple robots that need to know where walls are and don't need to specifically identify animals, people, obstacles, speed bumps, construction zones, etc.
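To make that concrete, here's a rough sketch of what a camera-primary policy looks like (my own illustration, not anything from an actual Tesla codebase; the `Detection`/`track_id` names are invented): lidar supplies geometry but no semantics, so the camera's classification wins whenever the two disagree about an object.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Detection:
    track_id: int          # hypothetical shared track ID after cross-sensor association
    distance_m: float
    label: Optional[str]   # lidar tracks carry geometry only, no semantic label

def fuse_camera_primary(camera: list[Detection], lidar: list[Detection]) -> list[Detection]:
    """Keep lidar geometry as a baseline, but let camera detections override on conflict."""
    fused = {d.track_id: d for d in lidar}
    for det in camera:
        fused[det.track_id] = det   # camera wins every time the sensors disagree
    return list(fused.values())
```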
There's also the simple fact that humans can drive just fine without having evolved a lidar sensor.
Look at you just parroting Musk's lies. Do you parrot his transphobic bullshit too?
Yes, but if the lidar sees something the cameras don't, you trust the lidar.
Actually, no you don't. Lidar cannot identify objects specifically. Tesla does use lidar in their testing/prototype vehicles, and they have to find any instances manually where these systems don't agree. It always falls back to cameras.
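Something like this toy version of that workflow (purely illustrative, not Tesla's actual pipeline): disagreements get logged for manual review, and the camera result is what the system acts on.

```python
def resolve_frame(camera_sees_object: bool, lidar_sees_object: bool,
                  frame_id: int, review_queue: list[int]) -> bool:
    """Act on the camera; queue any camera/lidar disagreement for a human to inspect."""
    if camera_sees_object != lidar_sees_object:
        review_queue.append(frame_id)
    return camera_sees_object

queue: list[int] = []
resolve_frame(camera_sees_object=False, lidar_sees_object=True,
              frame_id=42, review_queue=queue)   # returns False, frame 42 queued
```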
TSLA doesn't even pay dividends. Appreciate you pointing yourself out as horribly misinformed.
Then why are you so stridently kissing Musk's ass?
#fuckyourdividends
EVs don't have to drive themselves.
That's not an EV-specific thing. Hundreds of people will die TODAY in traffic-related accidents, EV or not. We need to shift away from human drivers entirely.
Unless every last vehicle on the road is suddenly converted to cooperative autonomous systems, the human element and unpredictability will still be present. Even then, wildlife, pedestrians, and unpredictable events will pose a challenge to autonomous vehicles.
In a perfect world, FSD would help. In the real world, Tesla's FSD is a beta feature being spearheaded by a stubborn egomaniac who thinks he knows better than the people actually doing the engineering work. And frankly, I'd rather not spend my money for the privilege of being driven into a concrete barrier on the highway because somebody wants to cut costs and places style above function.
Exactly. Also, lidar is important in instances where you need millimeter precision. It's useful for calibrating camera systems in self-driving cars, but in order to drive safely you don't need that level of detail about the world around the car. It makes no difference if a car or pedestrian is 72 or 73 inches away.
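Back-of-envelope (illustrative numbers, not any vehicle's actual specs): at city speeds the stopping distance is tens of metres, so a one-inch ranging error is noise.

```python
speed_mps = 13.4           # ~30 mph
reaction_s = 1.0           # assumed perception + actuation latency
decel_mps2 = 7.0           # hard braking on dry pavement

stopping_m = speed_mps * reaction_s + speed_mps**2 / (2 * decel_mps2)       # ~26 m
error_m = 0.0254                                                            # one inch
print(f"one inch is {error_m / stopping_m:.2%} of the stopping distance")   # ~0.10%
```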
For a multi-ton metal projectile that drives itself (a car), you want multiple data sources to draw a consensus from. Relying on one data source is a single point of failure, and that's not acceptable when you have the potential to kill not only the driver but others outside the car.
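One simple way to get that redundancy (a sketch under my own assumptions, not any manufacturer's actual logic) is a fail-safe vote: if any independent sensor reports an obstacle in the path, the planner brakes, so a single blinded or faulty sensor degrades comfort rather than safety.

```python
def should_brake(reports: dict[str, bool]) -> bool:
    """reports maps sensor name ('camera', 'radar', 'lidar') to 'obstacle detected'."""
    return any(reports.values())

should_brake({"camera": False, "radar": True, "lidar": False})   # -> True
```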
No, he ABSOLUTELY didn't do that, because it's very unsafe, and he unleashed not only AP but also that total steaming pile that is "FSD". Which is neither F nor SD.