Tesla scraps its plan for a $25,000 Model 2 EV
(arstechnica.com)
Just to be clear, "Full Self Driving" is the marketing name for the product. You are instructed to keep your hands on the wheel at all times and Tesla accepts no responsibility at all if it screws up (unlike Mercedes, who takes responsibility for their level 3 autonomous driving service).
And for other people who happen to read this, the only reason Tesla may seem ahead with their technology is that they just don't care about safety. Tesla won't have a safe product until they actually accept responsibility for their product's failings.
Their infotainment system and app are pretty great compared to some other brands.
I'm currently driving a VW ID.5, and it's like they've never designed any kind of software interface at all. Example:
Not to hate on VW engineers, but goddamnit guys, get your shit together and hire a UX expert. I briefly drove a BMW 1 Series before the VW, and its infotainment was a lot more practical to use.
I really don't get why automotive electronics makers are allergic to having a proper UX team, other than that no one else in the industry has one either, so it's not a competitive disadvantage.
My suspicion is that it's because the shots are called by people who worked their way up doing automotive electronics. As in the microcontrollers inside of engine control units. So UX is kinda foreign.
which is hilarious because they're pushing us to touch screens when these devs all grew up on physical interfaces, you know, the ones that worked? goddamn give me switches and knobs any day over touch screens
As best I can tell, the touchscreen is added at the concept phase by folks who mostly know what's going to make people look at the car and want to buy it, several years before the car hits the market and well before the actual car electronics teams are involved.
So, yeah, car UI/UX sucks right now because we're seeing all of the things added to cars a few years ago in response to Tesla, implemented by people who think that having programmed a random car-focused microcontroller back in the day means they understand all of the layers involved in a modern Linux, Android, or Windows embedded car electronics unit, including layer 8 of the OSI stack (meaning: interfacing with humans).
But, yeah, dunno. I don't actually have my own car. My spouse got a Mazda a bunch of years ago, and it actually has a pretty good touchscreen interface paired with physical controls: if you want to dig into stuff you can use the touchscreen, but all of the common stuff is switches and knobs.

The generation before that had way, way too many buttons, and it was just gag-me-with-a-spoon. The generation after removed the touchscreen because Mazda's leadership decided people just weren't to be trusted with one, and I feel like they went a little too far in the other direction.

Meanwhile, in airplane cockpit design, they take great pains to let you navigate by touch where necessary, so all of the knobs are differently textured or shaped. And, as I said, I don't actually have my own car, but I have to say that if I did, I'd want it designed like that.
FSD beta is level 2, which still counts as a driver assist system. That's why the responsibility stays with the driver. Level 3 means you can do other stuff while the car drives itself; if Tesla were marketing FSD beta as level 3, then by definition they would need to take responsibility when it fails. So far there's only one death linked to FSD beta, so I don't quite get where the "they don't care about safety" is coming from. I'm pretty sure V12 is already a safer driver than a human. When FSD beta fails, it generally means it got stuck somewhere, not that it crashed and killed the passengers.
This is the key. I've actually been saved a few times now by FSD catching something I didn't see, like some deer. I'm collecting videos of the things it does that impress me to share when my trial is over.
Same. There is a pedestrian who is still alive today because FSD saw them when I was blinded by the lights of some asshole's lifted truck.
Like, sure, fuck Elon, but why do you think FSD is unsafe? They publish the accident rate, and it's lower than the national average.
There are times when it will fuck up; I've experienced this. However, there are also times when it sees something I physically can't because of blind spots or pillars in the car.
Having the car drive and you intervene is statistically safer than the national average. You could argue the inverse is better (you drive and the car intervenes), but I'd argue that system would be far worse, as you'd be relinquishing final say to the computer, and we don't have a legal system set up for that regardless of how good the software is (e.g. you're still responsible as the driver).
You can call it a marketing term, but in reality it can and does successfully drive point to point with no interventions most of the time. The places where it does fuck up are consistent fuckups (e.g. bad road markings that convey the wrong thing, which you only know about because you've driven that road thousands of times). It's not human, but it's far more consistent than a human, in both the ways it succeeds and the ways it fails. If you learn these patterns, you can spend more time paying attention to what other drivers are doing and to novel things that might be dangerous (people, animals, etc.) and less time on trivial things like mechanically staying inside two lines or adjusting your speed. Looking in your blind spot or to the side isn't nearly as dangerous, for example, so you can take in more information.
The Mercedes system is limited to a few highways in that mode. It doesn't drive around town like FSD does.
Yes, Tesla's service that they call "full self driving" is like other driver assist services that you can use anywhere.
Mercedes is unique in offering completely autonomous driving in some select areas and they will take all responsibility for the car's driving.