Class actions in the Netherlands mostly favor lawyers.
I think this is a calculation to work out whether an upgrade from HW3 to HW4 actually solves the problem, or whether HW3 must be upgraded to HW5.
One upgrade is more economical than two, but I'd certainly be annoyed as well.
It's run by the person mentioned in the article, and unsurprisingly the domain is Dutch, but it seems the same thing will apply in lots of countries, not just the Netherlands, if FSD rolls out there too.
People don't talk about these cars driving themselves enough imho
Then, on the way back, it drove me home on the wrong side of the street and I had to take over. Such a silly mistake.
Similar to what you said: from there on out, it was more trouble than it was worth, because you can't let your guard down.
FWIW: My 2026 Hyundai's driver assistance is better than my old 2018 Tesla Model 3's enhanced autopilot.
I find it less cognitive load to drive it myself. It's easier to predict what other vehicles will do than what my own car will do. Boo.
I have sympathy for the challenges, as I've worked in the field.
It always had the feeling of being outside with your toddler by the pool. I can look away, but I have 50/50 odds of a dead toddler if I do it for too long.
Their role is to stop the train in an emergency and adjust speed etc. to track/driving conditions.
Automating their job probably wouldn't even need the complex ML used for self-driving, because the context is significantly simpler and relatively well defined. Maybe a tram in a city might need such a model, but it would still be a significantly simpler task than driving a car.
Triggering psychosis is not difficult, and an LLM is easily capable of doing it. With a person, the people around them soon get freaked out and are likely to summon help: "Johnny started acting crazy and I'm not sure what to do, please come." But the LLM isn't a person. Johnny needs to know more about the CIA's programme to cross-breed Venusians with Hollywood stars? Here's an itinerary with the address of a real hotel in LA and an entirely hallucinated CIA officer's schedule.
Next thing you know, Johnny is shot dead by officers responding to a maniac with a fire axe who broke into an LA hotel and was screaming about space aliens.
If you’re driving, your brain automatically prioritizes the importance of the things you see. But since a computer fails in different ways than a human does, you lose all of that automatic prioritization.
One such study is "Performance consequences of automation-induced 'complacency'" (Parasuraman, Molloy & Singh, 1993) https://www.pacdeff.com/pdfs/Automation%20Induced%20Complace...
Previous studies had found that a human and a computer together performed markedly better than either a human alone or a computer alone - but in those studies failures were quite common, so they didn't give the humans time to get bored or distracted.
When researchers got test subjects to perform a simulated flying task, monitoring a system with 99%+ reliability, they found the humans were proportionally much worse at stepping in than they were on less reliable systems.
Swimming pool lifeguards will often change posts every 15-20 minutes and get a 10-15 minute break every hour, to keep things interesting enough that they can pay attention. Good luck getting drivers to do that.
Funny, I was going to mention exactly that. I'm a private pilot with a modern autopilot and flying is exhausting. Partly because the piston engine is rattling your brain the entire time but also because you're on high alert the entire time. You're always making sure the autopilot is keeping the plane on the blue (or green) line and is being predictable. And my smartwatch shows my heart rate is usually more elevated on autopilot than not.
A "self-driving" tesla is an adversary you need to supervise to make sure it doesn't take actions you wouldn't expect of a normal car.
As other posters have pointed out, it's like running an LLM with `--dangerously-skip-permissions`: I wouldn't `rm -rf /` my computer (or in the case of tesla, my life), but an AI might.
I do not want to be the 'manager of my car'. That'd be a downgrade from being an actual driver.
Lane assist, auto stop-start, and cruise control are enough for me; they've mostly been available for decades and require a similar amount of attention.
FSD is a busted flush and I can't believe those who got conned by it aren't more vocal.
https://www.faistgroup.com/site/assets/files/1657/j3016-leve...
While FSD's manipulation of controls is impressive -- it is missing a very critical component that is required for self driving: the ability to guarantee whether or not it can make a safe decision. Tesla's FSD still offloads this task to the human driver. Once they can do this more than zero percent of the time, they will have achieved level 3.
It’s because driving on the freeway isn’t FSD, it’s a better version of cruise control, and other companies also offer similar capabilities. Within a city, the thing is a shitshow. It does random things all the time, and it’s almost a larger cognitive burden to constantly be on the lookout for it to make a mistake where I have to take over vs. just driving the car myself. For me specifically, it’s just impossible to use because it fails to recognize curved streets and a couple of other irregularities just within blocks of where I live.
On a freeway it’s only kind of usable. It switches lanes far too aggressively and for no reason, to the point that it makes the ride uncomfortable.
What I really want is auto steer with lane switching when I signal, which for some reason I could never get working in any mode. It either doesn’t change lanes at all, or changes them arbitrarily of its own volition. And if I change lanes manually it turns off autosteer, which is too irritating to use in practice.
Tesla self driving, in any mode, is a bad product. And I say this as a Tesla fan.
FSD is amazing. Any notion it takes more effort to use it than driving is made up.
On the other hand, when I got FSD trials in the model 3 in the last year or so, it never managed to get more than ~a mile without me having to disengage.
What's worse is this is all going to end up happening again when HW5 comes out and all of the HW4 cars start getting a trimmed down version of the FSD software from HW5, like HW3 is currently receiving.
If I can't go to sleep lying down on the seat as a sole occupant, it's not yet self driving.
Other brands have had self driving features for years now. Some even operate at a higher level of automation.
And that was actual hands-free, while Teslas at the time required you to keep putting torque on the wheel to lie to the system.
Even then, my 2017 Hyundai did practically everything but steer. Get it on the highway, turn on ACC, and it'll handle the traffic; you just keep it in the lane. It even handled all the stop-and-go traffic.
1. https://www.tesla.com/customer-stories/cross-country-trip-fu...
1. https://www.thedrive.com/opinion/40604/five-things-my-roomba...
2. https://www.thedrive.com/news/a-tesla-actually-drove-itself-...
Have Tesla and its fanboys overstated FSD’s capabilities? Absolutely. But I’m not saying that FSD is currently good enough that one should expect thousands of miles between interventions. I’m trying to convince someone that it has been done. I'm trying to do this because the same cannot be said for any other self-driving technology available in a consumer vehicle today, so claiming that FSD is no better than competing offerings is not accurate. Is FSD overhyped? Sure. Late? Extremely. Fraudulent, bordering on criminal? I could see that. But it’s still in a league of its own in terms of what it can do.
Totally "fully self-driving," even though you need not one, not two, but three autonomous driving experts with you. And be sure to bring a second car for when your first autonomous vehicle strands you. Sure sounds like a reliable system ready for the masses to use on public roadways!
I’m not making any claims about FSD’s safety or how ready it is for mass usage on public roads. I am trying to figure out what information would convince you that someone has used FSD for thousands of miles without intervening. Does this count or not? If not, why?
I never doubted it, I just said I don't trust things Tesla states on their website (they're objectively known to lie, especially when it comes to videos about their self driving) and I don't trust randos on Twitter.
I will say though, the people in the article have a vested interest in pushing a pro-AV agenda. But in the end, sure, I guess they probably did have that trip they say they did.
It doesn't surprise me that people managed to go thousands of miles without disengaging, especially since it sounds like this isn't their first attempt (flip a coin enough and you'll probably get heads several times in a row, after all) and that's nearly all highway miles. I've personally driven a non-Tesla well over 150 miles hands-free without any disengagements many times, on a system that attempts less than what Tesla does. You pick a route with easy-to-reach chargers, you don't venture off the highways much - sure sounds possible to me. In the end, though, I don't personally see it as that radical a difference on a road trip. On a nearly 300 mi drive I probably drove like 5 of those miles total. Is risking people's lives on the surface-street parts with beta software worth that last little bit?
Note, that's several thousand miles of no disengagements on a long, pre-planned cross country drive. Not 10,000 miles of driving around in a city and having all the other randomness of life peppered in. So what are we really measuring here?
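The coin-flip point above can be made concrete with a quick Monte Carlo sketch (the run length, flip count, and trial count here are illustrative choices, not figures from the thread):

```python
import random

random.seed(0)  # fixed seed so the estimate is reproducible

def has_run(n_flips: int, run_len: int) -> bool:
    """One trial: does a fair-coin sequence of n_flips contain
    run_len heads in a row?"""
    streak = 0
    for _ in range(n_flips):
        if random.random() < 0.5:  # heads
            streak += 1
            if streak >= run_len:
                return True
        else:
            streak = 0
    return False

trials = 10_000
hits = sum(has_run(100, 5) for _ in range(trials))
print(f"P(>=5 heads in a row within 100 flips) ~ {hits / trials:.2f}")
```

In other words, a seemingly unlikely streak (here, five heads in a row) becomes the expected outcome once you take enough attempts - which is the point about repeated long-distance FSD runs.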
I heard the same thing in 2019, HW3 solved all the issues, it finally just works as advertised. That was after HW2 was guaranteed to ship with all the hardware needed for FSD a decade ago, for real this time.
I'll probably wait for HW5; then you'll tell me it's really there. This time it won't even run people over, and it'll actually stop at stop signs more than just 98% of the time.
Personally, I try to avoid systems that drive people in front of trains. https://www.youtube.com/watch?v=vMqTmOTtft4
I thought the driver was supposed to keep their hands on the wheel in case something goes wrong.
That's why Tesla fans buy those weighted gizmos to fool the computer into thinking they're still holding the steering wheel.
Also, the EU adopted laws restricting self-driving behavior, making FSD far less capable there. For example, the software cannot exert a lateral acceleration of more than 3 m/s². It must also cancel a lane change 5 seconds after the turn signal is engaged. Tesla gimped their self-driving features in the EU and Australia because of this.[1]
It’s only the latest version of FSD (which only runs on HW4) that lacks these restrictions and has been approved for use in the Netherlands. Even then, it requires you to pay attention to the road, so it's not what he paid for.
1. https://electrek.co/2019/05/17/tesla-nerfs-autopilot-europe-...
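For a sense of scale, a lateral-acceleration cap translates into a minimum curve radius at a given speed via the steady-turn relation a = v²/r. A minimal sketch, using the 3 m/s² figure from the comment above (the speeds are illustrative):

```python
# Steady-turn lateral acceleration: a = v^2 / r, so the tightest curve
# the system may steer through at speed v is r = v^2 / a_max.
A_MAX = 3.0  # m/s^2, the cap mentioned in the comment above

def min_radius_m(speed_kmh: float, a_max: float = A_MAX) -> float:
    """Minimum curve radius (meters) keeping lateral accel <= a_max."""
    v = speed_kmh / 3.6  # km/h -> m/s
    return v * v / a_max

for kmh in (50, 100, 130):
    print(f"{kmh} km/h -> min radius ~ {min_radius_m(kmh):.0f} m")
```

At highway speeds the cap only rules out fairly tight curves (a few hundred meters of radius), but it does bound how briskly the software can swerve or change lanes.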
The math doesn't work out. It should be €20,000,000 in FSD purchases, no?
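A back-of-the-envelope check of that figure; note the per-car FSD price below is an assumed round number for illustration, not a figure from the thread:

```python
# Back-of-the-envelope: how many FSD purchases would the quoted total imply?
# The per-car price is an ASSUMED round figure, not taken from the thread.
fsd_price_eur = 10_000          # assumed price of one FSD purchase
claimed_total_eur = 20_000_000  # the EUR 20,000,000 figure above
buyers_implied = claimed_total_eur // fsd_price_eur
print(buyers_implied)  # -> 2000 purchases needed at that assumed price
```

So the quoted total only holds if both the per-car price and the number of affected buyers line up, which is presumably the disagreement here.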
After paying the full cost, they're stuck on old hardware that was promised to be sufficient for the software.
You can use FSD with HW3 in other countries like Canada.
I gotta say, I am continuously amazed at how much Musk is allowed to get away with. I know he can get some things done and he is, apparently, a skilled manager, fundraiser, and BS'er of epic proportions, but I have a hard time understanding how all of this hasn't caught up with him yet.