In my experience, Tesla's automatic braking and collision warning and avoidance are excellent. The system sometimes brakes before I do in a panic situation and has saved my bacon. It does score higher than any car in that class on the safety tests per the article.
My son's RX does the same thing. Definitely beat my response time more than once.
I couldn't agree more. With any of these driver-assistance systems, the driver can't expect the car to do everything; the driver has to override as necessary. Really, the only thing I think Tesla is at fault for is the name of their ASSISTANCE package, because there are dumb people everywhere who will genuinely think it's an autopilot and hands-off. I'm sure other carmakers are too risk averse to call their systems something as dumb as that, because they know people are dumb. I also don't like it because it's such a marketing ploy aimed at those people.
This raises a thought, but I don't think the name makes a difference. When I let new people try Autopilot, they usually micromanage it like crazy and watch it like a hawk. The same was true with the radar cruise feature on my IS back in '06, which, under a different name, operates on a similar principle. IME, people tend to have a healthy reluctance to let the system handle itself until they develop enough confidence to let it do its thing.
The people who tend to do risky (stupid) things are the ones who actually have a good understanding of the system, which is why they (a) have enough knowledge to try to bypass it and (b) feel comfortable enough to let it manage itself beyond its intended use, despite the alerts and warnings that would stop an otherwise newer, proper user. In other words, when we see someone climbing into their back seat or whatever else gets caught on video these days, they're definitely not confused about the name of the system or how it functions; it's deliberate, and they should be held accountable. Tesla could change the name to "not FSD" tomorrow and we'd see the same behaviors, because the intent is still there. It's a shame, because it's promising tech, and the reporting rarely adds context, because clicks matter.
I agree, and I have said this before in a different thread. Like anything else, it takes a while to get comfortable, and the problem with Tesla's AP or FSD is that it's so good that people get overconfident in its abilities, and the techies love it and believe in it so much that they do stupid things to try to prove how advanced it is. The naming is just icing on the cake and only reinforces people's overconfidence in the technology once they get used to it, regardless of the number of disclaimers out there.
You set the speed Autopilot will go up to. I believe the max you can set it for is 20 mph over the limit, though.
No. Ultimately, that probably will not happen, for reasons of litigation. If that feature does in fact exist now, it won't last very long. If there is a crash and people inside are hurt or killed, those who produced software that allowed excessive speed leave themselves wide open for lawsuits. Not only that, but in some states, 20 mph over the posted limit is considered reckless driving, although, of course, in this case a machine, not a human, is doing the actual driving.
Huh. It's been there since Autopilot's inception, which is a while now. It's far more dangerous to religiously follow the speed limit than to keep pace with traffic.