Source: This Test Shows Why Tesla Autopilot Crashes Keep Happening

The first thing that happened when I drove a Tesla on Autopilot was an instant, unsettling feeling of not being comfortable in the car at all, thinking it’s always a moment away from crashing. Slowly, I got used to it and calmed down, just like everyone else I’ve talked to who has used Autopilot. This video from a British testing group shows exactly why that is a problem, and why we’ve seen the kind of Autopilot crashes that have been blowing up in the news. The test is pretty simple: A Tesla on Autopilot follows another car ahead of it. The car in front moves out of the way. There’s a stopped car in front of it. The Tesla hits the stopped car. Thatcham Research took the BBC along for one such demonstration, using an inflatable dummy car as a stand-in for an actual stopped or disabled car. That’s why both made it out unharmed when the Tesla absolutely plowed . . .

This reminds me of the Saturday morning cartoons where the character holds a red bullfighter's cape and eggs the 'bad guy' on to attack. When the cape is pulled away, there is an anvil or other immovable object that the 'bad guy' runs into.

GIF URL: https://media.giphy.com/media/9VnK2SUebgetTc9X7B/giphy.gif

It is almost as bad as trying to paste the URL of a Jalopnik article: a simple cut-and-paste fills the posting with a huge amount of HTML text.

Bob Wilson
Yes, the test is true. The issue is not with Autopilot; the issue is with Adaptive Cruise Control. The same thing would happen in my Prius if I had ACC engaged. The problem is not with the car, of whatever brand. The problem is with an inattentive driver who does not understand how ACC works. If a driver does not know how to use a feature properly, then they should not use that feature. The people who conducted this test should have conducted it with several different makes of automobiles with ACC engaged. Since they didn't, their test has no validity. It would be interesting to know who funded this test - a competing automaker, the oil industry, Faux Gnus?
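For what it's worth, the usual technical explanation for the cut-away scenario is that radar-based follow systems filter out returns that are not moving, so overhead signs, bridges, and parked cars don't trigger phantom braking. The sketch below is purely illustrative - the function names, threshold, and speeds are made up, not anything from Tesla or Toyota - but it shows how that filtering leaves the system with nothing to follow once the lead car changes lanes.

```python
# Hypothetical sketch (not any manufacturer's actual code) of why a
# radar-based ACC can ignore a stopped car once the lead vehicle cuts away.
# Many radar trackers discard near-zero-speed returns to avoid false
# braking on signs, bridges, and parked cars.

EGO_SPEED = 30.0        # own speed, m/s (~67 mph)
MIN_TARGET_SPEED = 2.0  # returns with lower absolute speed are treated as clutter

def select_follow_target(radar_returns):
    """radar_returns: list of (range_m, relative_speed_mps) tuples.
    Returns the closest *moving* in-path target, or None."""
    moving = [(rng, rel) for rng, rel in radar_returns
              if abs(EGO_SPEED + rel) > MIN_TARGET_SPEED]
    return min(moving, default=None)

# Lead car 40 m ahead at our speed; stopped car 80 m ahead.
with_lead = [(40.0, 0.0), (80.0, -EGO_SPEED)]
print(select_follow_target(with_lead))       # (40.0, 0.0) -> follows the lead car

# Lead car changes lanes: only the stopped car remains, and it is filtered out.
after_cut_away = [(80.0, -EGO_SPEED)]
print(select_follow_target(after_cut_away))  # None -> ACC simply keeps the set speed
```

Once no target is returned, a plain ACC just resumes its set speed; only a separate AEB function would have a chance of reacting to the stopped car.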
That is a pejorative statement. There have been no head-to-head tests comparing the Tesla system and the Cadillac system. Your statement of "far superior" is biased and not based on objective facts. In fact, the Cadillac could not even compete in the referenced test because the Cadillac system only works on a limited number of programmed highways.
Actually, Tesla tested the "eye contact" driver attention monitoring and they found that "hands on the steering wheel" was a more proactive form of monitoring.
Maybe if we tested new drivers and required them to take high-performance driving training, instead of handing out driver's licenses to anyone who is breathing, we would have safer roads without the need for all this technology that is still being beta-tested on the public.
YouTube shows 2 or 3 comparison videos, including the one you failed to comment on even though you are aware of its existence. I'm not trying to start anything. I'm aware of your stance.
You failed to post the rest of my comment, where I said "based on objective facts." Yes, I am fully aware of some YouTube comparisons done by YouTube bandits that are based on subjective impressions. They did have some nice things to say about Tesla's system but failed to acknowledge Cadillac's glaring deficiencies. These unscientific comparisons were done by "ICE guys" with a bias for ICE vehicles.

I would love to see a scientific test done by research scientists comparing Tesla's TACC with Toyota's ACC (or Cadillac's, or Nissan's, or any other carmaker's). That would be the best way to put this debate to rest. I have experience with both the Tesla and Prius systems, and my unscientific impression is that both systems will behave exactly the same in the situation referenced in the video above.

There is a lot of FUD (fear, uncertainty, doubt) being spread about Tesla in this sub-forum, and I will not let it go unchallenged. Certainly, there are a lot of ways that Tesla can be improved. Spreading false or misleading information (like the video above), or biased assumptions, is not the way to do that.

A few Tesla drivers have been treating Tesla's Autopilot as an FSD (Full Self Driving) feature. They have been in serious accidents, and a few have lost their lives. Autopilot is NOT FSD. It is an advanced version of ACC and LKA (Lane Keep Assist), which is an AID to a fully engaged driver. National research and statistics show that Teslas with Autopilot are much safer than other vehicles, and much safer than a Toyota Prius. We need to be honest and fair here.
Which is fine. We need people like you to separate what's true from what's FUD (again, drawing comparisons to the early days of the Prius... you're dealing with the same things we did back then). I'm in agreement with your last paragraph. I'm in agreement with your 2nd post: PCS wouldn't have worked either and would've crashed into the stationary test car too. However, there haven't been many scientific, objective comparison tests. Most experiments conducted so far have been to test the effectiveness of collision avoidance systems and whether they can decrease the number of collisions. I haven't seen an objective test comparing systems between manufacturers other than the IIHS's "Basic", "Advanced", and "Superior" ratings for forward collision warning systems.
It might be, until people figure out how to hack the system. Like by hanging weights off the steering wheel to trigger the 'hands on' sensors.
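A rough way to see why that works: a torque-based check only knows that something is turning the wheel against the column, not whether it is a hand. The threshold and logic below are invented purely for illustration; real systems add signal filtering, escalating nags, and strike-outs.

```python
# Hypothetical illustration of why a hanging weight can spoof a
# torque-based "hands on wheel" check. Numbers are made up.

TORQUE_THRESHOLD_NM = 0.3  # minimum steering-column torque counted as "hands on"

def hands_detected(torque_samples_nm):
    """True if any sample in the monitoring window exceeds the threshold."""
    return any(abs(t) > TORQUE_THRESHOLD_NM for t in torque_samples_nm)

print(hands_detected([0.0, 0.02, 0.01]))  # False -> driver gets a "hands on wheel" nag
print(hands_detected([0.5, 0.5, 0.5]))    # True  -> but a hanging weight produces
                                          #          the same constant torque as a hand
```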
I agree with all of that. But why on earth would they name the system "autopilot" unless they wanted people to think it was fully capable? Even if you give them a mulligan there, why haven't they dropped everything to rush out and expunge the term "autopilot" from their dashboards and marketing materials? That sort of slippery "it is but it isn't" is utterly corrosive to my perception of their product's capability and by extension my trust in them as well. I don't think I'm alone.
Even the term "autopilot" in aviation is used to relieve stress on the pilot; he is still supposed to monitor his aircraft and remains responsible.
Some people have been doing that, but you can't blame Tesla. Some people will figure out how to hack the Cadillac eye contact system, but you can't blame Cadillac. Stupid people shouldn't blame the car when they do stupid things.

Tesla uses the word "Autopilot" for the same reason that aircraft and boats use the word autopilot. I have an autopilot on my boat. It is used to get from point A to point B. If there is another vessel or dock between A and B, I will hit it unless I personally take manual corrective action. Autopilot, whether on a car, airplane, or boat, is an operator aid to relieve the fatigue of constant manual steering. It still requires an operator to be fully engaged to take evasive action if necessary. I am REQUIRED to maintain a constant watch when I put my boat into autopilot. It is NOT FSD. It is NOT collision avoidance in a car, airplane, or boat.

Why haven't the aeronautical and marine industries expunged the word "autopilot" from the lexicon? Because it is obvious what the word means. Tesla gives written instructions as well as visual and audio warnings on the capabilities of their "Autopilot" system (more than I get on my marine system). If drivers choose to ignore those warnings, you can't blame the car or Tesla.
We are in agreement here. We need a fully functioning collision avoidance system for every automobile on the road. Many car companies have put some research into the concept, but none of them are fully functional yet. Tesla's AEB system will reduce speed by 25 mph to lessen the severity of impact. Toyota's ACC reduces speed initially (by how much, I don't know, and I'm not willing to experiment) but will not prevent impact. Nissan warns that their collision avoidance system only works under 25 mph.

At this point, Tesla is leading the way toward a fully autonomous driving system which will include collision PREVENTION. While they are doing that, there is a lot of misinformation and false equivalency being spread, including on this thread. Tesla's system is not a beta system any more than Toyota's ACC is a beta system. Tesla's system is a work in progress, and when used as directed it will save lives. The IIHS has always given Tesla 5 out of 5 stars for collision safety. They give the 2018 Prius 4 out of 5 stars. That is an objective test.
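To put that 25 mph reduction in perspective: crash energy scales with the square of speed, so even an impact that still happens is a much milder one. The speeds in the snippet below are arbitrary examples, not figures from any test.

```python
# Back-of-the-envelope illustration of why "scrubbing off 25 mph" matters
# even when AEB cannot prevent the impact entirely.

def crash_energy_fraction(impact_mph, initial_mph):
    """Fraction of the original kinetic energy remaining at impact."""
    return (impact_mph / initial_mph) ** 2

initial = 60.0               # example approach speed, mph
after_aeb = initial - 25.0   # AEB bleeds off 25 mph before impact

print(f"{crash_energy_fraction(after_aeb, initial):.0%} of the original crash energy")
# -> 34%: roughly two-thirds of the crash energy is gone before the hit
```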
That's entirely reasonable, but then where's the training? If use of autopilot is viewed as an alternative means for a driver to operate a car, shouldn't those drivers be required to complete training? Maybe it would be appropriate for states to create a new license class or endorsement. To say it another way, I'll buy your argument that it's fine to leave the label in place if the operators of this system treat it with the same attentiveness as is required to use aviation and marine autopilot systems. But has Tesla done anything to reinforce or support this? A click-thru message on the stereo is meaningless at this point. A chapter in the manual or a pamphlet? Please. Positive identification of the driver's seat occupant coupled with classroom instruction and license endorsements sound more appropriate to me.
You need training to keep your eyes on the road and hands on the wheel? You get that in driver's training when you get a license to drive a car. Again, you've got it all wrong! It is NOT an alternative way to drive a car, and it is NOT an alternative to being a fully engaged driver. There is no alternative. You are trying to make this into FSD. IT IS NOT FSD.

I received no training on how to use the autopilot on my boat; I just read the very thorough manual. Some people choose not to read the manual and they crash their boat. You can't blame the autopilot or the boat; the operator is fully responsible for the results.

Tesla is taking it one step further: there won't be a need for anyone in the driver's seat, it will be FSD. They aren't there yet, and it may be years before they get there, but it will happen. In the meantime, they cannot fix stupid drivers, too stupid to read a manual or heed visual and audio warnings.
Perhaps this was brought up in the other Tesla thread, but the definition of "autopilot" is like that of "theory": the one held by the public is different from the one used by experts. In light of that, autopilot may not have been the best term to use considering the typical level of driver training in the US.

You say that as if the difficulty of hacking is the same for the two systems. In the case of Tesla's system, and the many others that use sensors on the steering wheel, it doesn't even need to be hacked: a person can just leave their hands on the wheel while watching a movie on their tablet. Yes, people shouldn't blame the car when they are at fault, but they will, and considering the level of driver training in the US, a driver doing something stupid is likely. The manufacturers need to do more than the minimum legal CYA to make drivers aware of these systems' limitations.

Aside from experimental aircraft, autopilot systems have to be certified by the FAA. Then the commercial plane companies offer training for their clients' pilots. I don't expect it to happen, but it is time to start having official testing for these various advanced driver aids.
It is clear that you have misunderstood me. I'm not trying to make it into anything. I'm a fan of the technology, and I'm genuinely curious about how this will further develop. You and I both know it isn't FSD today, but others are making fatal mistakes on this point. I agree with you that this isn't the carmaker's fault, but I wouldn't mind if they tried a bit harder to prevent it in the first place.