Some good info mixed with naive misconceptions. We should put the moral and ethical issues to the side in all these discussions. The number of times these situations come into play is minuscule compared to the number of deaths due to DUI and distracted drivers. Mike
Yeah...let’s move ahead and worry about the ethical issues later...says everybody, before the law of unintended consequences kicks in. I don’t agree with KB about his stance on government either, but I figured that since he approaches many issues from a ‘slightly’ left-of-center point of view, it might be more palatable for this audience - AND I gain insight into some of the technical hurdles of the AI involved in getting us from here to some far-off dystopian future where humans will not be allowed to travel freely without a HAL (human autonomous legal) equipped car. Remember, he’s the “Big dot.gov” guy and I’m more of a libertarian, but even I see that L5 is not going to be fully implementable without somebody setting up some guardrails...pun almost unintended.

I always used to use the school bus full of kids on a winding road in my illustrations of the complexities of autonomous coin flipping, and everybody always forgets Einstein’s law of intellectual infinity when they’re tinkering with AI. One is as easy to underestimate as the other, but either way it’s fairly clear that L5 automation isn’t juuust around the corner. ...and we haven’t even covered OTA subscript...er...(*) I mean “updates”...
Your argument, whatever it is, was not well presented and was difficult to follow. FSD (Full Self Driving) development is a journey, not a destination. It will NOT spring forth in full bloom when initially released. Anyone who expects it to be fully developed upon release should not purchase it, should not use it, and should not even give it serious consideration. I am willing to be an early adopter and BETA tester - for better or worse. Like Tesla's venture into production BEVs that is disrupting an industry, this is the FUTURE.
We can legislate standards for safety, fail-safe mechanisms, algorithms, etc. But to freeze development, as I've heard some people call for, when there are 100 deaths per day and many times that many serious injuries per day is a bit absurd. We just take it for granted that they are "accidents" when roughly 90% are caused by human error, distracted drivers, drunk drivers, etc. The "ethical" issues that people are worried about (do I run over the guy with the helmet in the other lane or the guy without the helmet in my lane) are so rare that we should, in the future, be happy that we have already solved most of the easy 90% of deaths and can then work on the last 10% that may involve ethical issues or equipment failures.

To be clear...the technology is nowhere near good enough yet, IMO. It needs lots more work, lots more testing, lots more miles with safety drivers, lots more geo-limited and weather-restricted testing before it is ready. If there were another killer of so many people, so many teenagers and kids, we would somehow be outraged. Mike
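A quick back-of-envelope check of the figures in this thread (the per-day and per-year numbers are the posters' round figures, not official statistics):

```python
# Sanity check: do "100 deaths per day" and "~40,000 per year" agree?
deaths_per_day = 100          # round figure cited above
human_error_share = 0.90      # share attributed to human error, per the post

deaths_per_year = deaths_per_day * 365
preventable_per_year = deaths_per_year * human_error_share

print(deaths_per_year)              # 36500 -- close to the ~40,000/year cited later
print(round(preventable_per_year))  # 32850 -- the "easy 90%" in annual terms
```

So the two figures quoted in different posts are roughly consistent with each other.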
Absolutely concur! All people like me are advocating is that we think a little before going from L4 to L5...and there WILL be bumps along the way. If we had halted passenger jet development when de Havilland Comets started to kill people, that probably would not have been a good thing in the long run, and L3-L4 tech will save many lives.
We disagree: actually I want L1-999 to get into cars as soon as possible. I've already got 50+ years of driving with 'L-nothing' and it s*cks. Real life is the ultimate lab, and the ONLY way any technology works is in the real world. And I already use:
- dynamic cruise control - the automated accelerator
- automatic collision detection and braking - the don't-run-into-the-idiot nanny
- lane keeping - the don't-run-off-the-road-to-die-hitting-a-tree-or-ditch nanny

If you want to play a 'game', fine, there are simulators galore; enjoy yourself. But I want and have already paid for TSS-P level assistance, and I will pay for better as soon as I can afford it. Your death and injury in the absence of any L-n is sad, but in my case it would be tragic. Just get out of my way. Bob Wilson
When I said that the technology is not "good enough" I meant for L4 or L5 autonomous driving. As an "aid" to human drivers we should expand it greatly, now, IMO. Mike
Keep the ordinary driver controls and legal responsibility. However, get the technology in the car. Bob Wilson
I agree that holding back this technology is the real ethical dilemma. Can you imagine how much money this country would put into defense if it knew it could prevent the loss of 40,000 American lives (not to mention the tens of thousands of others forever changed by injury) every year? My guess is the legislation would be pushed through unanimously, with billions and billions being spent, and very quickly. Yet we have people who are pushing back on pursuing autonomy. I just don't get it. I agree with Mike that we aren't quite there yet, but we are pretty close. We should be seeing billions in grant money from the government to make this happen sooner, but that would require a government that has a clue.
That is why the potential for computerized driving is so compelling. 90% of the time, the BETA version of Autopilot in my Tesla drives better than I do.
Humans already do an excellent job of driving. Just not every one of them, and not all the time. Mike