Four interesting things I ran across that all somehow tie into this thread....

Carbon capture getting cheaper

Less oil in the tank than we thought, says Saudi Arabia

Things just got worse for Boeing- investigators announced that the Ethiopian pilots did exactly what their manuals and checklists said to do, all the way down.

Things got worse still for Boeing- one of Ralph Nader's relatives was on board the Ethiopian jet.
I found a link to more details. I won't be surprised to see Unsafe at Any Automation Level appear on bookshelves soon. I've been wondering about the implications of Boeing's problem in relation to car autopilots. What happens to Uber stock if somebody mandates a 20-year moratorium on automation for vehicles not constrained by trackage? Hmmm...
When the alternative is 40,000+ U.S. deaths per year from non-automated driving, I don't think much of anything is going to stop or (seriously) delay the rollout of self-driving cars from the large commercial outfits, though certification issues could very well cause shorter delays. OTOH, the small-scale aftermarket / DIY / hacker / hobbyist market, represented under several threads here in PriusChat, could face very serious implications.
Both valid points. Your first point has triggered the following:

Let's say things move forward and one day in the future we see 30 or 40 different auto-driving systems, each with millions of examples out on the roads- multiple manufacturers etc. Seems reasonable, right?

Now, let's say one of them causes a fatality (sticking with the singular case for now) and it's the sort of wreck where, had a human been driving, the human would lose their driving privileges. Does this mean that we should revoke the driving privileges of every instance of the autopilot running identical code? And if not, why?

It strikes me that there's a case to make sure each instance is a little bit different, programmatically speaking. Yes, that's a horrific prospect for traditional software development, but possibly just another place where AI can prove its worth. Still so much to work out with this stuff...
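Just to make the "each instance a little bit different" idea concrete, here's a minimal sketch, with every name, parameter, and bound made up for illustration and no connection to any real vendor's stack. It isn't AI, just deterministic per-vehicle parameter jitter, but it shows the basic notion: derive a unique seed from each car's VIN and nudge a few tunable controller values within safe limits, so no two cars run byte-identical behavior.

```python
# Hypothetical sketch only: per-vehicle parameter jitter so no two
# self-driving instances behave identically.  All names and ranges
# are invented for illustration.
import hashlib
import random

# Nominal controller parameters and the safe range each may vary within.
NOMINAL = {"follow_gap_s": 2.0, "max_decel_mps2": 3.5, "lane_margin_m": 0.30}
SAFE_RANGE = {
    "follow_gap_s": (1.8, 2.4),
    "max_decel_mps2": (3.2, 3.8),
    "lane_margin_m": (0.25, 0.40),
}

def instance_params(vin: str) -> dict:
    """Derive a unique but reproducible parameter set from a vehicle's VIN."""
    seed = int(hashlib.sha256(vin.encode()).hexdigest(), 16)
    rng = random.Random(seed)
    params = {}
    for name, nominal in NOMINAL.items():
        lo, hi = SAFE_RANGE[name]
        # Jitter up to +/-5% around the nominal value, clamped to the safe range.
        jitter = rng.uniform(-0.05, 0.05) * nominal
        params[name] = min(hi, max(lo, nominal + jitter))
    return params

if __name__ == "__main__":
    print(instance_params("1HGCM82633A004352"))  # one (hypothetical) VIN
    print(instance_params("5YJSA1E26HF000337"))  # a different VIN -> different tuning
```

The point isn't that this by itself makes anything safer; it's that a fault would no longer be provably identical across the whole fleet, which is exactly what the "revoke every instance" question hinges on.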
I'm going to say no, at least initially. We don't hold drivers and automobiles to anything resembling commercial air safety standards. We can produce a net savings of human lives as soon as self-driving systems are significantly safer than typical human drivers, and we shouldn't delay those savings by holding self-driving cars to aviation standards from the outset. Later, as the technology and results improve, maybe. But setting the standard too high at the very start will cost human lives by keeping error-prone humans behind the wheel needlessly long. BTW, hasn't Uber already had such a fatality, the pedestrian walking a bicycle in Arizona? They had to halt testing during several months of review.
1st - a human typically loses their license over a fatality only when DUI or reckless driving is involved. Those scenarios are volitional, not just negligent.
2nd - identical code does NOT an identical death scenario make. You would need the same sun azimuth, same amount of clouds or lack thereof, or same night conditions, same weather conditions, same type of car, same tires, same wear on those tires, same condition of the car, same type of vehicle or pedestrian being hit, same location of the fatality ...... on & on & on .... Then, if that 1-in-a-trillion scenario ever DID happen, the "corrective software" would have to cause another similar 1-in-a-trillion scenario. The reckless, or drunk, or meth tweaker .... they just need to kill someone.
A new non-governmental organization to assemble and synthesize 'actionable' information about US climate change: the Science for Climate Action Network (SCAN). They are in an early developmental phase, and they are careful not to describe the origin of this group in great detail. The group was initiated by the US govt. in 2015 and disbanded in 2017; I'll let you guess why, because you have only a small chance of guessing wrong. If SCAN comes up with particularly notable recommendations, you are likely to hear about them elsewhere.
I get what you're saying, but somehow it doesn't feel right not to hold drivers (human and otherwise) to the same safety standard. After all, despite recent events, airplanes seem to be killing about 40,000 fewer people per year than cars. Granted, we have a lot more cars than planes, and the ratio is very skewed by trip expense and availability. I suspect no real equivalency is possible.
If we held drivers and cars to the same standards, cars would become very expensive, roads would require very costly improvements, and most drivers would lose their licenses. The loss of easy access to mechanized transportation would cause massive economic damage. Besides, most people are quite acclimated to, and desensitized to, the dangers of motor vehicles. Similarly for alcohol. These deaths just don't seem nearly as horrible and scary as deaths from certain other causes, such as airplane crashes. Or firearms. Meanwhile, so many folks just keep increasing their use of those bright little handheld screen thingies while driving. In short, lives are not similarly valued, and different risks are not mentally scored by the same yardstick. But continuing discussion on this track will lead to banishment to FHoPol.
What are the precise scenarios in which a person loses driving privileges after a crash? Cases that involve DUI or a medical cause wouldn't apply to a non-biological system, and computer viruses and other malicious software are the work of outside agents. Not having insurance isn't within the purview of an automated system, and it isn't a cause of crashes. Reckless driving covers a broad range of offenses, but does a human driver typically lose their license immediately after a crash, or is it a process that has to go through the courts? In aviation, the fault needs to be grievous in order to ground a fleet after just one incident. Usually, one isn't enough to determine that a hardware or software design defect was the cause.
Boeing's mea culpa included mention of a second software 'issue' that was identified and will take additional time to address. Frankly speculating, from having read the ASRS reports, it could be about the autothrottle implementation.
It's a ridiculous national embarrassment is what it is. I can see COMAC making a play to sell Ethiopian a whole new fleet of C919s. Sure, those don't have a proven record, but it's possible that COMAC has a culture of safety, whereas it is now pretty well proven that while Boeing has one, it is subordinate to shareholder dividends. A deal like that would put a number of 737NGs up on the market, maybe enough for Southwest to extend their time in that series for a while. I fly a lot for work and put great faith in the machines and people that get me around. I use multiple airlines and I pay attention to what equipment is to be used for a given trip- sometimes that helps me pick seats or out-the-window views I want. The freedom to buy tickets here or there isn't a lot of influence, but it's something. I don't care if they fix that airplane tomorrow- it's going to be a long time before I pay for a ticket on a 7Mx.
Lion Air was 2018 October 29. If the second (so similar) crash had come sooner, I confidently speculate that the MAXs would have been grounded sooner. In other words, no third one. Not in the previous phase. And that's where it all gets hazy. Boeing's preferred outcome includes:

At least one software patch.
All MAXs get the AoA disagree indicator on the display (previously an $80,000 option).
Additional pilot training (who pays for it?).
Return to air: don't know, let's say first half of 2019 (the items above are difficult to 'time').
No further order cancellations on MAXs (about 1% of orders cancelled so far).
A slap on the wrist for its 'improper role' in the certification process.
Not being bled out by 'wrongful death' lawsuits.

Worse outcomes for Boeing have much larger downsides. Returning to the first point, if there is a 3rd crash after the fixits ... well ... it's gonna be at least the story of the decade.