Vehicles go through a much more stringent software testing process than your computer or phone. If your app doesn't open or crashes, it's annoying. If a safety feature isn't working correctly on a vehicle, it can have serious consequences.
Sorry I can't do that Dave.
Interesting that the article changed and the Tesla response disappeared.
I'd be interested in seeing further details—the data logging from the car should indicate what inputs the 'driver' attempted to provide (e.g. steering wheel / pedal) while the car was in autopilot mode. Sometimes the simplest answer is correct, e.g. with the Audi unintended acceleration cases, where the #1 issue was user error and pedal confusion.
Hey can anyone say 737 MAX 8?
Turns out it's now been proven that computer software is very efficient when it comes to killing large numbers of humans.
Good luck to those who want to put their lives in the hands of a machine. Guess you forgot that machines are made by humans and as such are subject to abject failure.
I knew someone would bring back this thread ....
The person who invented the wheel has helped countless people and killed a lot in the process. I think Orville and Wilbur would be horrified at what their invention has morphed into. Someone should be doing jail time on this one! They should have issued an AD and admitted the fix was more complicated. Make damn sure the pilots are aware of the problem.
I deliberately waited until the preliminary investigation into the Ethiopian crash was completed.
And I think that all the facts surrounding Boeing's changes to the 737, which were made with nothing but profits in mind, will come out... turns out that Ralph Nader's niece was among the passengers killed on the Ethiopian flight. Unsafe at any speed!
I'm sure glad I don't own any Boeing stock.
I don't know man, this seems like a damned if you do, damned if you don't type scenario. Just 10 years ago Air France flight 447 crashed into the Atlantic solely because the automated systems shut off and the pilots couldn't identify a clear stall.
http://www.slate.com/blogs/the_eye/...fety_paradox_of_airline_automation_on_99.html
https://www.vanityfair.com/news/business/2014/10/air-france-flight-447-crash
Both these articles seem to blame the automation, but the pilot error here is absurd. Stall recognition and recovery is literally day 1, lesson 1 in flight school. You can't just rely on humans all the time either.
That Air France flight was on an Airbus by the way, so it isn't unique to Boeing.
Also, I'm probably never flying Air France.
https://en.wikipedia.org/wiki/Air_France_accidents_and_incidents
As a software engineer, my opinion has always been that computers should focus on gathering information and formulating a best guess, but humans should always have the final decision.
Now keep in mind many humans are very poor decision makers even with perfect information. So allowing the computer to actually execute the "best guess" may lead to better outcomes on average. On that basis, I'm a firm believer in "computer as driver", which I think will become reality no matter how many people hate the idea.
Still, humans should have the final say, by overriding the decision proposed by the computer if they so choose. That option must be made available so the few humans who do make above-average decisions are allowed to shine over the mediocrity of the computerized decisions.
Every time I book a flight involving a regional operator, I cringe. Knowing how poorly trained SOME of those pilots are, I wish they'd make smart computers faster, so that pilotless planes become a reality. On the major airlines? I'd like the planes to still allow pilots to fly them the old-fashioned way. Most of those pilots are often better than the computer, and they are at their best when the computer helps without taking over.
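The "computer proposes, human can override" idea above can be sketched in a few lines. This is a toy illustration only, not any real avionics or vehicle API; all names (`computer_best_guess`, `decide`, `human_override`) are made up for the example.

```python
def computer_best_guess(sensor_readings):
    """Computer's role: gather information and formulate a best guess.

    Toy logic: average the (possibly noisy) sensor readings.
    """
    return sum(sensor_readings) / len(sensor_readings)


def decide(sensor_readings, human_override=None):
    """Execute the computer's proposal unless the human overrides it."""
    proposal = computer_best_guess(sensor_readings)
    if human_override is not None:
        return human_override  # the human always has the final say
    return proposal            # otherwise the computer acts on its own guess


# Computer acts on its own guess:
print(decide([10.0, 12.0, 11.0]))                      # 11.0
# Human override wins:
print(decide([10.0, 12.0, 11.0], human_override=5.0))  # 5.0
```

The design choice is exactly the one argued for above: the computer both proposes and, by default, executes, but the override path always exists for the human who knows better.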
Not quite.
Both accidents are related to cockpit confusion on what the automated system was doing, coupled with sensor failure. Boeing took it further with a stronger mandatory stall response.
With Air France, the FO's low experience and almost zero real-world non-automated experience hindered his recognition of and recovery from the stall. There was no crew coordination. No one said "My airplane", which essentially means get your effing hands off the joystick. And as the Vanity Fair article points out if you read far enough, the inherent danger of non-replicated movements for the two joysticks, i.e. that they could each provide differing inputs, was an accident waiting to happen. I fault the FO for 1) over-controlling, a sure sign of low experience, 2) not discussing his control inputs, 3) not understanding the correct assertions of the non-flying FO, and 4) lack of assimilation of data through other instruments. I didn't see it in the article, but those sensors are always heated, and I didn't see any discussion of whether the heat was turned on. Sorta important.
This isn't garden-variety training "Now we're going to do a stall demonstration" stall recovery. It's in-the-weeds, confused-cockpit, defective-instrument, full-IFR stall recovery. Few of you understand the difference, but it's always amazing how many accidents result from ignoring the first rule of aviation: fly the damn airplane. Three thousand hours of autopilot operation does not get you the needed experience to hand-fly the damn airplane, and that was the problem on Air France. The FO simply couldn't fly a partial panel, and he couldn't fly without over-controlling. In fact, there was no real problem at all until the FO pitched the nose up for no good reason, in addition to all the other snowball factors.
Automation is great, but you still need solid basic aviation skills and a solid background in weather, icing, turbulence, manual instrument flying, instrument failure, engine failure, fire, the list goes on... because stuff breaks. Because some mechanic, refueler, baggage loader or avionics technician didn't leave things quite right. In the recent 737 crash, the ability and training to turn off the defective autopilot and fly the damn airplane would have saved that flight.
Turns out there is such a thing as too much automation (together with a false sense of security and a woeful lack of needed experience), and it's a real shame so many had to die to make that point.
In the "old days", which is now but quickly passing, a lot of the commercial pilots came from the air force, where they actually flew planes. But going forward, I don't know. How do the experienced pilots get their experience?