• Welcome to AlpineZone, the largest online community of skiers and snowboarders in the Northeast!

Will driverless cars help remote resorts?

Glenn

Active member
Joined
Oct 1, 2008
Messages
7,691
Points
38
Location
CT & VT
Vehicles go through a much more stringent software testing process than your computer or phone. If an app doesn't open or crashes, it's annoying. If a safety feature isn't working correctly on a vehicle, it can have serious consequences.
 

kbroderick

Active member
Joined
Dec 1, 2005
Messages
708
Points
43
Location
Maine

Interesting that the article changed and the Tesla response disappeared.

I'd be interested in seeing further details—the data logging from the car should indicate what inputs the 'driver' attempted to provide (e.g. steering wheel/pedals) while the car was in Autopilot mode. Sometimes the simplest answer is correct, e.g. with the Audi unintended acceleration cases, where the #1 issue was user error and pedal confusion.
 

cdskier

Well-known member
Joined
Mar 26, 2015
Messages
6,416
Points
113
Location
NJ
> Interesting that the article changed and the Tesla response disappeared.
>
> I'd be interested in seeing further details—the data logging from the car should indicate what inputs the 'driver' attempted to provide (e.g. steering wheel/pedals) while the car was in Autopilot mode. Sometimes the simplest answer is correct, e.g. with the Audi unintended acceleration cases, where the #1 issue was user error and pedal confusion.

The Tesla statement is still in the nj.com article:
https://www.nj.com/middlesex/2019/0...f-nj-highway-crashes-into-signs-cops-say.html
 

Glenn

Active member
Joined
Oct 1, 2008
Messages
7,691
Points
38
Location
CT & VT
> Interesting that the article changed and the Tesla response disappeared.
>
> I'd be interested in seeing further details—the data logging from the car should indicate what inputs the 'driver' attempted to provide (e.g. steering wheel/pedals) while the car was in Autopilot mode. Sometimes the simplest answer is correct, e.g. with the Audi unintended acceleration cases, where the #1 issue was user error and pedal confusion.

And some rigging from 60 Minutes...
 

JimG.

Moderator
Staff member
Moderator
Joined
Oct 29, 2004
Messages
11,989
Points
113
Location
Hopewell Jct., NY
Hey can anyone say 737 MAX 8?

Turns out it's now been proven that computer software is very efficient when it comes to killing large numbers of humans.

Good luck to those who want to put their lives in the hands of a machine. Guess you forgot that machines are made by humans and as such are subject to abject failure.
 

Not Sure

Well-known member
Joined
Dec 14, 2013
Messages
2,858
Points
63
Location
Lehigh County Pa.
Website
www.youtube.com
> Hey can anyone say 737 MAX 8?
>
> Turns out it's now been proven that computer software is very efficient when it comes to killing large numbers of humans.
>
> Good luck to those who want to put their lives in the hands of a machine. Guess you forgot that machines are made by humans and as such are subject to abject failure.

I knew someone would bring back this thread...

The person who invented the wheel has helped countless people and killed a lot in the process. I think Orville and Wilbur would be horrified at what their invention has morphed into. Someone should be doing jail time on this one! They should have issued an AD (Airworthiness Directive) and admitted the fix was more complicated. Make damn sure the pilots are aware of the problem.
 

JimG.

Moderator
Staff member
Moderator
Joined
Oct 29, 2004
Messages
11,989
Points
113
Location
Hopewell Jct., NY
> I knew someone would bring back this thread...
>
> The person who invented the wheel has helped countless people and killed a lot in the process. I think Orville and Wilbur would be horrified at what their invention has morphed into. Someone should be doing jail time on this one! They should have issued an AD (Airworthiness Directive) and admitted the fix was more complicated. Make damn sure the pilots are aware of the problem.

I deliberately waited until the preliminary investigation into the Ethiopian crash was completed.

And I think that all the facts surrounding Boeing's changes to the 737, which were made with nothing but profits in mind, will come out... Turns out that Ralph Nader's niece was among the passengers killed on the Ethiopian flight. Unsafe at any speed!

I'm sure glad I don't own any Boeing stock.
 

AdironRider

Well-known member
Joined
Nov 27, 2005
Messages
3,487
Points
63
> I deliberately waited until the preliminary investigation into the Ethiopian crash was completed.
>
> And I think that all the facts surrounding Boeing's changes to the 737, which were made with nothing but profits in mind, will come out... Turns out that Ralph Nader's niece was among the passengers killed on the Ethiopian flight. Unsafe at any speed!
>
> I'm sure glad I don't own any Boeing stock.


I don't know, man, this seems like a damned-if-you-do, damned-if-you-don't type of scenario. Just 10 years ago, Air France flight 447 crashed into the Atlantic solely because the automated systems shut off and the pilots couldn't identify a clear stall.

http://www.slate.com/blogs/the_eye/...fety_paradox_of_airline_automation_on_99.html

https://www.vanityfair.com/news/business/2014/10/air-france-flight-447-crash

Both these articles seem to blame the automation, but the pilot error is absurd. A stall and stall recovery is literally day 1, lesson 1 in flight school. You can't just rely on humans all the time either.

That Air France flight was on an Airbus by the way, so it isn't unique to Boeing.

Also, I'm probably never flying Air France.

https://en.wikipedia.org/wiki/Air_France_accidents_and_incidents
 

JimG.

Moderator
Staff member
Moderator
Joined
Oct 29, 2004
Messages
11,989
Points
113
Location
Hopewell Jct., NY
> I don't know, man, this seems like a damned-if-you-do, damned-if-you-don't type of scenario. Just 10 years ago, Air France flight 447 crashed into the Atlantic solely because the automated systems shut off and the pilots couldn't identify a clear stall.
>
> http://www.slate.com/blogs/the_eye/...fety_paradox_of_airline_automation_on_99.html

That sounds more like poor training to me. The autopilot shut off, the pilots had no idea what to do to save the airplane, and that type of situation will only get worse as more automation makes pilots less needed. Until they are needed, and by then the die is cast. Just another example of the same poor outcome occurring because of an overdependence on technology.

I believe that poor training is also a large part of the recent crashes. I'm hearing that the training consisted of about an hour on a laptop! No flight simulators because Boeing considered that too expensive. Shall I continue?

You make a good point but to me it's just another reason to never set foot on a commercial airplane which is another topic altogether.
 

AdironRider

Well-known member
Joined
Nov 27, 2005
Messages
3,487
Points
63
Haha yeah I get that fear. I have my private and it gave me just enough knowledge to be scared up there. Ignorance is bliss sometimes.

It is most definitely a lack of training, but not being able to identify a stall is more concerning to me than technological dependence. That is pretty basic stuff that most commercial pilots should know forward and backward.

However, it does illustrate that computers are only as good as the humans who program them, and humans are not infallible. In the Boeing scenario the computer couldn't be overridden easily; in the Air France scenario the pilots actively overruled what the computer was (correctly) telling them. The latter is a big concern to me, as I fear it will only become worse after the MAX 8 stuff.
 

JimG.

Moderator
Staff member
Moderator
Joined
Oct 29, 2004
Messages
11,989
Points
113
Location
Hopewell Jct., NY
I guess we all have to accept the fact that nothing will ever be perfect.

I would still rather put my faith in a well-trained human than in a machine. Just ask the passengers on the flight that Chesley Sullenberger emergency-landed on the Hudson River back in 2009. Zero fatalities.
 

Not Sure

Well-known member
Joined
Dec 14, 2013
Messages
2,858
Points
63
Location
Lehigh County Pa.
Website
www.youtube.com
> I don't know, man, this seems like a damned-if-you-do, damned-if-you-don't type of scenario. Just 10 years ago, Air France flight 447 crashed into the Atlantic solely because the automated systems shut off and the pilots couldn't identify a clear stall.
>
> http://www.slate.com/blogs/the_eye/...fety_paradox_of_airline_automation_on_99.html
>
> https://www.vanityfair.com/news/business/2014/10/air-france-flight-447-crash
>
> Both these articles seem to blame the automation, but the pilot error is absurd. A stall and stall recovery is literally day 1, lesson 1 in flight school. You can't just rely on humans all the time either.
>
> That Air France flight was on an Airbus by the way, so it isn't unique to Boeing.
>
> Also, I'm probably never flying Air France.
>
> https://en.wikipedia.org/wiki/Air_France_accidents_and_incidents

Sometimes it's the air, sometimes the airplane. I had a conversation about 20 years ago with a friend who flew the Airbus. He was complaining about control input on the ailerons: he was on final approach and had a wing dip, and with full control input he needed more opposite aileron to level the wings. Apparently the system limits the amount of input to less than he needed for that situation.
 

mister moose

Well-known member
Joined
Oct 11, 2007
Messages
1,086
Points
48
> I don't know, man, this seems like a damned-if-you-do, damned-if-you-don't type of scenario. Just 10 years ago, Air France flight 447 crashed into the Atlantic solely because the automated systems shut off and the pilots couldn't identify a clear stall.
>
> http://www.slate.com/blogs/the_eye/...fety_paradox_of_airline_automation_on_99.html
>
> https://www.vanityfair.com/news/business/2014/10/air-france-flight-447-crash
>
> Both these articles seem to blame the automation, but the pilot error is absurd. A stall and stall recovery is literally day 1, lesson 1 in flight school. You can't just rely on humans all the time either.
>
> That Air France flight was on an Airbus by the way, so it isn't unique to Boeing.
>
> Also, I'm probably never flying Air France.
>
> https://en.wikipedia.org/wiki/Air_France_accidents_and_incidents

Not quite.

Both accidents are related to cockpit confusion about what the automated system was doing, coupled with sensor failure. Boeing took it further with a stronger mandatory stall response.

With Air France, the FO's low experience and almost zero real-world non-automated experience hindered his recognition of and recovery from the stall. There was no crew coordination. No one said "My airplane," which essentially means get your effing hands off the joystick. And as the Vanity Fair article points out if you read far enough, the inherent danger of non-replicated movements for the two joysticks (i.e. they could each provide differing inputs) was an accident waiting to happen. I fault the FO for 1) overcontrolling, a sure sign of low experience, 2) not discussing his control inputs, 3) not understanding the correct assertions of the non-flying FO, and 4) a lack of assimilation of data from other instruments. I didn't see it in the article, but those sensors are always heated, and I didn't see any discussion of whether the heat was turned on. Sorta important.

This isn't garden-variety "Now we're going to do a stall demonstration" stall recovery. It's in-the-weeds, confused-cockpit, defective-instrument, full-IFR stall recovery. Few of you understand the difference, but it's always amazing how many accidents result from ignoring the first rule of aviation: fly the damn airplane. Three thousand hours of autopilot operation does not get you the experience needed to hand-fly the damn airplane, and that was the problem on Air France. The FO simply couldn't fly a partial panel, and he couldn't fly without overcontrolling. In fact, there was no real problem at all until the FO pitched the nose up for no good reason, on top of all the other snowball factors.

Automation is great, but you still need solid basic aviation skills and a solid background in weather, icing, turbulence, manual instrument flying, instrument failure, engine failure, fire, the list goes on... because stuff breaks. Because some mechanic, refueler, baggage loader, or avionics technician didn't leave things quite right. In the recent 737 crash, the ability and training to turn off the defective autopilot and fly the damn airplane would have saved that flight.

Turns out there is such a thing as too much automation (together with a false sense of security and a woeful lack of needed experience), and it's a real shame so many had to die to make that point.
 

abc

Well-known member
Joined
Mar 2, 2008
Messages
5,811
Points
113
Location
Lower Hudson Valley
As a software engineer, my opinion has always been that computers should focus on gathering information and formulating a best guess, but that humans should always have the final decision.

Now, keep in mind that many humans are very poor decision makers even with perfect information. So allowing the computer to actually execute the “best guess” may lead to a better outcome on average. On that basis, I’m a firm believer in “computer as driver,” which I think will be a reality no matter how many people hate the idea.

Still, humans should have the final say, by overriding the decision proposed by the computer if they so choose. That option must be made available so the few humans who do make above-average decisions are allowed to shine over the mediocrity of the computerized decisions.

Every time I book a flight involving a regional operator, I cringe. Knowing how poorly trained SOME of those pilots are, I wish they’d make smart computers faster, so that pilotless planes become a reality. On the major airlines? I’d like the planes to still allow pilots to fly them the old-fashioned way. Most of those pilots are often better than the computer. And they are at their best when the computer helps without taking over.
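The split of labor described here (the computer gathers data and proposes; a human keeps the final say) can be sketched as a tiny control loop. Everything in this sketch is a hypothetical illustration (the `Proposal` type, the `decide` function, and the lane-keeping example), not any real vehicle or avionics API:

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Proposal:
    """The computer's best guess, plus how sure it is."""
    action: str
    confidence: float

def decide(sensor_data: dict,
           propose: Callable[[dict], Proposal],
           human_override: Optional[Callable[[Proposal], Optional[str]]] = None) -> str:
    """The computer formulates a best guess from the data;
    a human, if one is in the loop, gets the final decision."""
    proposal = propose(sensor_data)
    if human_override is not None:
        choice = human_override(proposal)  # returning None means "accept the computer's guess"
        if choice is not None:
            return choice
    return proposal.action

# Hypothetical lane-keeping guess that the driver may overrule:
autopilot = lambda data: Proposal("steer_left", 0.9)
print(decide({"lane_offset": -0.4}, autopilot))                     # computer alone: steer_left
print(decide({"lane_offset": -0.4}, autopilot, lambda p: "brake"))  # human overrides: brake
```

The design point is that the override hook is optional and advisory-first: the computer always produces a decision, and removing the human from the loop degrades to full automation rather than to no decision at all.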
 

Smellytele

Well-known member
Joined
Jan 30, 2006
Messages
9,917
Points
113
Location
Right where I want to be
> As a software engineer, my opinion has always been that computers should focus on gathering information and formulating a best guess, but that humans should always have the final decision.
>
> Now, keep in mind that many humans are very poor decision makers even with perfect information. So allowing the computer to actually execute the “best guess” may lead to a better outcome on average. On that basis, I’m a firm believer in “computer as driver,” which I think will be a reality no matter how many people hate the idea.
>
> Still, humans should have the final say, by overriding the decision proposed by the computer if they so choose. That option must be made available so the few humans who do make above-average decisions are allowed to shine over the mediocrity of the computerized decisions.
>
> Every time I book a flight involving a regional operator, I cringe. Knowing how poorly trained SOME of those pilots are, I wish they’d make smart computers faster, so that pilotless planes become a reality. On the major airlines? I’d like the planes to still allow pilots to fly them the old-fashioned way. Most of those pilots are often better than the computer. And they are at their best when the computer helps without taking over.
How do the experienced pilots get their experience?
 

JimG.

Moderator
Staff member
Moderator
Joined
Oct 29, 2004
Messages
11,989
Points
113
Location
Hopewell Jct., NY
> As a software engineer, my opinion has always been that computers should focus on gathering information and formulating a best guess, but that humans should always have the final decision.
>
> Now, keep in mind that many humans are very poor decision makers even with perfect information. So allowing the computer to actually execute the “best guess” may lead to a better outcome on average. On that basis, I’m a firm believer in “computer as driver,” which I think will be a reality no matter how many people hate the idea.
>
> Still, humans should have the final say, by overriding the decision proposed by the computer if they so choose. That option must be made available so the few humans who do make above-average decisions are allowed to shine over the mediocrity of the computerized decisions.
>
> Every time I book a flight involving a regional operator, I cringe. Knowing how poorly trained SOME of those pilots are, I wish they’d make smart computers faster, so that pilotless planes become a reality. On the major airlines? I’d like the planes to still allow pilots to fly them the old-fashioned way. Most of those pilots are often better than the computer. And they are at their best when the computer helps without taking over.

To go back to the example of driving instead of flying, "better outcome on average" doesn't work for me personally, but you agree that drivers like me should be able to override computer automation. I've been driving 45 years with zero accidents, mostly because I'm attentive and have been able to take corrective action when needed. So to me it would be a big step down in outcomes, and that's usually the case when you dumb things down to account for the below-average participants. I don't want to be treated that way. Besides, I also enjoy driving, so why should I give that enjoyment up because other drivers don't like driving or suck at it?

I do agree with what you have written in general.
 

JimG.

Moderator
Staff member
Moderator
Joined
Oct 29, 2004
Messages
11,989
Points
113
Location
Hopewell Jct., NY
> Not quite.
>
> Both accidents are related to cockpit confusion about what the automated system was doing, coupled with sensor failure. Boeing took it further with a stronger mandatory stall response.
>
> With Air France, the FO's low experience and almost zero real-world non-automated experience hindered his recognition of and recovery from the stall. There was no crew coordination. No one said "My airplane," which essentially means get your effing hands off the joystick. And as the Vanity Fair article points out if you read far enough, the inherent danger of non-replicated movements for the two joysticks (i.e. they could each provide differing inputs) was an accident waiting to happen. I fault the FO for 1) overcontrolling, a sure sign of low experience, 2) not discussing his control inputs, 3) not understanding the correct assertions of the non-flying FO, and 4) a lack of assimilation of data from other instruments. I didn't see it in the article, but those sensors are always heated, and I didn't see any discussion of whether the heat was turned on. Sorta important.
>
> This isn't garden-variety "Now we're going to do a stall demonstration" stall recovery. It's in-the-weeds, confused-cockpit, defective-instrument, full-IFR stall recovery. Few of you understand the difference, but it's always amazing how many accidents result from ignoring the first rule of aviation: fly the damn airplane. Three thousand hours of autopilot operation does not get you the experience needed to hand-fly the damn airplane, and that was the problem on Air France. The FO simply couldn't fly a partial panel, and he couldn't fly without overcontrolling. In fact, there was no real problem at all until the FO pitched the nose up for no good reason, on top of all the other snowball factors.
>
> Automation is great, but you still need solid basic aviation skills and a solid background in weather, icing, turbulence, manual instrument flying, instrument failure, engine failure, fire, the list goes on... because stuff breaks. Because some mechanic, refueler, baggage loader, or avionics technician didn't leave things quite right. In the recent 737 crash, the ability and training to turn off the defective autopilot and fly the damn airplane would have saved that flight.
>
> Turns out there is such a thing as too much automation (together with a false sense of security and a woeful lack of needed experience), and it's a real shame so many had to die to make that point.

This is a great post. And it is a terrible shame.
 

abc

Well-known member
Joined
Mar 2, 2008
Messages
5,811
Points
113
Location
Lower Hudson Valley
> How do the experienced pilots get their experience?
In the “old days,” which is now but quickly passing, a lot of the commercial pilots came from the Air Force, where they actually flew planes. But going forward, I don’t know.

By the way, I’ve seen this in the computer industry itself too. I grew up writing programs that interact with the computer itself. But today’s crop of software engineers often write programs that interact with some intermediate layer (the JVM, or worse, WebSphere, for example). Both kinds of programs work as intended when everything is running fine. But when the shit hits the fan, most of those who only interact with the intermediate layers don’t have a clue how to deal with a system-wide meltdown!
 