It's not like we're talking about some wild-ass rare event here. You have a plane that's out of control after you've been pulling back hard on the stick while an audible warning tells you exactly what the problem is, and two pilots (for the most part) basically went deer in the headlights. I accept that real fear can change how you react, but these guys are supposed to be experts at flying planes - even the most junior guy should be an expert at basic flying skills. Every pilot knows the catastrophic consequences of a stall, so you'd think this would be as ingrained as "BP drops, unleash the fluids" is to an anesthesiologist, or "see traffic slow ahead, take foot off accelerator" to a driver. When they were all confused about whether the plane was going up or down, did anyone think to look at what the altimeter was doing for an answer? This event is a huge embarrassment to Air France and to pilots in general. Airbus deserves a beating for designing a system that averages inputs from the control sticks. Jesus H.
I don't really agree - I think this was a wild-ass rare event.
The pilots were operating under the assumption that it was impossible to stall the aircraft via manual flight controls. The controls aren't physically connected to the control surfaces, and inputs to the controls are moderated by the computer. As aircraft get more complex, flight computers do more of the work and pilot inputs get more and more general and abstract. On the extreme end, some military aircraft can't fly at all without the computer doing its work. When learning to fly by instruments, pilots are specifically trained to trust the instruments and disregard their senses (one area where the aviation/anesthesia parallels really diverge, as we can always fall back on 'look at the patient').
The article said that not one of the Airbus A330s operated by US Airways has ever been in the "alternate law" mode that would permit pilot input to create a stall. I can understand how the pilot could get locked into one intervention, believing that his action couldn't possibly be causing the problem, because his most basic assumption (I can't stall this plane with manual input) was wrong.
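To make the normal-vs-alternate-law distinction concrete, here's a toy sketch (Python, with an invented angle-of-attack limit and made-up names - nothing like the actual certified flight code, just the general shape of the idea as I understand it):

    # Toy illustration only; not Airbus's real logic. The limit and the
    # sign convention (positive input = nose up) are invented for the sketch.
    ALPHA_MAX = 15.0  # hypothetical stall-protection angle of attack, degrees

    def commanded_pitch(stick_input, current_alpha, law):
        """Map a raw stick input to a pitch command under each control law."""
        if law == "normal":
            # Normal law: the computer refuses to let the pilot exceed the
            # protected angle of attack, no matter how hard he pulls back.
            if current_alpha >= ALPHA_MAX:
                return min(stick_input, 0.0)  # nose-up demand is discarded
            return stick_input
        # Alternate law: the protection is gone. Full back stick is passed
        # straight through, and a sustained pull can stall the aircraft.
        return stick_input

The point being that under normal law the clamp makes full back stick "safe," so a pilot can build years of habit trusting it; in alternate law the very same input goes straight to the surfaces.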
Should he have known the plane was in alternate law, understood the implications, and figured it out? Of course. All the pilots who ran the simulator after the accident did, but dealing with a vaguely expected crisis in simulation isn't the same thing as facing the real one.
I think the core of the error was tunnel vision and faulty assumptions about what 'should' be happening and what was possible, and there's absolutely room for similar types of human error to occur in our field.
If Airbus deserves a beating, I'm more inclined to question why the plane's software didn't automatically revert to "normal law" as soon as the pitot-static system de-iced and resumed normal function, presumably while they were plummeting through 10,000 or 20,000 feet and still had time to recover. It's also hard to understand why left-seat control inputs didn't produce some kind of tactile feedback at the right-seat controls.
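Both complaints boil down to a few lines of design choice. This is pure speculation about the shape of the logic - the class, the method names, and the averaging rule are all mine, based only on the descriptions in this thread:

    # Speculative sketch of the two behaviors being questioned above;
    # every name here is invented, not from any Airbus documentation.
    class SidestickSystem:
        def __init__(self):
            self.law = "normal"

        def on_airspeed_invalid(self):
            # Unreliable air data drops the system into alternate law.
            self.law = "alternate"

        def on_airspeed_valid(self):
            # Apparently the reversion is latched: once in alternate law,
            # restored air data does NOT put you back in normal law. The
            # non-latched design I'm asking about would simply do
            # self.law = "normal" here.
            pass

        def effective_input(self, left_stick, right_stick):
            # Per the description above, dual inputs are averaged rather
            # than linked mechanically, so one pilot pulling while the
            # other pushes partially cancels out, and neither stick moves
            # to show the conflict.
            return (left_stick + right_stick) / 2.0

A mechanical linkage (or back-driven sticks) would make a dual-input conflict obvious through feel; averaging hides it behind a quiet bit of arithmetic.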
I once made a medication swap error that is totally, completely unexplainable and inexcusable in retrospect. Ridiculous. The patient was slow to emerge, I thought I'd given too much narcotic, and I reached for the naloxone vial. Instead I picked up flumazenil. I remember clearly thinking, 'huh, they must've changed suppliers,' because the vial's size, shape, and color were different. I looked at the label. I drew up the flumazenil, diluted it to 10 mL, and over about 10 minutes gave 1 mL at a time, trying to gradually reverse the morphine or Dilaudid or whatever it was I'd given the patient too much of. I labeled the syringe Narcan. I gave half the syringe, puzzled that the patient was still asleep. I still don't really know how I screwed that up ...
I just saw what I expected to see when I picked up the vial and looked at the label. I think I was a CA2 at the time (not that anyone should ever make this mistake, ever). I try to remember that error when I'm doing my routine scans and dealing with unexpected events.