As I noted in my previous column, the topic of the human factor in industrial incidents has been covered extremely well in a book, The Human Factor: Revolutionizing the Way People Live with Technology, by Kim Vicente. Vicente is a professor of human factors engineering at the University of Toronto and a consultant to NASA, Microsoft, Nortel Networks and many other organizations. He spends his time in emergency rooms, airplane cockpits and nuclear power station control rooms -- as well as in kitchens, garages and bathrooms -- observing how people interact with technology.

He covers the wide range of activities where neglect of the human factor brings trouble, from minor irritations to massive preventable death tolls. Hospitals and healthcare are singled out for critical attention. At the other extreme, the aviation industry is shown to be exemplary in its drive to learn from experience and to make its database of cases available to all.

If you are in the field of industrial processing that this column addresses, you could not do better than to follow the work done by researchers into the human factor. I have no doubt that there are people alive today who do not know that they owe their lives to somebody who read this book and acted on it.

Among the many examples, Vicente details how the pilots of certain types of military aircraft, after landing, sometimes accidentally retracted the landing gear instead of the flaps. The two controls sat beside each other and were identical in appearance, and on the existing aircraft it was not possible to physically separate them. For the immediate fix, the landing gear control knob was modified to look and feel like a tire, and the flap control knob was modified to resemble a wedge, corresponding to the shape of a flap. Pilots stopped retracting the landing gear, and this so-called “pilot error” was eliminated.

At the Three Mile Island nuclear power station, the large number of awkwardly placed and uninformative gauges in the control room left even highly trained operators uncertain as to the status of the reactor. Vicente describes the eleven-step procedure required to check reactor safety using steam tables and great mental agility. These factors all contributed to the accident there.

Leo Beltracchi worked for the U.S. Nuclear Regulatory Commission, although his responsibilities did not include designing control rooms. Nevertheless, he came up with a display of the entropy-temperature curve that took in all eleven steps and could reveal at one glance whether or not there was a threat to safety. The design concept has been extended and widely adopted.
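To make the idea concrete, here is a minimal sketch, in Python, of the reasoning such a display automates: given the measured coolant pressure and temperature, is the coolant safely below boiling, or not? The saturation table and the subcooling_margin function are my own simplified illustration, not Beltracchi's actual design; a real display would draw on full steam tables.

```python
# A minimal sketch (not Beltracchi's actual design) of the idea behind a
# saturation-curve safety display: instead of making operators work through
# steam tables under stress, do the table work once in the design, then show
# the measured coolant state as a single point or number.

# Hypothetical, simplified saturation data for water:
# (pressure in MPa, saturation temperature in deg C).
SATURATION_TABLE = [
    (0.1, 99.6),
    (1.0, 179.9),
    (4.0, 250.4),
    (8.0, 295.0),
    (12.0, 324.7),
    (15.5, 344.8),  # roughly the operating pressure of a pressurized-water reactor
]

def saturation_temperature(pressure_mpa: float) -> float:
    """Linearly interpolate the boiling temperature at a given pressure."""
    pts = SATURATION_TABLE
    if pressure_mpa <= pts[0][0]:
        return pts[0][1]
    for (p1, t1), (p2, t2) in zip(pts, pts[1:]):
        if pressure_mpa <= p2:
            frac = (pressure_mpa - p1) / (p2 - p1)
            return t1 + frac * (t2 - t1)
    return pts[-1][1]

def subcooling_margin(pressure_mpa: float, coolant_temp_c: float) -> float:
    """Degrees of subcooling: positive means the coolant is safely below boiling."""
    return saturation_temperature(pressure_mpa) - coolant_temp_c

if __name__ == "__main__":
    # One number replaces the eleven-step steam-table procedure.
    margin = subcooling_margin(pressure_mpa=15.5, coolant_temp_c=310.0)
    status = "OK" if margin > 0 else "SATURATED -- THREAT TO SAFETY"
    print(f"Subcooling margin: {margin:.1f} C ({status})")
```

The point is the one Vicente makes: the designer does the steam-table work once, so the operator sees a single point on a curve instead of working through eleven steps in an emergency.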

Vicente broadens the definition of the technology we humans face. It takes in many non-physical aspects, such as work schedules, team coordination, supervision, staffing, information and regulations. This is where the human factor must be engaged to complement the hardware and equipment technology and make it safe and effective.

TWA Flight 514 was flying into Dulles Airport at night in bad weather when it hit high ground; everyone aboard was killed. Six weeks earlier, a United Airlines flight had barely escaped hitting the same high ground. Both crews had misunderstood the same type of instruction issued by air traffic controllers. Although the United Airlines crew had reported the incident to their airline, and a reporting system was in place, no warning had reached the unfortunate TWA crew.

The FAA at that time was responsible for receiving information about near misses and for reprimanding anyone who could be blamed -- a strong deterrent to filing a report.

After the TWA accident in 1974, the FAA asked NASA to address the reporting situation. Charles Billings, then at NASA, played a major role in setting up the independent Aviation Safety Reporting System (ASRS) under NASA -- not under the FAA. It was staffed with aviation specialists who analyzed the reports and built a database available to anyone with an interest in aviation safety. Reporting was voluntary and confidential; analysts could contact the contributor with follow-up questions, and contributors were not exposed to any punitive or legal sanctions.
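The mechanics that make such a system trustworthy are simple to state. Below is a hypothetical sketch, in Python, of how a voluntary report might be handled; the record fields and the deidentify function are my own illustration, not ASRS's actual schema. The key property is that identifying details exist only long enough for analyst follow-up and never reach the open database.

```python
# A hypothetical sketch (not the actual ASRS schema) of confidential,
# voluntary reporting: contact details are kept only long enough for
# analyst follow-up, then stripped before the report is published.
from dataclasses import dataclass, replace
from typing import Optional

@dataclass(frozen=True)
class SafetyReport:
    event_date: str                          # when the incident occurred
    narrative: str                           # reporter's own account
    contributing_factors: str
    reporter_contact: Optional[str] = None   # held only for follow-up questions

def deidentify(report: SafetyReport) -> SafetyReport:
    """Strip identifying details so the published record cannot be traced back."""
    return replace(report, reporter_contact=None)

# Hypothetical example of the workflow.
report = SafetyReport(
    event_date="1974-10-20",
    narrative="Misread a descent clearance on approach in poor weather.",
    contributing_factors="Ambiguous clearance phrasing",
    reporter_contact="pilot@example.com",    # hypothetical
)
public_record = deidentify(report)           # only this version enters the open database
```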

Billings reports that people write pages of descriptions, send tapes, volunteer to visit the ASRS offices and often establish close friendships with the specialists there. The overwhelming majority of reports relate to the bad fit between people and technology in its broader definition. This system stands in stark contrast to the shame-and-blame culture in hospitals, which invites concealment.

When you read this book, you will sense Vicente’s competence, warmth and concern for humanity. PH


