
Talking to Newspapers: A Cautionary Tale with Moral

Peter B. Ladkin

Report RVS-J-99-01


The Story

On 27th June, 1999, the Sunday Times printed a report entitled "Faulty computers blamed in `pilot error' jet crashes" in which they tried to establish (the first paragraph):
Some of the worst air disasters in the past decade initially attributed to pilot error or terrorist attack are now being blamed on faulty computers and flawed software.

Are they indeed? Certainly not by any professional I know. But I had been interviewed for this article and it contained (correct) quotes from me. However, it also contained a paragraph describing a "study" I had done which apparently supported the thesis proposed by the article.

I have done no such study; and the claims said by the Sunday Times to be results of that study are in fact false. The Sunday Times published a "correction" on 11th July, which is itself incorrect.

The purpose of this note is to put the record straight and to tell the story. People have also asked why I talk to the press anyway, so I have also added my views about public risk management and the role of the press in it.

I am very put out by the misrepresentation of my work contained in the Sunday Times article. It has cost me about 50 hours of my time so far, and a certain amount of money for legal advice, because it was picked up by German, Austrian and Australian newspapers, as well as probably others I don't know about, and at least one electronic professional discussion group.

What Did The Sunday Times Say?

The offending paragraph:
Ladkin has carried out a study of air disasters in the past 10 years using the final investigation reports and new analyses by computer experts. It shows that some accidents first blamed on human error or even terrorists were caused or partly caused by computers. Software flaws and computer errors contributed to at least 30 accidents and potentially dangerous incidents in the past decade, according to Ladkin's study.

What Is True?

The definition of accident is fatal or serious injury to persons (crew, passengers, or people on the ground), or substantial damage to the aircraft (measured usually by the cost to repair or replace), or both.

I have been informed by reliable sources that they know of specific incidents in which computer malfunction was the cause or was amongst the causal factors. Without doubting the reliability of these sources, I treat such information as professional hearsay until such time as I know the details.

The "Correction"

On 11th July, the Sunday Times published a "correction":

Professor Peter Ladkin, of Bielefeld University, Germany, has asked us to clarify our report (News, June 27) in which we said his study showed "software flaws and computer errors contributed to at least 30 accidents and potentially dangerous incidents in the last decade." Although these were partly attributable to computer design, his studies do not show they were caused by a malfunction.

Note that they misquote themselves. Note also that they spend 10 words saying who I am, 36 words repeating themselves, and only 19 attempting to correct their mistake. Readers are invited to compare this "correction" with what I actually said (above). The Sunday Times also did not share with me the exact wording of the correction before publication, because it is not their policy to do so.

The Consequences: Germany

The London foreign desk (Auslandsdienst) of the Axel Springer newspaper company read the Sunday Times article and wrote a brief in German based upon it. This was picked up by the Hamburger Abendblatt, rewritten, and run on Monday 28th June, as it was also by the Bild newspaper, Germany's largest. Both are Springer companies. The Abendblatt article was faxed to me by a reader who disagreed with what was said therein and wished to make his view known to me. I guess he was moderately surprised, but also relieved, that I agreed with him. Thank you, Herr Klaus Mathies.

That started a round of negotiation with the Abendblatt and Bild, which ended with both publishing a correction that we all could live with. This process was undoubtedly helped by the German legal Right of Reply, or Gegendarstellung. One has a right to reply, with the same prominence (font, size, positioning) as the original, in which one states what was falsely said, and then what is correct. It is very precise and circumscribed; one may not complain about what is implied, about the likely interpretation or suchlike, but only about what was literally said. And one must literally correct exactly these sentences. Since each paper had said three incorrect things, I knew I had six sentences to bargain with when the negotiations got tough ("Wrong: ..., Right: ...." times three). That helped!

The Abendblatt solved the problem through a reader's letter, the Bild through a correction. Here are my rough translations of the texts.

Hamburger Abendblatt

The original article in the Hamburger Abendblatt, 28.06.1999:
Accidents due to Computer Errors Mostly after aircraft accidents one hears: Pilot Error! Now a German study has come to a surprising conclusion: many of these accidents were put down to computer errors (Computerfehler). That was concluded by Peter Ladkin, Professor for Computer Networks at the University of Bielefeld. Horrifying: at least 30 accidents in the last decade were caused by software errors. Amongst those was the accident in Nagoya (Japan), in which 264 people lost their lives. The autopilot had tried to hinder [or `prevent': verhindern PBL] the landing of the Airbus A300 of China Airlines. The struggle between pilot and computer led in 1994 to a crash. [The following makes little sense, but is true to the original. PBL] One of the two pilots had mistakenly triggered the button of the autopilot, which was thereby switched on and attempted to gain altitude instead of landing. As the pilot attempted to descend with the control yoke, the computer set the rear control surfaces ["Heckflosse"], so that the jet climbed upwards at an extremely sharp angle and finally crashed. Ladkin: "Computers can control aircraft more accurately than people, but when things go wrong, it ends in a catastrophe".
Footnote 2

Reader's Letter, Peter Ladkin to Hamburger Abendblatt, 08.07.1999:

Following an article from the Sunday Times, you wrote: "Mostly.... one hears: Pilot Error!... A German study (shows): many of these accidents were put down to computer error." You attribute this to me. I haven't ever made any study with such results. Further, you say: "At least 30 accidents in the last decade were caused by computer errors [sic]". I don't know of any commercial airline accident in the last ten years that was caused by "computer error". I do know of some, amongst whose causal factors was the unfortunate interaction of pilot and automation. Furthermore, some others can be put down to the design of the automation. Others remain as yet unclarified. In every accident, there are many causal factors that play a role. Our study of the official report of the Nagoya accident shows 30 fundamental causes. From these 30 causes, only about 3 have to do with the design of the highly-automated cockpit.
Prof. Peter Ladkin, University of Bielefeld, Faculty of Technology.
Footnote 3

Bild

Bild, 28.06.1999:
Aircraft Accidents Investigated: 30 Times the Computer Was At Fault. When aircraft crash, oftentimes it has to do with pilot error. But not always. At least 30 airplane crashes in the last 10 years were found to be caused by flaws in the complicated computer software. That was found by a study of Professor Peter Ladkin at the University of Bielefeld. One of the cases he studied: the crash of the Airbus A300 in Nagoya (Japan), in which 264 people died. The pilot was attempting to land and "struggled" against the autopilot, which wanted to keep the airplane in the air. In the end the aircraft crashed.
Footnote 4
Correction, Bild, 09.07.1999:
Concerning "Aircraft Accidents Investigated: 30 Times the Computer Was At Fault" (BILD, 28.06.1999) Professor Peter Ladkin has clarified that there is no study of his which comes to this conclusion. Rather: in the years 1993-1998 there were cases in which one of the causes was either the design or the interplay between pilot and automation.
Footnote 5

German and Sunday Times Reactions Compared

The Abendblatt gave me about as much space to correct as they did to the original article, which was very gracious of them. The same reader, Herr Mathies, also faxed me the letter, which is evidence that the correction reached the desired audience.

Bild published a correction similar to the one I had suggested to the Sunday Times; but they used fewer words than the Times, and they also got it right.

Both of these newspapers agreed with me about the entire wording before publication. The Sunday Times told me it was their policy not to do this.

Bild is a `tabloid'; nevertheless, their entire interaction with me, including their concern about having (inadvertently) misrepresented me, was straightforward and responsible, as was that of the Abendblatt. I leave the comparison with the Sunday Times to the reader.

Other Newspapers

Articles based on the Sunday Times article also appeared in the Austrian newspaper Kurier in Vienna, and in The Australian in Sydney. I had the opportunity to state the facts on Austrian television, ORF. I haven't dealt with the Australian paper yet.

Classification of Factors

A motto of accident investigation holds that every accident is different. But similarities have to be found if we are to have any hope of learning lessons to apply to the future. I discuss the classification of causal factors of accidents in moderate detail in Classification of Accident Factors (RVS Group Research Report RVS-Occ-99-02) and repeat only the conclusions here. I consider computer-related factors pragmatically to be classified into a handful of categories, among them requirements errors, design errors, and interaction problems such as mode confusion; the full list is given in the report just cited.

Some readers might prefer to lump requirements errors together with design errors under, for example, design/requirements error, or development error, or, as some colleagues prefer to call it, system engineering error.

There are undoubtedly finer classifications to be sought, and one can argue (as many colleagues have with each other, regularly) for the worth or lack of worth of finer or coarser distinctions. So be it.

From this list, the one kind of error that is relatively prevalent (as these things go) with digital automation is mode confusion. Some other types of error are well known from the pre-digital age (slips, lapses, mistakes) but have different manifestations with digital equipment. It could be argued that mode confusion arises from the design (namely, the provision of modes), which itself arises from the complexity of the functions implemented, and not from anything expressly digital. That may well be so. However, mode confusion comes with digital automation, and was not prevalent in aviation contexts before. That makes it computer-related.
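To make the notion concrete, here is a deliberately toy sketch in Python. The mode names and numbers are invented for illustration only; this is not the logic of the A300 autopilot or of any real system. The point is simply that the same pilot input can have opposite effects depending on which automation mode is engaged, so a crew that loses track of the active mode can be surprised by the aircraft's behaviour.

# A toy illustration of mode confusion -- invented mode names, not the logic
# of the A300 autopilot or of any real system.

from enum import Enum

class Mode(Enum):
    FLIGHT_PATH = 1   # automation follows the pilot's pitch input
    GO_AROUND = 2     # automation climbs towards a target, regardless of input

def pitch_command(mode: Mode, pilot_pitch_input: float) -> float:
    """Resulting pitch command (degrees, nose-up positive) for a pilot input."""
    if mode is Mode.FLIGHT_PATH:
        return pilot_pitch_input          # input passes straight through
    return 5.0                            # automation insists on climbing

# A pilot who believes FLIGHT_PATH is engaged pushes the nose down:
print(pitch_command(Mode.FLIGHT_PATH, -3.0))   # -3.0: the aircraft descends
# If GO_AROUND was engaged without the pilot noticing, the same input is
# opposed by the automation -- the classic set-up for mode confusion:
print(pitch_command(Mode.GO_AROUND, -3.0))     # 5.0: the aircraft climbs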

Finally, one should distinguish between error and infelicity. An error has no redeeming features and is always unwanted. An infelicity, by contrast, is a feature which may have a positive influence in certain circumstances, but which had a negative outcome in the case under consideration.

As an example of an infelicity in a well-known case, consider the weight-on-wheels (WoW) logic for allowing commanded thrust reverse to be deployed. The main gear must be out and compressed (supposedly by the weight of the aircraft on the ground) in order for the thrust reversers to be activated as commanded. Such logic is commonly implemented in aircraft nowadays, partly as a result of what is believed to have been thrust-reverser activation in flight in the Lauda Air B767 accident in Thailand in 1991 (the reversers in that case had a mechanical inhibitor, which is presumed to have failed when an electrical fault apparently commanded thrust reverse in flight). However, such logic contributed to the delayed braking in the 1993 Warsaw A320 accident. So it is an infelicity - good for Lauda, bad for Warsaw - rather than a design error.
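The interlock can be sketched as a simple predicate; the following is a purely schematic illustration, not the certified logic of any actual aircraft type. Note that, from its inputs alone, the logic cannot distinguish the in-flight case it protects against from the on-runway case in which it delays braking, which is exactly why it is an infelicity rather than an error.

# Schematic weight-on-wheels (WoW) interlock -- an illustrative sketch only,
# not the certified logic of any actual aircraft type.

def reverse_thrust_permitted(reverse_commanded: bool,
                             main_gear_compressed: bool) -> bool:
    """Allow commanded reverse thrust only when the main gear is compressed,
    i.e. when the logic believes the aircraft is firmly on the ground."""
    return reverse_commanded and main_gear_compressed

# In flight, a spurious reverse command is inhibited -- the hazard the
# interlock is intended to guard against:
assert not reverse_thrust_permitted(reverse_commanded=True,
                                    main_gear_compressed=False)

# On a runway where the gear does not (yet) register as compressed, a
# legitimate command is inhibited by exactly the same condition, delaying
# braking.  The inputs are identical in both cases: the logic cannot tell
# the two situations apart, which is what makes it an infelicity.
assert not reverse_thrust_permitted(reverse_commanded=True,
                                    main_gear_compressed=False)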

It should be clear from this discussion that most accidents with computer-related features fall into the interaction infelicity category. Some arguably fall into the design error or requirements error categories (for example, the Ariane 501 accident).

The Moral Dimension Of the Incident

Colleagues of mine with experience have suggested that the main problem was talking to journalists in the first place. British journalists were singled out as particularly prone to not treating their sources with respect and not presenting derived information accurately. So why don't I follow the advice of my colleagues and keep my mouth shut?

First, because of my experience. I've been interviewed for articles in the `serious' press in four different countries and appeared on television in two, and the public had not been treated to false information with my name on it until this episode.

Second, because I believe I have an obligation to share information. My attitude to risk assessment leads me to conclude that people with expertise in issues of commercial air safety have a duty to distribute their knowledge amongst the general public. Talking to the press and broadcast media is the only effective way to do this of which I know.

Nevertheless, if there should be a significant likelihood that false information will be distributed as a result, at a cost to oneself of potentially a few hundred pounds per episode, then considerations of financial and professional survival may well take precedence. But such a situation should not be tolerated by anyone.

The public good is not served by distributing false information, or by withholding information one has, on matters of public concern. It is immoral to do the first, and newspapers and journalists who do it should stop doing it right away. When one has information whose distribution would serve the public good, and has an opportunity to distribute it, it is also arguably immoral not to do so.

The Sunday Times distributed false information, based on an interview with me, during which they were given only correct information. Such behavior influences those of us with some expertise to decline requests to share what we know. This is disadvantageous to the public good.

Risks, Choice and the Public Good

It is widely held, at least in US risk management circles nowadays, that those affected by risks should have an effective (though not necessarily decisive) voice in their management (see, for example, the National Research Council report Understanding Risk: Informing Decisions in a Democratic Society, Washington, DC: National Academy Press, 1996; and K. S. Shrader-Frechette, Risk and Rationality: Philosophical Foundations for Populist Reforms, Berkeley, Los Angeles, Oxford: University of California Press, 1991). For this voice to be effective in risk management, the participants must have access to accurate risk identification and risk evaluation information. (Risk identification, evaluation and management are the three stages in risk assessment, according to e.g., Shrader-Frechette, op. cit.).

Commercial air transport is a public transportation system used by an increasing fraction of the populace of developed countries, and its accidents also affect the communities within which they take place geographically. A significant proportion of citizens are thus involved in commercial air transportation risks. According to the view of risk management cited above, these citizens should have an effective voice in the management of these risks.

One can have an effective voice in management only in so far as one understands what the components of the risk are. Ignorance and misunderstanding are not conducive to effective technology management.

Citizens are mostly informed through newspapers and broadcast media, not by, for example, taking university courses from people like me.

Therefore I take the following position concerning management of risks in more-or-less-public goods. Those of us who believe we have information important for effective decision making should share it in the most effective way possible with those affected by the risks; in this case, the general public. So we had better talk to the press. Furthermore, the press has an obligation to inform the citizens. This also means they have an obligation to get it right.

Conclusion

I'm disgusted with what the Sunday Times did. I think everybody else should be too. The Sunday Times should be ashamed of itself. Furthermore, it should share the correct information with its readers.

Peter Ladkin


Footnotes

Footnote 1: That is, unless one wishes to make the very tentative claim that that happened with the Nagoya accident. One could attempt to justify such a claim in one of two ways: In short, the only ways I know in which one could attempt to justify such a view concerning the Nagoya accident rest on equivocation.

Footnote 2:

Abstürze wegen Computerfehler
SAD London - Meist heißt es nach Flugzeugabstürzen: Pilotenfehler! Jetzt kam eine deutsche Studie zu einem überraschenden Ergebnis: Viele Flugzeugabstürze wurden durch Computerfehler ausgelöst. Das fand Peter Ladkin, Professor für Computer-Netzwerke an der Universität Bielefeld, heraus.

Erschreckend: Mindestens 30 Unglücke in den vergangenen zehn Jahren wurden durch Software-Fehler verursacht. Darunter auch der Absturz in Nagoya (Japan), bei dem 264 Menschen ums Leben kamen. Damals hatte der Autopilot versucht, die Airbus-Landung A300 von China Airlines zu verhindern. Der Kampf zwischen Pilot und Computer führte 1994 zu einer Bruchlandung. Einer der beiden Piloten war aus Versehen an den Knopf des Autopiloten gekommen, der dadurch eingeschaltet wurde und versuchte, an Höhe zu gewinnen, statt zu landen. Als der Pilot versuchte, den Jet mit dem Höhensteuer herunterzubekommen, vestellte [sic] der Computer die Heckflosse, so daß der Jet in einem extrem steilen Winkel nach oben flog und schließlich abstürzte. Ladkin: "Computer können Flugzeuge zwar akkurater steuern als Menschen, aber wenn etwas schiefläuft, kommt es zur Katastrophe."

Footnote 3:

In Bezugnahme auf einen Bericht aus der "Sunday Times" schreiben Sie: "Meist heißt es ... Pilotenfehler! ... eine deutsche Studie (zeigt): Viele Flugzeugabstürze wurden durch Computerfehler ausgelöst." Dabei verweisen Sie auf mich. Eine Studie mit solchem Ergebnis habe ich aber nie gemacht. Weiter heißt es: "Mindestens 30 Unglücke in den vergangenen zehn Jahren wurden durch Computerfehler verursacht." Ich kenne kein Unglück von kommerziellen Fluglinien in den letzten zehn Jahren, das durch "Computerfehler" verursacht worden ist. Ich kenne allerdings einige, deren Ursachen teilweise im unglücklichen Umgang des Piloten mit der Automation liegen. Weiter sind einige im Design der Automation begründet. Einige bleiben unklar. Bei jedem Unfall spielen viele Ursachen eine Rolle. Unsere Studie des Unfalls von Nagoya zeigt nach dem offiziellen Bericht 30 Ursachen. Von diesen 30 Ursachen haben nur ungefähr drei mit dem Design des hochautomatisierten Cockpits zu tun.

Footnote 4:

Flugzeugunglücke untersucht: 30mal war der Computer schuld
Wenn Flugzeuge abstürzen, ist oft menschliches Versagen im Spiel. Aber nicht immer. Mindestens 30 Flugzeugunglücke der vergangenen 10 Jahre wurden durch Fehler der komplizierten Computersoftware verursacht. Das ergab eine Studie von Prof. Peter Ladkin von der Universität Bielefeld. Einer der Fälle, die er untersuchte: der Crash eines Airbus A 300 in Nagoya (Japan), bei dem 264 Menschen starben. Damals versuchte der Pilot zu landen und "kämpfte" gegen den Autopiloten, der das Flugzeug in der Luft halten wollte. Am Ende zerschellte das Flugzeug.

Footnote 5:

Korrektur
Zu: "Flugzeugunglücke untersucht: 30mal war der Computer schuld"

Professor Peter B. Ladkin weist darauf hin, daß es keine Studie von ihm gibt, die zu diesem Schluß kommt. Sondern: In den Jahren 1993-1998 gab es Fälle, in denen das Design sowie der Umgang der Piloten mit der Automation eine der Ursachen für die Unglücke war.