Imagine, if you can, a book a copy editor would find more inviting than Joseph T. Hallinan’s Why We Make Mistakes (Broadway Books, 283 pages, $24.95). Mr. Hallinan, who formerly wrote for The Wall Street Journal, offers a catalogue of our predispositions to fallibility.
Because we — that’s us as a species — are impatient and have a misplaced confidence in our ability to figure things out, we don’t read the instructions. Subaru, for example, determined in a study of customer complaints that one in five questions to the company’s call center involved a point explained in the owner’s manual.
We are too easily distracted and dangerously prone to the illusion of multitasking. Talking on a cell phone while driving, or, worse, texting, divides our attention. So do all the gadgets with which automobiles come equipped. An Eastern Air Lines jet crashed in the Everglades because the flight crew became so absorbed in determining why the landing gear indicator light hadn’t come on that they forgot to fly the airplane. The industry has a term, “controlled flight into terrain,” or CFIT, for just such occurrences.
We are overly reliant on our perceptions, which are partial, and our memories, which reconstruct rather than record. This is why eyewitness testimony in trials so regularly contributes to the conviction of innocent people.
And, copy editors should take particular note here, we defer too much to experts, who are in turn overconfident in their expertise. The medical industry, in an effort to reduce anesthesiologists’ errors in the operating room, has taken a number of measures, among them “attitude adjustment.” Pay close attention: “They began discouraging the idea of doctor as know-it-all, and encouraged nurses and others to speak up if they saw someone—especially the anesthesiologist—do something wrong. In error-speak, this is known as ‘flattening the authority gradient,’ and it has been shown as an effective way to reduce errors.”
Some of the scandals in the newspaper industry — the plagiarisms and fabrications, the articles based on shaky information — can be attributed to too steep an authority gradient. The copy editors and lower-level editors either were not encouraged to question the work of stars or the decisions of the high command, or were ignored when they did.
One way to avoid error is to develop systematic defenses, like the checklists used in the airplane cockpit and the operating theater. The sorts of questions that copy editors ask are like those checklists: Is this right? How do we know this? Who says so? Do we have independent confirmation? Does this make sense?