When is failure a good thing?

The New York Times Book Review

After more than a half-century of unraveling cognitive biases, the psychologist and Nobel laureate Daniel Kahneman feels deeply pessimistic about our ability to change our behavior. In a 2014 Guardian interview, Kahneman told the Harvard psychologist Steven Pinker that “the idea of human nature with inherent flaws was consistent with a tragic view of the human condition, and it’s a part of being human that we have to live with that tragedy.” Though acknowledging Kahneman’s influence on his own work, Pinker countered with a diametrically opposed view of the same material. “We have the means to overcome some of our limitations, through education, through institutions, through enlightenment.” His optimism that rationality will overcome our baser instincts is the underlying argument of his 2011 book “The Better Angels of Our Nature.”

Implicit in this exchange is the central dilemma of those popular psychology books that offer any degree of self-help. If science has dismantled the Enlightenment notion of “the rational man,” is Pinker’s optimism justified, or should Kahneman’s nihilism be our default view of the human condition? This problem becomes particularly acute when a book both outlines our deeply rooted behavioral inclinations and simultaneously suggests that they might be overcome. The better your argument for our inherent limitations, the weaker become your bootstrap suggestions for self-improvement.

This inescapable paradox is front and center in the title of the journalist Matthew Syed’s “Black Box Thinking: Why Most People Never Learn From Their Mistakes — but Some Do.” His core premise is that detailed independent investigation of our everyday screw-ups can prevent recurrences, just as black box flight data analysis has dramatically reduced the incidence of airplane crashes. If more data creates a better outcome, good evidence-based analysis in all areas of human behavior should allow us to lift those blinders that prevent us from learning from our mistakes.

With a Gladwell-like journalistic verve and often moving case histories, Syed begins with the most obvious low-hanging fruit — the failure-filled health care and criminal justice systems — to get readers up to speed. Much of the material will be well known to those with an interest in aspects of the cognitive sciences such as confirmation and selection bias, self-justification, cognitive dissonance and narrative fallacy.

Unfortunately, the book soon morphs into a cautionary tale against our ability to overcome innate biases. On several occasions Syed falls into the very trap that he claims some of us can avoid. For example, one of his pet peeves is the medical profession’s inability to police itself. He points out that the autopsy rate in America has dropped substantially in recent years, with the procedure now performed in less than 10 percent of deaths. Downsides include less-than-accurate cause-of-death statistics (which affect government research funding), as well as missed medical errors that would be discovered only post-mortem. His facts are correct, but what should we conclude?

As you might expect, the decline in autopsies has a number of explanations, from greatly improved imaging techniques to health care cost containment and the fact that autopsies are not routinely covered by health insurance policies. Syed’s single-minded interpretation: “It is not difficult to identify why doctors are reluctant to access the data: It hinges on the prevailing attitude toward failure. After all, why conduct an investigation if it might demonstrate you made a mistake?”

I am deeply sympathetic to his observation that the medical profession has historically relied on a combination of collegial protectiveness and an unjustified attitude of superiority to prevent outside scrutiny and regulation. However, times are changing — the public is more savvy, and many more doctors are challenging the old hierarchy. The Harvard physicians and New Yorker staff writers Atul Gawande and Jerome Groopman have offered nuanced dissections of the varied reasons for the persistence of medical errors. Gawande’s recommendation for routine checklists is now common practice. Rather than balancing his black box approach with a close look at the personal insights of such experts, Syed offers the following polemic: “It is noteworthy that the inability of senior doctors to embrace their flaws and weaknesses, indeed to admit that such things are even possible, is sometimes called a God complex.”

“Black Box Thinking” has undeniable merits. In a congenial, easy-to-grasp style, it introduces some of the major research in the field of cognitive flaws and points out that the empirical method is superior to gut feelings and unconfirmed experience. Missing are any new insights or aha moments. Worse, Syed’s often monochromatic approach to complex issues serves in part to refute his main message. If he cannot recognize and rise above his own cognitive biases, to whom is the subtitle applicable?

In the Kahneman-versus-Pinker face-off, “Black Box Thinking” is Exhibit A for Kahneman’s pessimistic point of view.

By contrast, “Failure: Why Science Is So Successful” is a breath of contemplative fresh air. Stuart Firestein, a professor in the department of biological sciences at Columbia University, is best known for his work on ignorance, including inviting scientists to speak to his students about what they don’t know. In a tone reminiscent of Lewis Thomas’s “The Lives of a Cell,” the book is a collection of loosely interwoven meditations on failure and scientific method. Firestein picks up an idea, gnaws on it, then examines it from as many different angles as he can, as in a series of easygoing chats.

He begins with a quotation from Gertrude Stein: “A real failure does not need an excuse. It is an end in itself.” Firestein elaborates: “Good failures . . . are those that leave a wake of interesting stuff behind: ideas, questions, paradoxes, enigmas, contradictions.” As a bench scientist, Firestein doesn’t subscribe to the commonly held story of science as incremental successes punctuated with a few brilliant leaps forward. For him great discoveries arise out of failed attempts that continue to provoke and puzzle. Success requires embracing failure.

His most compelling argument is derived from evolution. Out of a vast number of mutations (which can be seen as DNA failures), most have led to extinction, while only a very small percentage have unpredictably moved us up the evolutionary ladder. Contrary to any notion of a grand design, we are the work in progress of myriad genetic mishaps — with more to come. No one should expect great science to arise without a similar vast number of intervening failed steps.

The spirit of his book is reflected in his wise interpretation of Beckett’s famous “Ever tried. Ever failed. No matter. Try again. Fail again. Fail better.” For Firestein, failing better means eschewing success when you already know how to achieve it, instead seeking out those areas where mysteries still reside. He encourages trying to fail because it is the only strategy to avoid repeating the obvious.

If we succeed by failing, then we should be freed from the monolithic road to academic tenure; science should be taught as an adventure in failure. With a delightful combination of feigned naïveté and a keen eye for the messy ways that great discoveries occur, he goes so far as to suggest writing a grant proposal in which you promise to fail better. He knows this isn’t how the world works, but nevertheless argues that change will take place “when we cease, or at least reduce, our devotion to facts and collections of them, when we decide that science education is not a memorization marathon, when we — scientists and nonscientists — recognize that science is not a body of infallible work, of immutable laws of facts. . . . And that most of what there is to know is still unknown.”

Firestein speaks a larger truth grasped, if seldom publicly acknowledged, by most career scientists. If there is any justification for man’s ability to overcome his own limits of reason, “Failure” stands as a shining example.
