Jackson Pollock, Cathedral, 1947 (via Wikiart).
The ever-estimable Atul Gawande has published in The New Yorker this week the commencement address he delivered on 10 Jun 2016 at the California Institute of Technology. It’s worth a close read. Here are some excerpts. (But do read the whole article/speech.)
‘Science is not a major or a career. It is a commitment to a systematic way of thinking, an allegiance to a way of building knowledge and explaining the universe through testing and factual observation.
The thing is, that isn’t a normal way of thinking. It is unnatural and counterintuitive. It has to be learned. . . .
The scientist has an experimental mind, not a litigious one. . . .
‘[E]ven where the knowledge provided by science is overwhelming, people often resist it—sometimes outright deny it. Many people continue to believe, for instance, despite massive evidence to the contrary, that childhood vaccines cause autism (they do not); that people are safer owning a gun (they are not); that genetically modified crops are harmful (on balance, they have been beneficial); that climate change is not happening (it is). . . .
‘[O]nce an idea has got embedded and become widespread, it becomes very difficult to dig it out of people’s brains—especially when they do not trust scientific authorities. And we are experiencing a significant decline in trust in scientific authorities. . . .
Despite increasing education levels, the public’s trust in the scientific community has been decreasing.
‘Science’s defenders have identified five hallmark moves of pseudoscientists.
- They argue that the scientific consensus emerges from a conspiracy to suppress dissenting views.
- They produce fake experts, who have views contrary to established knowledge but do not actually have a credible scientific track record.
- They cherry-pick the data and papers that challenge the dominant view as a means of discrediting an entire field.
- They deploy false analogies and other logical fallacies.
- And they set impossible expectations of research: when scientists produce one level of certainty, the pseudoscientists insist they achieve another.
‘. . . The evidence is that rebutting bad science doesn’t work; in fact, it commonly backfires. Describing facts that contradict an unscientific belief actually spreads familiarity with the belief and strengthens the conviction of believers.
‘. . . Emerging from the findings was also evidence that suggested how you might build trust in science. Rebutting bad science may not be effective, but asserting the true facts of good science is. And including the narrative that explains them is even better. You don’t focus on what’s wrong with the vaccine myths, for instance. Instead, you point out: giving children vaccines has proved far safer than not. . . .
‘The other important thing is to expose the bad science tactics that are being used to mislead people. Bad science has a pattern, and helping people recognize the pattern arms them to come to more scientific beliefs themselves.
Having a scientific understanding of the world is fundamentally about how you judge which information to trust. . . .
Knowledge and the virtues of the scientific orientation live far more in the community than the individual.
‘When we talk of a “scientific community,” we are pointing to something critical: that advanced science is a social enterprise, characterized by an intricate division of cognitive labor. Individual scientists, no less than the quacks, can be famously bull-headed, overly enamored of pet theories, dismissive of new evidence, and heedless of their fallibility. (Hence Max Planck’s observation that science advances one funeral at a time.) But as a community endeavor, it is beautifully self-correcting.
‘Beautifully organized, however, it is not.
Seen up close, the scientific community—with its muddled peer-review process, badly written journal articles, subtly contemptuous letters to the editor, overtly contemptuous subreddit threads, and pompous pronouncements of the academy—looks like a rickety vehicle for getting to truth.
Yet the hive mind swarms ever forward. . . .
Even more than what you think, how you think matters.
Read the whole article in The New Yorker: The mistrust of science, 10 Jun 2016.
Read other articles on the ILRI News Blog about the use of scientific evidence to influence decision-making:
- Persuasion: Towards a calculus of influence in livestock research for development, 17 Feb 2016
- What we talk about when we talk about ‘evidence-based’ advocacy communications, 13 Jan 2016
- ‘We’re having all the wrong debates’–Tamar Haspel, 28 May 2015
- Realpolitik–Nairobi ‘Community of Practice’ communicators get real practice impacting policymakers, 7 Jul 2014