Resisting the Suppression of Science
Lisa Rosenbaum, M.D.
March 1, 2017 DOI: 10.1056/NEJMp1702362
All doctors encounter patients who express preferences for non–evidence-based therapies — organic food for coronary disease or detox cleanses for cancer, for example. Personally, I’ve never come up with an effective response. I offer facts, and then, sensing that I’m getting nowhere, I offer more facts. I blink rapidly to avoid rolling my eyes. Eventually, I resort to the “I statements” taught in medical school: “I understand that’s what you believe,” though my body language surely gives me away. Not surprisingly, I haven’t had much success in overcoming disbelief in science. And though many physicians may approach this challenge more skillfully one on one, as a scientific community, we often seem trapped in a similar dynamic. Whether it’s the science of vaccines, climate change, or gun control, we tend to endlessly emphasize the related evidence and, when that fails, exude a collective sense of disgust.
Now, a U.S. administration that has demonstrated dogged disregard for truth has raised concern not only that the clash between science and belief will intensify, but also that science might be frankly suppressed. President Donald Trump has called climate change a “hoax,” voiced skepticism about vaccines, and appointed as head of the Environmental Protection Agency a man who has fought against its mission. Members of one federal agency were allegedly asked to reveal their views on climate science, and other federal scientists reportedly face gag orders forbidding them to attend scientific conferences or communicate their findings. The Centers for Disease Control and Prevention (CDC) recently postponed a planned Climate and Health Summit, though that move may have been precautionary, meant to assuage an administration on which the agency relies for its funding.
In the face of suppression of science, should scientists resist, or quietly proceed with their work? Resistance seems essential. That the CDC postponement prompted a coalition to form and organize an alternative meeting (see article by Hunter et al.) reminds us that resistance is as much about ensuring effective dissemination of findings as about continuing to conduct science. But it’s critical to recognize that suppressing science does not cause disbelief; rather, disbelief, particularly of science pertaining to highly politicized topics such as climate change, creates a cultural environment in which suppression of science is tolerated. So the real question is: How do we resist effectively? How do we convince a skeptical public to believe in science?
First, we need to stop assuming that disbelief necessarily reflects a knowledge deficit and can thus be remedied by facts. When doubt is wrapped up in one’s cultural identity or powerful emotions, facts not only often fail to persuade, but may further entrench skepticism.1 This phenomenon, often referred to as “biased assimilation,” has been demonstrated across a range of issues, from the death penalty to climate change to vaccines.2 One study found that parents hesitant about vaccinating their children became even less inclined to vaccinate when given information debunking the myth that vaccines cause autism.3 Somewhat counterintuitively, this tendency does not reflect lack of intelligence; in fact, when it comes to climate science, people who demonstrate higher levels of science comprehension are also the most adept at dismissing evidence that challenges their beliefs.1 Moreover, the propensity to dismiss evidence that threatens our identity or beliefs is nonpartisan: liberals, for instance, are far more likely than conservatives to dismiss science suggesting that genetically modified foods are safe. Even within the medical community, whether we’re debating mammography screening, statins, or the credibility of a drug-company–sponsored study, our ideologies affect our assimilation of data.
Second, in this highly polarized moment, we have to be careful not to inadvertently politicize science that has not already been pegged to a particular worldview. Dan Kahan, an expert on the way emotion and identity affect our interpretation of scientific facts, recently coauthored a study assessing how “culturally antagonistic memes” affected people’s ability to process information about an ostensibly neutral scientific issue: Zika virus.4 Because stories had circulated suggesting that Zika was caused either by global warming or by immigration, both highly charged topics, the researchers assessed how exposure to such stories affected subjects’ perceptions of the Zika threat. Those whose worldviews are associated with suspicion of climate science became more skeptical of the Zika threat when it was purported to be caused by global warming, and those whose worldviews tend to favor globalism and open borders perceived lower risk from Zika virus when its emergence was tied to immigration.
This risk of adding an identity-laden valence to otherwise neutral scientific matters makes resisting science denialism in the Trump era particularly tricky. Because we pay far more attention to contested than to generally accepted science, it’s easy to forget that most scientific facts, and related policies, don’t induce tribalism.1 You don’t see partisan battles over treatment for myocardial infarction, say, or the dangers of radiation exposure. But as Kahan points out, Trump thrives on making nonpartisan issues polarizing. The indication that he might appoint a vaccine skeptic to head a commission to review vaccine safety is a worrisome example, since vaccine skepticism has thus far been limited to a vocal minority. “I have never seen someone so aggressively intent on just increasing the number of issues that feature that sort of antagonism,” Kahan told me. “He is our science communication environment polluter in chief.”
Such polluters cunningly incite cultural battles that ultimately heighten distrust of science. Their strategies exploit a fundamental aspect of human nature: forced to choose between “recognizing what is known to science” and maintaining our group identity, most of us choose the latter.1
This constant quest for identity preservation helps explain why calling vaccine skeptics idiotic or dangerous is, as others have pointed out, likely to backfire, particularly as we face a cultural backlash against academic “elites.” It’s also why, when Trump issues an antiscience provocation over a nonpartisan subject, we should avoid being so strident in correcting misinformation that we further galvanize skepticism based on political identity alone. Even with already-polarizing topics, measured resistance may be the most effective approach. To that end, circumspect efforts, such as the coalition’s relatively quiet reorganization of the postponed climate-science meeting, may serve best in these divisive times.
But measured resistance may feel unsatisfyingly hard to define. Given the many variables involved in any one threat to science — including the perceived identities at stake and the way the threat is executed — it’s hard to generalize about what the “right” response entails. With climate change, for instance, if our goal is environmentally protective federal legislation, maybe massive public protests like the Women’s March are necessary to generate the political will. Or maybe, as the legal scholar Cass Sunstein has suggested, the best remedy for disbelief anchored in tribal allegiances is the identification of “surprising validators” — people willing to advocate for science who are trusted by any given group because of their shared identity.2 One recent example is a group of prominent conservatives who published a proposed policy for slowing global warming.5 But the reality is that we know far more about the challenges to communicating science than about how we might overcome them.
Yet perhaps there is a silver lining in the unmooring of so many Americans by the widespread embrace of “alternative facts”: scientists are not alone in their determination to make the truth believable again. As a medical community, we have long approached the communication of science unscientifically. We are taught in medical school to make eye contact, nod our heads, and demonstrate cultural competence. But if the purpose of communication is to translate science into public policy that can improve the health of our population, then we ought to focus as well — and urgently — on empirically and effectively navigating assaults on truth.