On 2 July 2021, the 70th Lindau Nobel Laureate Meeting ended on a high note with the panel "Why Trust Science?". The panel featured Nobel Laureate Brian P. Schmidt, Chairman of the Nobel Foundation Carl-Henrik Heldin, historian Hans-Jörg Rheinberger, and Dallas/Ft. Worth Living Legend Scholar of Cancer Research Balkees Abderrahman. It was moderated by Adam Smith, Chief Scientific Officer for Nobel Prize Outreach.
To answer the question "Why trust science?", we ought first to ask ourselves another: "How does mistrust of science evolve?". Only by holistically answering the latter, using the principles of neuroanthropology (i.e., the intersection of neuroscience, anthropology, the social sciences, and philosophy), can we not only answer the question "Why trust science?", but also go further: turning science skeptics into upholders, and science deniers into believers.
Motivated Reasoning: The Crux of Mistrust Towards Science
Each person or group has an emotional avatar and a subconscious blueprint. The emotional avatar consists of their core values (i.e., issues they care about, including commercial and political interests) and pain points (i.e., issues that trouble them). The subconscious blueprint, by contrast, consists of hundreds, if not thousands, of mental models (or belief systems), amassed over a lifetime from their environments and experiences. Subconscious mental models are potent because they provide an individual or a group with a subjective map of reality, which represents "a truth" to them, but not necessarily "the truth", about how the world and its people are. The American professor of biochemistry Isaac Asimov captured this notion well: "Your assumptions are your windows on the world. Scrub them off every once in a while, or the light won't come in."
When you approach an individual or a group that believes climate change is not real with rigorous scientific research that says otherwise, you will trigger what neuroscientists call "motivated reasoning". First to respond to your evidence is their subconscious mind (with rejection, to preserve their identity), second their emotions (with mistrust, even hostility), and last their conscious mind. The subconscious mind accounts for about 95 percent of cognition and has a "homeostatic impulse": the impulse to regulate the physical self (body temperature, heartbeat, and breathing) and the mental self (belief systems or prejudices). The conscious mind, by contrast, accounts for only about 5 percent of cognition. American neuroscientist David Eagleman described the coexistence of emotion and reason as a two-party democracy in the brain locked in civil war, with emotion being fast, automatic, and impulsive, and reason being delayed, analytic, and reflective.
This means that conscious reasoning (what we consider our most logical and dispassionate conclusions) is motivated, or skewed, by our subconscious blueprint and emotional avatar (called "affect" in neuroscience). In fact, "motivated reasoning" should really be called "motivated rationalizing": one is not reasoning and rethinking in the face of new evidence, but rationalizing to reinforce preconceived notions.
The byproducts of "motivated reasoning" are confirmation bias and disconfirmation bias. Confirmation bias is "cherry-picking": selecting the bits of information that bolster our belief systems. Disconfirmation bias is our willingness to expend extraordinary amounts of energy refuting the bits of information that don't align with them.
“Motivated reasoning” elucidates why issues that are deeply researched, such as climate change and vaccination, are deeply polarizing.
Stanford psychologist Leon Festinger described the individual with an ingrained emotional avatar and subconscious blueprint as a "Man with a Conviction" who is "a hard man to change": "Tell him you disagree and he turns away. Show him facts or figures and he questions your sources. Appeal to his logic and he fails to see your point." Festinger went a step further and examined what happens to a cult that is emotionally invested in a belief system when that belief is soundly refuted. To his amazement, the refutation only reinforced the belief system. That is how powerful the subconscious mind is at skewing the process of conscious reasoning.
Yale Law professor Dan Kahan constructed a cultural classification that explains why, in each society, two groups end up opposed over scientific evidence. Kahan classified individuals, based on their cultural values, into "individualists" and "communitarians". Individualists are conservative groups (e.g., Republicans) who are pro-commerce and pro-gun rights, and are "system justifiers" who engage heavily in "motivated reasoning" to defend the status quo they are part of and to propagate their commercial and political interests. Communitarians are liberal groups (e.g., Democrats) who are critical of free markets and patriarchal families, and who believe that guns in the hands of individuals have a spillover effect on society. Researchers discovered that a key predictor of whether Americans will accept the science of global warming is whether they are a Republican or a Democrat!
Whilst "motivated reasoning" might be rampant in conservative groups, liberal groups are not immune to it. Conservative groups might deny climate change, but some liberal groups reject vaccination over a supposed link to autism. In fact, everyone is susceptible to "motivated reasoning" and to falling in love with their own ideas or belief systems, including scientists. The 17th-century theorist of the scientific method, Francis Bacon, dubbed these attachments "idols of the mind".
The truth is that Homo (i.e., "human", of either sex) sapiens (i.e., "wise") are not as wise as they might think. They apply fight-or-flight reactions not only to predators, but also to data that doesn't align with their subconscious belief systems. People don't mistrust science per se; they mistrust what science provides when it contradicts their belief systems. Research has shown that people trust science up until it contradicts religion; then they side with religion. Panelist Heldin remarked that "belief is related to religion, [whilst] trust is related to science".
In The Demon-Haunted World: Science as a Candle in the Dark, Carl Sagan wrote: "For me, it is far better to grasp the universe as it really is, than to persist in delusion, however satisfying and reassuring." But if we are all susceptible to "motivated reasoning", and our subconscious biases are commandeering our behaviors, how can we practice what Sagan recommended? How do we "kill [our] darlings", as Nobel Laureate William Faulkner famously advised, in reference to ruthlessly eliminating the elements of your story (or life) that you have worked hard for and love, but that don't amplify (or serve) your overall mission?
How to Meet Your Waterloo: A Toolbox to Avoid Motivated Reasoning and Masterfully Communicate Science
In 1815, at the Battle of Waterloo, the British-led coalition under the command of the Duke of Wellington defeated the seemingly invincible French army under the command of Napoleon Bonaparte. The phrase "meeting one's Waterloo" here stands for confronting a problem that seems too difficult to surmount (like motivated reasoning), and yet overcoming it. Just like Wellington's coalition of units from the United Kingdom, the Netherlands, Hanover, Brunswick, and Nassau, we will deploy a coalition of neuroscience- and psychology-inspired tools to convince even the toughest crowds to trust science.
Tool 1: Don’t Lead …
… with the facts to convince. Instead, lead with the individual's or group's core values, to give the facts a fighting chance. Here is a formula I propose for packaging your scientific message: Content × Communication = Porosity. Porosity means people being receptive to science. Content means material that is simplified, textured with both findings and limitations, and open-sourced. Communication means leading with the target audience's core values, acknowledging your respect for them and their principles, using language that doesn't provoke emotional defensive responses or "motivated reasoning", and embracing emotional variety (i.e., people can feel anxious at the beginning of a conversation, then intrigued, then angry, then interested, then surprised to have learned something new).
One can even enlist religious, political, and business figures from the target group to mediate science communication. The more creative your content and communication are, the more porosity you will yield, and the more trust in science you will generate. Dan Kahan and colleagues evaluated how slightly changing the communication language affects a target group's receptivity to science. They packaged the basic science of climate change into mock newspaper articles with two very different headlines: "Scientific Panel Recommends Anti-Pollution Solution to Global Warming" and "Scientific Panel Recommends Nuclear Solution to Global Warming". The second article was better received among the "individualist" groups, as its headline led with one of their core values: nuclear power.
Tool 2: Walk Away …
… from the mindset that more research is needed and that people simply need more education. Researchers have found that Republicans and Democrats are equally knowledgeable about science. Yet 54 percent of Democrats support scientists in making decisions about scientific policy issues, while 66 percent of Republicans think the opposite.
Tool 3: Walk Into …
… the mindset that even experts have valleys in their understanding that are others' mountains, or nobody's mountains. Share that your work, no matter how rigorous or comprehensive, is simply a piece of the puzzle that serves to refute older theories, or to introduce a new one that is correct until proven otherwise by future research. Panelist Schmidt emphasized that "science is the only philosophy that makes predictions very well", whilst panelist Rheinberger remarked that "scientists should swing back and forth like a pendulum, between producing epistemic things, and acknowledging their fallibility. This is in the footsteps of Nobel Laureate François Jacob who championed keeping a sense of fallibility in all experiments and observations". Moderator Smith added that we must become "comfortable understanding that [we] don't yet understand".
I would propose using the term "scientific understanding" instead of "scientific truth", because our "scientific understanding" of the "scientific truth" is always in motion: being refuted or expanded. Roman Emperor Marcus Aurelius perhaps best described this notion: "Everything we hear is an opinion, not a fact. Everything we see is a perspective, not the truth." This approach has been shown to breed trust and persuasion during science communication, and, most importantly, such humility doesn't trigger "motivated reasoning". Choose communicating thought-provoking content over thought-conclusive content.
Tool 4: Transition From …
… the mindset of monologue (i.e., talking at someone) into that of dialogue (i.e., talking with someone). Transition from the mindset of confrontation (i.e., "us" versus "them", and "we know" versus "they don't know") into that of CARE-frontation: we want to finish the conversation with the relationship intact, and it is my responsibility as the science communicator to use Content × Communication = Porosity to persuade others to trust science, rather than expect them to be persuaded by the facts alone. That is why communication of any kind is both a science and an art. Choose science over sides. Choose effect over ego.
Tool 5: Integrate …
… the lay public into the conversation and the solution, since they are already part of the problem. It is flawed to target mainly policy makers, politicians, and authority figures when communicating science. The public remains our most important stakeholder, whose core values and pain points we should obsess over when curating our science content and communication. The public remains our boss, for whose benefit we work. The public remains our financier, funding our research either directly or indirectly. Researchers found that 48 percent of people trust doctors' information as fair and accurate most of the time, compared with 32 percent for research scientists. The difference lies in the level of engagement between the two. In policy making, choose to test your hypotheses by striking up conversations.
Tool 6: Above all, Prioritise …
… personal mastery before other-mastery. First guard yourself against becoming a black hole for subconscious biases and falling prey to "motivated reasoning", before you guard others. One powerful method for reprogramming the parts of the subconscious mind that don't serve you is habituation. If you want to remain centered on issues, and not get pulled to the extremes by cult-like groups, make it a habit to read books or listen to podcasts by credible voices you agree with, as well as those you don't. But don't just read; habituation is about "consistent experiencing": feeling, reflecting, and questioning. Choose perspective-seeking over perspective-taking.
The good news is that you now have a toolbox for engaging with skeptics and deniers, with the end in mind of initiating and sustaining a civil conversation that promotes science. The best news is that many people already trust science. In Think Again: The Power of Knowing What You Don't Know, thought leader and organisational psychologist Adam Grant reveals that no more than 10 percent of Americans are climate change deniers, and yet they receive the most media coverage, which discourages others from acting, on the assumption that they are up against a Napoleonic-sized resistance. Presenting the few as many in order to sway society's psyche is Machiavellian, but it should only spur us to address the spark before it becomes a flame!