I Reject Your Reality and Substitute My Own

Rex Saffer the AstroDoc
Sep 23, 2023

Introduction

Large surveys conducted over the past decade indicate that most Americans believe that science has improved our lives and that government subsidies of science return substantial benefits over time. At the same time, scientists' views on individual issues often diverge widely from those of the general public, especially in areas such as medicine, climate, and energy.

Some examples from survey participants[1],[2] asked to express an opinion:

1) 88% of members of the American Association for the Advancement of Science (AAAS) believe that it is safe to eat genetically modified foods vs. only 37% of the general public.

2) 87% of scientists believe that climate change is largely due to human activity vs. 50% of the general public.

3) 95% of scientists believe that modern humans evolved from more primitive ancestors, while 46% of Americans believe that God created humans (and everything else, for that matter) in their present form in a single episode within the last 10,000 years.

Why do we believe in science in general but not in specific scientific ideas or results, even when supporting evidence is presented to us? As Hamlet understood, “Ay, there’s the rub.”

Communicating the Science

There is a widespread lack of familiarity with the very nature of scientific inquiry, which can be divided broadly into Theory and Experiment. These are elements of what is called the “Scientific Method”. Click here for a document I give my students on Reporting Scientific Results. To summarize, theoretical and experimental investigations are not independent. Each provides information and feedback to the other.

The results of theoretical investigations are mathematical equations that explain physical phenomena and predict how the laws of nature should operate. In experimental investigations, scientists use instrumental apparatus to obtain quantitative measurements and compare outcomes with the predictions of theory. Both types of investigation are inescapably tainted, if that is the right description, by various forms of error.

By definition, theories are incomplete; otherwise there would be nothing new or unknown to investigate. Theories produce definite, quantitative explanations of observed phenomena, as well as predictions of phenomena that have not yet been observed but ought to be if the theory correctly describes some aspect of nature. These results might well be incorrect due to conceptual or mathematical errors and must be compared against the outcomes of laboratory experiments.

On the flip side, no experimental measurement can be made perfectly. There is always some uncertainty in the value of a length or mass measurement, whether from limitations or miscalibration of the instruments, from human error in using or reading them, or from environmental factors that cannot be controlled. Scientific descriptions of uncertainty are always statistical in nature, which further complicates the effective communication of results involving sometimes abstruse mathematical details.
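
As a rough illustration, here is a minimal sketch in Python (using invented numbers, not data from any real experiment) of how repeated length measurements are typically summarized: not as a single number, but as a best estimate together with lower and upper limits at a stated level of confidence.

```python
import math
import statistics

# Hypothetical repeated measurements of a rod's length, in centimeters.
# The values are illustrative only, not data from a real experiment.
measurements = [12.31, 12.28, 12.35, 12.30, 12.26, 12.33, 12.29, 12.32]

n = len(measurements)
mean = statistics.mean(measurements)          # best estimate of the length
spread = statistics.stdev(measurements)       # scatter of the individual measurements
std_error = spread / math.sqrt(n)             # uncertainty of the mean itself

# A rough 95% confidence interval: mean plus or minus about two standard errors.
lower, upper = mean - 2 * std_error, mean + 2 * std_error

print(f"Length = {mean:.3f} cm, 95% interval: [{lower:.3f}, {upper:.3f}] cm")
```

The form of the statement is the point: the result is an interval with a stated confidence level, not a single "true" value, and communicating that distinction clearly is part of the scientist's job.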

Scientists go to extraordinary lengths to understand uncertainties and transparently communicate them to others in the scientific community and the general public. Done right, science is one of the most honest and forthright enterprises one can imagine.

Intuitive Beliefs & Conceptual Change

Several factors govern the development of core beliefs and the integration of new ideas and evidence into them.

1) We are hardwired to be adaptable and flexible learners, but we are programmed with certain relatively rigid belief systems from a very early age. It could not be otherwise. We learn from our parents and communities, and cognitive development in children takes time to reach a level where critical examination of alternative hypotheses is possible. Like any good tool, once acquired this nascent cognitive function must be honed by mentoring and practical application. I might be blessed with considerable innate musical talent, but if I do not get the appropriate training and put in hundreds or thousands of hours of directed practice, “Chopsticks” might be as far as I get.

2) It is a sad truth that our educational systems are not designed to promote development of higher cognitive skills. Frankly, most K–12 schools spend far too much time on standardized testing, and far too little on critical thinking. Even if there were more emphasis on cognitive development, what student in our era has the time for it? Kids today have so many extracurricular activities that they barely have time to eat, sleep, and do (some) homework. Their moms and dads are equally and frantically consumed with working to pay for it all and running them around from here to there and back again.

3) As mentioned above, scientific evidence is very rarely certain. Experimental outcomes are reported in statistical terms, as a likelihood or probability that the result will be found to lie between some lower and upper limits if the experiment is performed repeatedly. The problem is compounded by a general lack of familiarity with the fundamentals of statistics itself. There is a familiar saying, “There are lies, there are damned lies, then there are statistics.” These things can be learned, but it is uncommon to find them well developed in the general public.

4) Even if scientific evidence is acknowledged to be plausible or probable, most of us still find it difficult to cast aside old ways of thinking for new ones. It requires little effort to casually dismiss or ignore evidence that is contrary to our expectations, and much more to examine apparent contradictions with an open mind. When conceptual change does occur, recent research[3] indicates that newly acquired learning and understanding are layered over older, intuitive beliefs but do not supersede them.

5) Political and religious affiliations, and the shared values that come with them, protect our identities and self–interests. Group wisdom can shield us from harmful disinformation and exploitation.

Motivated Reasoning and Confirmation Bias

We employ motivated reasoning or confirmation bias (MRCB)[4] when we process information and make decisions based on some desired goal or outcome, but we are rarely conscious that we are perceiving or thinking in this way. Our implicit motives are varied: loyalty to a group or institution, personal and financial needs, the preservation of identity, and the avoidance of the uncertainty and anxiety that can arise when we evaluate credible information that conflicts with core beliefs.

Most of us process information that reinforces our existing beliefs more favorably than information that contradicts them, and we tend to dismiss or disregard new information that casts our past choices and behavior in a negative light. Yet a recent study has shown that those who deny the validity of human evolution or anthropogenic climate change are not irrational or scientifically illiterate[5]. On average, they are as rational and well informed on these issues as those who accept the scientific consensus. Given that we derive tangible emotional and social benefits from satisfying our core beliefs and needs, MRCB is perversely rational in some sense and promotes physical and social survival and success.

Cognitive Processing and Emotional Reasoning

When we approach problems that require decision making, we must draw upon two reserves of cognitive resources. Intuitive knowledge and core belief systems are easily accessible, and using them requires only a modest investment of time, effort, and mental energy. Psychologists call this Type I or System I thinking or processing. It is sufficient for many purposes, especially when a decision involves relatively straightforward issues, or the consequences of a bad decision are not very harmful to our physical or financial health, self–identity, social status, or community norms. When Type I thinking fails or is inadequate, Type II thinking is brought to bear. It requires closer attention to external information, the use of more time and memory resources, and the suppression of emotional attachment.

In a Mother Jones article[6], Chris Mooney notes that emotions are expressed in neural processes more rapidly than conscious cognition, convincingly demonstrated by electroencephalogram (EEG) results. Again, it could not be otherwise. We must be able to react extremely rapidly when faced with threatening circumstances. These are called fight or flight responses, and they are emotional, not rational. They play into confirmation bias by being first on the scene when we must weigh evidence against intuition.

If a person who believes deeply in a divine origin for the world and all that is in it, especially human beings, is presented with evidence confirming an evolutionary origin, there can be an immediate, negative emotional response before any conscious processing of information can take place. Existing beliefs are reinforced, leading to an argumentative or contrary response. Some call this rationalizing, an ironic description of an emphatically non–rational phenomenon.

Maladaptive Behavior

Human decision making routinely leads to voluntarily adopted unfavorable outcomes. We frequently make decisions that are very clearly not in our best interests.

I am massively addicted to sweets (cookies, chocolate, and milk to wash them down), which, when consumed in the quantities I prefer, lead inexorably to obesity and increase my chances of developing Type II diabetes. If these sweets are not within reach, I cannot eat them. But I just cannot keep them out of reach. Despite my best intentions, on every third or fourth trip to the grocery store they somehow become piled up in my basket, and from there checked out and transported home. Then I am doomed. I cannot have just two cookies. Once I start eating them, I cannot stop, and two cookies become the entire package, augmented by the chocolate and milk. After I binge through the supply, rational cognition temporarily returns until the next relapse.

Why oh why do I make such choices? In addition to psychological considerations, might there be a physiological component? A recent study[7] performed functional magnetic resonance imaging (fMRI) of blood flow to the posterior medial prefrontal cortex (pMFC), a Vienna sausage–sized brain structure near the top of the skull. Activity in the pMFC is significantly higher when new information or the opinions of others coincide with our own beliefs in decision making processes. In another study[8], transcranial magnetic stimulation above the pMFC reduced choice–induced preference change, the tendency to justify past choices with negative outcomes by paradoxically increasing our current preference for them. Neither study reported or suggested causal mechanisms for their findings.

These studies suggest that I am at war with myself. I am not so much dismissing the opinions and beliefs of others, but my own! I feel bad about choices I have made that are inconsistent with what I know is best for me. For a while, rational decision making keeps me from repeating those bad choices. Then there comes a time when I can no longer sustain choosing what I do not like. I want what I want, and I want it now.

My recollection of these episodes is one where the rational, troublesome feeling that accompanies the thought of reaching for that box of cookies simply vanishes. I do not feel bad about it; I just do not feel anything. I have eliminated the dissonance by somehow eliminating the feeling causing it. I seem to be unable to think rationally without invoking parallel, contradictory emotional content. So counterintuitive.

True or False?

We must distinguish between fact, opinion, and belief on the one hand, and reality and truth on the other. These are philosophical concepts regarding statements (propositions, in the language of logic) about what are called states of affairs: situations or circumstances involving some object or person, or some physical or mental process.

Fact

A fact is a state of affairs supported by direct experience or observation, which does not depend on belief. If I believe that a corned beef sandwich is in my refrigerator, that has no bearing on whether one is there. I open the door, and the sandwich either is there, or it is not. This presumes that my sensory functions are operating normally.

Facts are entities for which there exists some experiment or measurement whose outcome evaluates a proposition as true or false or answers a question. Opinions and beliefs are how a particular person feels about something; they can be formed and held in the absence of empirical evidence, and they are volatile, subject to revision by conscious cognitive processes when one is presented with such evidence.

Reality

This is a metaphysical domain, one that transcends what science can reveal about the objective nature of the world. Metaphysics treats questions that cannot be answered by science, such as the existence of God, the nature of ethical behavior, and whether humans possess free will or are subject to inescapable destiny.

The question, “Is Global Warming Real?” is poorly posed. The subject is not a matter of reality, but a matter of fact. A better question is, “Is the world warmer today than it was in the (recent, distant) past?”. This can be answered by quantitative temperature measurements over time. It is best to leave discussions of reality to philosophers.
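
To make the factual version concrete, here is a minimal sketch in Python of how such a question becomes a matter of fact. The temperature series below is invented for illustration (it is not real climate data); the idea is simply that a least-squares trend fitted to quantitative measurements over time either is or is not positive.

```python
# Hypothetical annual mean temperatures (degrees C) for the years 2000-2020.
# Invented for illustration; not real climate data.
years = list(range(2000, 2021))
temps = [14.32, 14.35, 14.31, 14.40, 14.38, 14.44, 14.41, 14.47, 14.45, 14.52, 14.50,
         14.55, 14.53, 14.60, 14.58, 14.64, 14.62, 14.69, 14.66, 14.72, 14.70]

# Ordinary least-squares slope of temperature versus year.
n = len(years)
mean_x = sum(years) / n
mean_y = sum(temps) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(years, temps))
         / sum((x - mean_x) ** 2 for x in years))

print(f"Fitted trend: {slope * 10:+.2f} degrees C per decade")
```

A positive fitted trend answers the well-posed question in the affirmative for that record; the answer is empirical, not metaphysical.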

Truth

When is something True? And if so, what is the nature of that something? There are several philosophical theories of Truth. One of the oldest and most familiar (Correspondence) goes back to Aristotle. The truth of propositions is determined by their relationships or correspondence with states of affairs existing in the world. Another, more modern notion (Pragmatism) defines a proposition as true if the assumption that it is true leads to some useful or practical result. These are somewhat contradictory, but they both are based on states of affairs, either those expressed as empirical facts or those for which there is a consensus that they are useful. The latter is not at all the same as, “The ends justify the means.”

The truth of propositions is relatively well defined, and they can be stated and evaluated using formal logic. The truth of opinions is another matter. Opinions are by definition a statement of belief or feeling, and while they can be true or false, they are not necessarily so. Some opinions are clearly subjective: “Chocolate ice cream is better than vanilla.” This is undoubtedly true for some (including me!) but not for all. See the next section on the tactics of Denialism for a hilarious example. Some regard an opinion as an invitation to another to engage in a discussion.

Denialism

One definition of Denialism is “The employment of rhetorical arguments to give the appearance of legitimate debate where there is none, an approach that has the ultimate goal of rejecting a proposition on which a scientific consensus exists.”[9] As described above, the motivations for denial are varied, but most center on the denier’s discomfort due to a conflict between deeply held core personal values and widely acknowledged scientific evidence to the contrary.

Those who study Denialism have noted striking similarities in denialist tactics across a broad range of issues: Vaccination, Climate Change, Smoking, Biological Evolution, HIV and AIDS, and the Holocaust. The similarities are so regular that they constitute a Denialism Tactical Playbook. It is inappropriate to call these a strategy, since denialists rarely have long term interests to defend and almost always are interested in short term personal gain. Their measures include:

1) Cast doubt on the science. Conspiracy theories are rampant among denialists. They accuse scientists of secret cabals to withhold relevant information or to disseminate false claims. Given the transparent nature of scientific collaboration, this is ludicrous. And as Benjamin Franklin noted, “Three can keep a secret, if two of them are dead.” Another tactic is to cite surveys of so–called scientists who reject the evidence. An examination of these fake experts reveals that very few have credentials in the area of interest.

2) Question scientists’ intentions and interests. If the evidence cannot be disputed, disparage the sources. Allege conflicts of interest, greed, self–seeking motives, and complicity with corporate goals and government regulatory agencies. This is best done with no evidence, since citing evidence that could be falsified would reveal the fraudulence of the aspersions.

3) Exaggerate differences of opinion. These are a normal, critical element of informed discourse. If the matter were settled, we would not be studying it. Denialists often “cherry pick” isolated results to discredit the usually firm consensus in any field, and widespread controversy on major issues is alleged where none exists. By no means! The things on which scientists agree far outnumber those on which they disagree.

4) Fabricate or exaggerate potential harm. No medical treatment is 100% safe. If you have ever read one of those disclaimers accompanying a pharmaceutical prescription, your eyesight might fail before you reach the end of the list of possible side effects, and yes, many are quite harmful. But no medicine gets FDA approval if a side effect is abnormally morbid or lethal, or occurs more than a very small fraction of the time. There is always some risk vs. reward analysis that balances potential benefits against harm.

Denialists always omit the statistical conclusions of large-scale clinical research studies, conclusions that scientists go to extraordinary lengths to quantify and communicate to their peers. One can argue strongly that denialists understand neither the science nor the treatment of uncertainties, although that very likely would not slow them down if they did.

5) Appeal to personal freedom. This resonates strongly among conservative Americans, who tend to value individual freedoms more than liberals do. These labels are not pejorative, just broadly descriptive. For an outstanding parody of this position[10], click here to see a father discuss the merits of chocolate and vanilla ice cream with his son. Fast forward to 0:52 to begin his disingenuous argument. His initial appeal to personal freedom is followed by an abrupt change of topic, a key feature of the tactic.

6) Accepting the science would require abandoning a core belief. The previous elements of the Denialist Playbook are most often found in organized programs of denial by communities and institutions. Conflict with core values is the cornerstone of individual denial, previously described in the sections on Belief and Conceptual Change, Confirmation Bias, and Cognitive Processing. This is often called Cognitive Dissonance, where unpleasant or painful emotions require the suppression of evidence associated with the conflict. We all are subject to this largely irresolvable tension. This is really the only area in which we might have an opportunity to engage with the other, one on one.

What to Do?

None of this makes much difference in practice. The phenomena are ubiquitous. We reject certain scientific evidence not because we do not believe in science, but because it conflicts with core beliefs and desires. Unfortunately, understanding a conflict does not translate to resolving it. We have very little chance of changing another's point of view with logical arguments grounded in empirical facts. Unaccountably, it often works the other way: in studies of beliefs about contemporary climate change, participants who already held suspicions and contrary beliefs, when shown empirical evidence strongly suggesting a human origin, not only did not change their positions, they became even more firmly entrenched in them.

How should we approach and interact with others whose beliefs and opinions differ so starkly from ours? The first thing to avoid is provocation that produces a defensive emotional response. Language which may seem innocuous to us can be incendiary to others. But larger considerations are 1) What do we want to accomplish? 2) What are our motivations? 3) Is the possible gain worth a probable loss?

Another approach is to ask someone why they do not believe in something. More often than we might imagine, they may not be familiar with details and may find themselves unable to raise a coherent counterargument. We need not capitalize on their discomfort. When a lack of understanding on either side of any issue is brought to light, that is a big step toward meaningful dialog.

In a recent paper[11] the authors remark, “It isn’t always effective to just communicate the facts [about climate change]. It is better to focus on a narrative that people can relate to, often by providing stories about how the scientific information was collected.”

A Live Science article[12] suggests that personal narratives are effective in connecting with others, especially when they involve harm. Here are some examples from the article:

1) Many studies on political differences focus on persuasion and how people’s opinions change, but opinion change is rare.

2) In over 15 separate experiments, researchers found that although people think they respect opponents who present facts, they actually have more respect for opponents who share personal stories.

3) Further experiments found that stories were most associated with increased respect when the experiences were relevant, harm-based and personal. People respected opponents most when they had been through something themselves.

4) It’s hard to maintain doubt when someone tells you, ‘Look, this terrible thing happened to me.’

5) You can still have respect for someone as a human being and appreciate the roots of their views, but at minimum you need to know what those views are.

Closing Thoughts

We live in an increasingly polarized and hostile culture. I have friends and family members whom I love and respect but who feel differently than I do about one thing or another, sometimes quite strongly. I find that, almost without exception, these matters just don’t matter when it comes to my personal relationships with them.

It takes two to tango. If a mutual endeavor is to succeed, both parties need to work cooperatively. The essence of compromise is putting aside some differences long enough to appreciate shared values. It also takes two to tangle. At times, there is so little common ground that any conversation is doomed to devolve into open conflict. Walking away from provocation is not a weakness, and if an argument seems inevitable, best to leave the other in a monologue.

All the best,
On vacation in sunny Tampa, FL,
Saturday, September 23 at 10:15 AM,
Rex

References

[1] Public and Scientists’ Views on Science and Society, Pew Research Center, 2015.
[2] In U.S., 46% Hold Creationist View of Human Origins, Gallup, 2012.
[3] Scientific knowledge suppresses but does not supplant earlier intuitions, A. Shtulman, Occidental College.
[4] Motivated Reasoning, Dan M. Kahan, Discover Magazine, 2011.
[5] On the Sources of Ordinary Science Knowledge and Extraordinary Science Ignorance, Dan M. Kahan, The Oxford Handbook of the Science of Science Communication, 2017.
[6] The Science of Why We Don’t Believe Science, Chris Mooney, Mother Jones, May/June 2011.
[7] Confirmation bias in the utilization of others’ opinion strength, Kappes et al., Nature Neuroscience, 2020.
[8] A Causal Role for Posterior Medial Frontal Cortex in Preference Change, Izuma et al., Journal of Neuroscience, 2015.
[9] Denialism: what is it and how should scientists respond?, European Journal of Public Health, 2009.
[10] Thank You for Smoking, Fox Searchlight Pictures, 2005.
[11] Escaping the paradox of scientific storytelling, Public Library of Science, 2018.
[12] Facts don’t convince people in political arguments. Here’s what does., Stephanie Pappas, Live Science, 2021.
