Published 20 June 2018 by Arunima Roy

Lucidity in the Post-Factual Era: How to Unsee the Emperor’s New Clothes

"What can you do against the lunatic who is more intelligent than yourself, who gives your arguments a fair hearing and then simply persists in his lunacy?" – George Orwell, Nineteen Eighty-Four

 

After a tumultuous reign of 13 years, Emperor Nero committed suicide on 9 June 68 AD, surrounded by at least five witnesses. Within a few months, despite an official burial, rumours were circulating that Nero was alive. By autumn that year a false Nero had appeared, then another a few years later, and then another. Adding to the state’s woes, the three Nero claimants were popular with the public. Rather than being able to ignore the inept rabble-rousers, the state had to ‘eliminate’ them. Yet citizens continued to believe their emperor would one day return. In the end, this belief persisted for over four centuries – enough time for Nero, or any man, to have died of natural causes.

 

Romans questioned Nero’s death, which led to some interesting impostors claiming to be the Emperor, back from the dead. Picture/Credit: Michael Wheatley

Mass hysteria and an unwillingness to look at facts objectively are, it seems, nothing new. Hans Christian Andersen mocked societal gullibility in ‘The Emperor’s New Clothes’. He was not the first: similar stories have been around since medieval times, so people in the past were also somewhat aware of their intellectual failings. By 1841, our accumulated foibles took up three volumes of Charles Mackay’s sensational “Extraordinary Popular Delusions and the Madness of Crowds”. These anecdotes are a testament to humanity’s longstanding fascination with conspiracy theories and its preference for emotional arguments over facts and reasoning.

So what has changed? What makes our age worthy of the title ‘post-factual’? Perhaps we are horrified (and rightly so) that we, in the 21st century, can still fall prey to false information. We are more educated, more eager to seek knowledge and have easier access to information than previous generations. Yet the idea that we are no different from our credulous ancestors is hard to digest.

Is it that more people are vulnerable to fake news and conspiracy theories now than before? This is possible, but hard to quantify or prove. One aspect of our lives that has clearly changed is the speed at which information travels. False information can thus penetrate more easily and spread faster.

“But wait,” you say, “doesn’t faster communication mean that both false and true information spread rapidly?” In theory, the damage caused by misinformation should be cancelled out by the good done by true information. Alas, that is not the case.

Our brain is not infallible. It likes rewards, especially easily accessible ones: memes, cat videos, rants disguised as opinion posts. Cognitively demanding tasks are tedious and require mental energy and attentional focus. Reasoning, logical thinking and sifting through fallible information sources don’t stand a chance when presented next to gratifying social media tidbits. Seeking information that fits one’s world view and interacting with people who share that view are also rewarding. The gratifying nature of these pursuits, combined with their easy accessibility, leads to a dangerous situation akin to addiction.

 

The inability to focus attention is further accentuated by information overload. Picture/Credit: Michael Wheatley

It is hard to focus on work when, say, a barrage of social media updates is demanding your attention. An inability to contemplate, or to focus attention for long on a single issue, is seen in a more severe form in a psychiatric disorder called attention-deficit/hyperactivity disorder (ADHD). ADHD affects about 11% of the population. A certain number of symptoms is required for a diagnosis; however, most of us share some of the traits. This inability to focus attention is further accentuated by the information overload that accompanies our modern lives. So, can we deal with the post-factual age if we understand ADHD and recognise why we pay attention to cognitively gratifying information?

Paying attention is tricky. Understanding how we pay attention has proven trickier. Say you want to read a book, but your roommate decides to watch Game of Thrones. What do you do? Chuck the book and watch the show? Read the book despite the vainglorious background score tempting you to the television screen? How good you are at getting yourself to read that book may well depend on you, or on the specific circumstances that day (not to mention what it is you are reading – if it is Ulysses, I would probably watch the show, exam or no exam). But how do we make these decisions? Where in the brain are all the possible alternatives weighed up and a decision made as to where we should focus our attentional energies? These remain open questions.

What we do know is that it is hard to pick the book over the show, and harder still to do your own critical analysis than to read ready-to-consume opinion pieces. What we also know is how to package information so that it goes from pleasing to bewitching. Science has provided a good understanding of our cognitive biases. Unfortunately, this has only made us more susceptible to groupthink, as we use that knowledge to develop ever more cunning ways to capture people’s attention. We are at a point where no one – not psychologists, not scientists, not the developers of these attention-grabbing technologies, not even the person writing this post – is safe from the lure of newsfeed scrolling.

So here is what we can do: first, we must support research to understand how we (do not) pay attention. Second, we have to use that knowledge to develop solutions for our mental shortcuts, not more attention-grabbing technologies.

Another solution could be to educate people to actively question information, seek out contradictory reports, and judge the quality of the information presented on both sides – a method familiar to scientists. However, what happens when our information sources are muddied?

 

We seem to prefer emotional assertions over logical arguments. Picture/Credit: Michael Wheatley

Sources of information are hard to scrutinise, and what at first seems legitimate may well be propaganda. I discovered this a couple of years ago when a group of friends and I were discussing the dehydrating potential of soft drinks. Unfortunately, all the scientific articles we found were financially supported by one or another caffeine company, making it very hard for us to find an opposing, unbiased viewpoint. In the end, we failed to find a single scientific study on the topic that was not funded by a caffeine company. We never resolved that argument.

What is worrisome is that such potential conflicts of interest are tucked away, and many may not know where to look for them. A lot of shoddy science can also be concealed if one is not familiar with statistical or scientific terminology. We must teach people strategies for digging up conflicts of interest and vested interests. Often, the public is not even aware of legitimate references and where to find them. This is another reason for our susceptibility to false information: we have the resources at our disposal but have not been taught how to use them. It has been said that people in this age are keener than those before them to seek out knowledge (and resources) to support their claims. More often than not, however, these sources do not provide empirically derived results.

So there seem to be two ways out. One, science communication needs to move beyond retelling new discoveries in a consumable format to empowering the public to do their own research. This includes teaching analytical skills and recommending verifiable information sources. Two, we must encourage research into attentional mechanisms. This would help us eventually develop strategies for tackling information overload and keeping instantly gratifying information bites at bay.

A looming question, however, is whether we should attempt to alter people’s preferences in this manner. Perhaps science can give us tools to aid critical thinking and logical reasoning. But is this ethical? If our civilisation is spiralling towards self-destruction and science can save it, there is an argument for doing so. But is imposing the views of one group on others the morally correct thing to do?

Arunima Roy

#LINO18 Alumna Arunima Roy works in public health as an analyst. She previously studied the effects of environment on mental health disorders, including attention-deficit/hyperactivity disorder (ADHD). When not at work, she will be either playing computer games or trying her hand at writing fiction. She is passionate about science communication and hopes one day to parent two dogs whom she will name ‘Crispr’ and ‘Cas’.