By Eric Vandenbroeck 9 Nov. 2018

It is no secret that around one hundred biases have been repeatedly shown to exist and can make a hash of our lives. For lists of cognitive biases, see also here and here.

As someone who works with information a lot, I am aware that there are many tendencies, both in our context and in how we think, that can lead us astray and distort our view of the world. But this also raises the question: is there something we can do about it?

Here are a few hints I have come across that tend to work for me. The following eight ideas aren't just relevant for those rare occasions when you're asked questions about social realities in quizzes, or in one of our surveys, or when you want to impress people at your table at a wedding dinner with your knowledge of teenage pregnancy rates around the world. They have broader applications for how we see the world, what we prioritize and how we approach new information. I've started with points more related to how we think as individuals, moving through to more society-wide actions.

1. Things are not as bad as we think, and most things are getting better

Emotional innumeracy is one of the most important concepts in explaining why we're so wrong about so many social realities. Our concern causes our overestimation as much as it results from it. This makes misperceptions a useful clue to what really does concern us, but it also means that we can control our misperceptions if we recognize what we're worried about.

This is related to a broader point: most social realities are getting better. This isn't true of everything, and those that are improving are often not getting better as quickly or as much as we'd like. But starting with the assumption that most things are improving over time is more likely to be accurate than the opposite.

This shortcut is not just useful because we miss the great strides being made. It's important because we're wired to think the opposite. We tend to suffer from "rosy retrospection", where we edit out the bad from the past and emphasize the good. This is a useful human characteristic, as it stops us dwelling on our historical pain and frees up mental space. But it also encourages a faulty view that today is worse than ever. It's crucial we avoid this perception, as we know that some sense of success is an important motivator for both how we act and how we feel. More than this, too pessimistic a view of how things are changing can cause extreme reactions, where we rip up what's been achieved because we're blind to the progress we have made.

2. Cultivate skepticism, but not cynicism

In Annals of Gullibility: Why We Get Duped and How to Avoid It, Stephen Greenspan suggests we cultivate skepticism but not cynicism, because there are dangers in being too far toward either end of the spectrum.(1) It's a difficult line to tread, but a vital one.

We've seen throughout that one of the fundamental challenges in building an accurate view of the world is our deep desire to avoid cognitive dissonance, our reluctance to let go of things we already believe. This leads to all sorts of quirks of confirmation bias, directionally motivated reasoning and asymmetric updating that allow us to dismiss contrary information and take only the points that support our case.

However, some skepticism is valuable, and attitudes should have some inertia; otherwise we would be flopping around, always believing the last thing we heard.(2)

Cynicism allows us to dismiss contrary information too easily, but being too open allows us to be easily duped. The media environment is full of extremes we need to guard against. It's not just about the gore implied in the journalistic cliché "If it bleeds, it leads." Evan Davis, the BBC journalist, in his book on post-truth tells of another old adage in the media: "first simplify, then exaggerate." As he describes it, those who work in the media have to sell their programmes to editors and audiences, and that sometimes means trying to make them sound big even if the material is "small or medium." He outlines how a fact is reported, a legitimate interpretation is placed on the fact, but then it is "puffed up to a magnitude beyond anything it deserves." It's much easier to get pulled into this far more common trap than into anything concerning "fake news."(3)

James Pennebaker is a US social psychologist, most famous for his experiments that show how just writing about our emotions can improve our health. He also suggests a more active way of interacting with the media, advising that we change how we consume news, from passive receptivity to actively thinking about the information and trying to make sense of it. In our online world, this is akin to the lateral reading strategies used by fact-checkers, verifying as we go. This may be too exhausting to do all the time, but a little more could help.(4)

3. Accept the emotion, but challenge the thought

I realize this reads like an 'inspirational' Facebook thought for the day. It's true, this quote is based on a line from a self-help book on mid-life crises by Andrew G. Marshall (not that I've read it), but it also works perfectly for how we see realities.(5) Denying that we have an emotional reaction (whether positive or negative) is pointless and impossible, but accepting these emotions and trying to understand them is not. Tempering our immediate emotional reactions with more deliberative, contemplative thought is much more difficult, but that's the key.

4. Other people are not as like us as we think

When there is so much confusing and apparently contradictory information around, it's understandable that we have a natural tendency to fall back on our own direct experience and assume that all we see is all there is. Some of the biggest errors in our estimations can be traced back to thinking that we, and our circle of friends, are absolutely typical. This is a problem not just because we are often not as typical as we think (as with online Indians), but also because we are often very wrong about our own characteristics (for example, when we underestimate our own weight or sugar consumption). An appreciation of how different other people are, and how misguided we can be about ourselves, is important in forming a more accurate view of the world.

5. Our focus on extreme examples also leads us astray

On the other hand, there are also many examples where we stereotype others, often assuming the worst. We need to consider the extent to which our views are affected by the one vivid anecdote we happen to remember. We're naturally drawn to extreme examples, which means that true but vanishingly rare events or populations take up more of our mental capacity than they deserve. We think of destitute asylum seekers when asked about immigration, we recall the one vivid story about teenage mums, and we are distracted by the horror of the most lurid terrorist incident. But these are not representative; most things are not so remarkable. The norm is usually more boring than our mental image.

Combating this is partly about knowing where you sit within your society and appreciating its diversity, but also about opening yourself up to different perspectives.

6. Unfilter our world

In our increasingly online existence, opening up our perspective means trying to pop our filter bubble and break out of our echo chamber. There are no easy answers, but there are approaches worth engaging with. With events like the Facebook/Cambridge Analytica scandal unfolding, we're likely to see pressure grow for at least some further action to be taken.

On a personal level, there is a growing number of tools available. For example, FlipFeed allows you to see, at random, the Twitter feed of someone with a view diametrically opposed to your own. The "Read Across the Aisle" app positions itself as a health aid for our confirmation bias: "this app will notice when you've gotten a little too comfortable in your filter bubble, and it'll remind you to go see what other folks are reading."(6)

Mainstream media outlets are trying similar approaches. The Wall Street Journal created "Blue Feed, Red Feed" to reflect the different political slants of content. BuzzFeed's "Outside Your Bubble" pulls in opinions from across the spectrum of views, and the "Burst Your Bubble" weekly column in the UK's Guardian curates "five conservative articles worth reading" for the paper's more left-leaning audience.(7)

7. We also need to tell the story

Although facts are important, they are not sufficient, given how our brains work. We need to be aware of how people hear and use them, turning them into stories that might not always lead to the right conclusions. This echoes psychologist Robert Cialdini's concern about the dangers of using descriptive norms (that is, what the majority are thinking or doing) to illustrate how serious an issue is. Telling people that most people are overweight or obese is a useful fact to shock us out of complacency, but as well as hearing that obesity is a big problem, there is a real risk that people hear that it's normal. As we know, we follow the herd: if we hear that other people are doing something, we are more inclined to do it too, even if it's bad for us.

This is why campaigners on contentious issues have learned to focus on a story, rather than statistics.

Michael Shermer, the science writer and founder of the Skeptics Society, highlights steps you can take to convince people of errors in their beliefs, including the importance of discussing rather than attacking, acknowledging that you understand the basis for an opinion, and trying to show how changing our understanding of the facts doesn't necessarily mean changing our entire worldview.(8)

There is no contradiction between facts and stories; you don't need to choose only one to make your point. The power stories have over us means we need to engage people with both.

8. Facts still count, and fact-checking is important

The literature on the use of facts to correct misperceptions shows very mixed results. It sometimes works, sometimes works in a limited way, and sometimes doesn't work at all. The effects sometimes last over a longer period, and sometimes they don't. A lot depends on the issue being tested, how it's done and what we're expecting to shift.

That makes perfect sense when we bear in mind the theory of cognitive dissonance and consider what we know about how we think. We naturally look for confirming information and discount disconfirming information. But when the evidence reaches a tipping point and there is sufficient weight against our current view, we switch. The dissonance is emotionally unpleasant, and however attached we are to our current opinions, it eventually becomes less unpleasant to shift than to cling on to them.

The message is that we can't always solve misperceptions with more facts alone, but we definitely shouldn't give up on them entirely. People are marvelously varied, and different approaches work with different people in different situations.

Regardless of how effective correcting people or information is, there are ethical considerations. It's simply wrong to misuse facts, and there should be accountability, particularly when disinformation has such significant consequences, as with vaccine take-up. It's easy, but incorrect, to conclude that people are just stupid when they're actually being exploited or failed by those creating and controlling information.

Without deterrents and without the threat of being picked up and corrected, the extent of disinformation will be much worse. Fact-checking may be a minor deterrent to those who don't really care, but some do, and being pulled up has already shifted behavior.

Of course, fact-checking is about more than correcting disinformation that is already out there or shaming those who create or propagate it. It is increasingly about getting in first, building fact-checking into the system and stopping the disinformation before it starts. We need to invest in these approaches with commitment and ingenuity at least equal to that of those developing tools and content to spread disinformation.


1. S. Greenspan (2009). Annals of Gullibility: Why We Get Duped and How to Avoid It. Praeger.

2. C. S. Taber & M. Lodge (2006). Motivated Skepticism in the Evaluation of Political Beliefs. American Journal of Political Science, 50(3), 755–769.

3. E. Davis (2017). Post-Truth: Why We Have Reached Peak Bullshit and What We Can Do about It. Little, Brown.

4. J.W. Pennebaker & J.F. Evans (2014). Expressive Writing: Words that Heal. Idyll Arbor.

5. A. G. Marshall (2015). Wake Up and Change Your Life: How to Survive a Crisis and Be Stronger, Wiser and Happier. Marshall Method Publishing.

6. Read Across the Aisle (n.d.). A Fitbit for Your Filter Bubble.

7. C. Wardle & H. Derakhshan (2017). Information Disorder: Toward an Interdisciplinary Framework for Research and Policy Making. Council of Europe.

8. M. Shermer (2016). When Facts Backfire. Scientific American, 316(1), 69.