We do not see the world as it is. We see it as we are.
Which sounds more interesting: "13 out of 1,000 asylum seekers committed at least one crime in their first year of residence", or, "987 out of 1,000 asylum seekers lead well-adjusted lives"? What do you think: Have official crime rates for murder and theft in Germany been increasing or decreasing over the last decades? And have you ever trusted a statement just because it seemed familiar?
People rely on their minds when searching for information, assessing relevance and evaluating likelihood. But what if our minds are not as rational and objective as we would like to believe?
In this kick-off article in the Bonn Institute series "Psychology in Journalism", we present a selection of cognitive effects that are directly relevant to daily journalistic work. They affect:
- What spontaneously attracts attention in a complex environment;
- Which information is best processed and stored;
- How information is reinterpreted according to current beliefs and goals;
- How people evaluate risks and opportunities;
- How trustworthy a piece of information appears to be.
These effects impact not only media consumers but also journalists as they research and conduct interviews, set up cameras and microphones, and piece it all together for publication. One of the best known and best studied biases is the negativity bias.
The negativity bias: Attuned to potential threats
Attention is a precious commodity. We don't have an unlimited amount of it, so at any given moment we must prioritise, and when in doubt, quickly. It is therefore not surprising that automatic processes give priority to loud, flashy or new stimuli. Similarly, negative information about potential threats spontaneously attracts more attention than neutral or pleasant, presumably harmless information.
In a nutshell:
In terms of information processing, bad is stronger than good. People pay more attention to negative stimuli, perceive them more quickly and easily, process them more deeply and remember them better. Negative events trigger stronger physiological and emotional responses. Moreover, experiments have shown that negative information has a greater influence on judgments and decisions, impression formation and the assessment of social relationships than stimuli evaluated as neutral or positive (1).
The negatively biased selection, processing and evaluation of information also alter communication on a topic, such as through word choice. Apart from the fact that journalists themselves are affected by negativity bias, two other factors likely amplify the effect in journalism:
- Knowing that negative headlines work well to capture audience attention;
- Closely following what other media are reporting on, which additionally influences how topics are selected and presented.
The above-mentioned aspects interact dynamically via mutually reinforcing tendencies, leading to an "eternal permanent crisis" or the "daily end of the world" in news coverage, as Maren Urner vividly sums it up in the titles of her bestselling books (2, 3).
Example of negativity bias in selection and framing
Media coverage often uses simplistic and biased narratives to attract attention. A team of researchers analysed coverage of Hurricane Katrina, which struck the United States in 2005. They found that much of the wording used in the news largely lacked a reliable factual basis and was selectively biased toward negative aspects (4). The media conjured up near civil war-like conditions: The impacted population was portrayed either as marauding, looting hordes or as helpless victims in need of protection. According to the researchers, the coverage ignored the vast majority of people who behaved calmly and reasonably and showed solidarity with others. This not only violates journalism's responsibility to present a truthful picture; coverage of local initiatives could have also proven practically helpful and inspiring for others affected.
Negativity bias – but why?
The universal and robust nature of the negativity bias is often explained in evolutionary terms (5). Those ancestors who reacted quickly to signs of danger had a selection advantage: They ran away, saved their lives and continued to reproduce happily ever after – even if they may have occasionally overreacted and run away unnecessarily. In contrast, those who were less alert to danger signals were more likely to fall victim to predators, avalanches or floods, which probably led to the extinction of this "relaxed" gene over millennia.
In addition, it was essential for our Stone Age ancestors to recognise cheaters who did not adhere to the social rules of fair cooperation (6). Here, one negative act erases 99 positive experiences. Some politicians or companies have learned the hard way that once trust has been squandered, it is extremely difficult – or even impossible – to regain it.
What are the consequences?
Negativity bias not only leads to a selectively negative worldview. It also affects what we feel, what we think and even how we think and behave:
Negativity does not make us feel good – quite the opposite, in fact. The constant preoccupation with negative topics is emotionally stressful; it makes people sad or angry. However, the dose makes the poison – and the framing: Do we understand the background and context of a negative event? Are there different perspectives on it or ideas about what could be done? Or is it just bad news, bad news, bad news, and then the weather? Research shows that reports that include possible solutions leave people feeling less anxious, depressed and helpless than reports that focus exclusively on problems (7, 8).
Negative emotions also impair cognitive abilities, both in the present and the long term when such experiences accumulate. Our field of vision narrows under extreme stress, and we slip into tunnel vision mode. Negative emotions make us focus on details rather than the big picture, and the subjectively perceived scope for action is more limited than when we are in a good mood (9). In addition, concentration and memory suffer, and the ability to take on different perspectives is inhibited. We shut down and switch off, presumably to avoid further overstimulation.
As a result, we become less empathetic and less willing to make new social contacts (10). We are also less willing to try new things: When people in an experiment were given a choice between conventional and new, exotic snacks, participants in a negative mood were less willing to try the new ones (11). A far-reaching behavioural consequence of too many news-induced negative feelings is news avoidance, as the Reuters Digital News Report 2022 reconfirmed (12). This is presumably another protective mechanism.
Confirmation bias: Why we love being right
It's a good feeling to have known it after all – ideally all along. Being right is rewarding. It gives us the impression that we are smart and have things under control. Whether we actually are right seems less important.
In a nutshell:
From the initial information search to the weighting and interpretation of information to memory recall, people prefer information that confirms their view. Moreover, people become quite creative in reinterpreting information to support their beliefs (14). When this is not possible, they sometimes tend to discredit the source as untrustworthy, perhaps even as "part of the conspiracy". In general, people prefer media that reflect and thereby reinforce their attitudes (15).
If you are a journalist planning a report and are convinced you know who is good, who is bad and what is ugly right from the start, you may easily overlook essential information that could prove you wrong. Don't leave that opportunity to others. Opening your eyes and ears to dissenting, even contradictory evidence will often make your piece more nuanced and likely bring it closer to the truth.
Like negativity bias, confirmation bias is one of the best researched phenomena in cognitive science. Early studies aimed to show that humans are inherently not good at testing hypotheses adequately, that is, critically. Good science, as Sir Karl Popper put it, requires that people try to disprove themselves (16). But people prefer to seek confirming information when testing ideas. For example, people who are asked, "Are you satisfied with your social life?" report greater satisfaction than those who are asked, "Are you dissatisfied with your social life?" It is believed that people selectively retrieve examples from memory that match what they are asked about – those who seek will find. Thus, a biased question can unintentionally lead to a biased answer, at least in areas that are not completely objective and consistent (17). This effect is highly relevant for journalistic research and interviews.
Confirmation bias – but why?
As mentioned earlier, the feeling of being right fulfils our need for high self-esteem and control over the environment, both of which provide psychological safety. Moreover, processing confirming information costs less cognitive energy and is therefore more pleasant than the laborious analysis and integration of new facts. Processing information that contradicts our opinion triggers emotional stress in the brain, as shown in an fMRI study (19).
What does this lead to?
One of the main consequences of confirmation bias is overconfidence in one's biased worldview. When a selective search yields a few confirming facts and non-confirming facts are ignored or discarded as irrelevant, one's view ceases to feel like a mere opinion; instead, it feels like it is based on evidence. Moreover, small doses of counterarguments can even reinforce the conviction of being right. This is because finding explanations that resolve a discrepancy, ultimately reconfirming a belief, acts like an inoculation for later "attacks" on one's worldview (20).
More biases: When ease makes it harder
There is often little time for journalistic research, which encourages recourse to what are known as heuristics. These mental shortcuts enable quick conclusions, are usually sufficiently accurate in everyday life and require minimal cognitive effort. Sometimes, however, they lead us astray (21). We'll come back to some heuristics and biases in later articles of our series, so for the moment, we will limit ourselves to two:
- Availability heuristic: Let's think back to the initial question: Have murder and theft rates in Germany trended up or down in recent decades? When asked to estimate frequencies or probabilities, people usually try to recall relevant individual cases. Some are very vivid and therefore well "available", i.e. they come to mind easily. This perceived ease of memory retrieval leads us to the heuristic conclusion that there must be many more such examples; in other words, that such cases occur frequently (21). Because murders and grand larcenies are newsworthy and draw media coverage – which may be intense depending on the context – vivid, readily available memories are created, and the overall frequency of such incidents is overestimated. In addition, more recent events are often better remembered than those that occurred longer ago. Returning to our initial question: Because people are frequently exposed to new crime stories in the media, this effect further reinforces the mistaken impression that crime rates are rising, even though the actual numbers have been clearly declining for many years (22).
In short, our estimates of frequency and likelihood are impacted by how easy it is to retrieve something from memory. Furthermore, the ease of perceiving and processing information affects how strongly someone feels something to be true and how they emotionally evaluate it:
- Illusory truth effect: "They will try to steal the vote. They stole the vote. They stole the vote!" If you say that often enough, people will be inclined to believe it. This is the illusory truth effect: People grow more used to a piece of information the more often it is presented to them; it becomes easier to process cognitively, and it feels familiar. As a result, it is more likely to be considered true, even if people actually know better (23).
- Mere exposure effect: Because feelings of familiarity are inherently pleasant, people also prefer stimuli (e.g. a logo, a melody or a line of reasoning) that they have seen or heard before over new ones (24). This simple strategy of repeated presentation is often used in politics, by authorities and institutions, and in the marketing and advertising industries to influence preferences and choices by creating feelings of familiarity.
Being exposed to strategically repeated misinformation is one of the occupational risks of being a journalist. Another is providing a platform to those who repeat their own "truths" or seek mere presence in media coverage.
The Dunning-Kruger effect or 'We don't know what we don't know'
One final thing to note: The less people know about a subject, the more knowledgeable they consider themselves to be. Named the Dunning-Kruger effect (25) after its discoverers, this irony rests on the fact that a certain minimum level of knowledge and understanding is needed to grasp the abundance and complexity of facts and contextual conditions in a subject area. Only then can one know, with Socrates, that one knows (still almost) nothing. Others, meanwhile, consider themselves experts after watching a 10-minute video. The Dunning-Kruger effect might partly explain why people who are aware of their cognitive biases tend to be much humbler when it comes to claiming objectivity – and why the unaware tend to the opposite.
Tools and tips: How to apply this knowledge to your daily journalistic work
How you can counteract the negativity bias:
Imagine wearing filter glasses during research that only show favourable developments or possible solutions. What would you see? Include at least some of these aspects in your reporting. They may seem minor, or perhaps they appear only at second glance, but they can help remedy a negativity bias. This in no way means your article will sugarcoat reality.
Instead, it will present a more complete picture, one that includes problems that need to be solved and ideas for possible solutions. Presenting examples (e.g. other places) where things are going a little better can even increase the pressure on politicians or companies, who like to argue that there are constraints and that no other way exists.
How you can counteract the confirmation bias:
- Whatever seemed to be the essence of your report so far – consider the opposite: The protagonist is innocent; the company means well; the crazy kid is right. What kind of questions would you now ask? Where would you go? What statistics would you draw upon? This strategy is also known as "playing devil's advocate." To help your sources also avoid negativity or confirmation bias, ask unexpected and solution-oriented questions. This can add an interesting twist to your story and paint a more nuanced, complete picture.
- Make it a team ritual to have one part of the desk or department wear the "devil's advocate" hat (or take on another similar symbolic position). Their job is to then identify unbalanced reporting and challenge it with solutions-tinted glasses or with a "consider the opposite" mindset and research prompts that counter biases. Alternate who does this on a monthly basis. This can foster more flexible thinking and, in the best case scenario, rewards you and your audience with more truthful stories and inspiring new perspectives.
- Surround yourself with a wide variety of people from different backgrounds. What contexts and people might keep you from ending up in a social bubble of mutual validation and illusory truth effects due to the constant repetition of similar views? Consider how different dimensions of diversity in your newsroom might help ensure more diverse perspectives in the topics you choose to cover and the reports you produce.
- Be wary of frequently repeating certain statements when moderating debates or hosting other formats, as some people may abuse the podium to spread "fake truths".
References
1. Norris, C. J. (2021). The negativity bias, revisited: Evidence from neuroscience measures and an individual differences approach. Social Neuroscience, 16(1), 68-82.
2. Urner, M. (2019). Schluss mit dem täglichen Weltuntergang: Wie wir uns gegen die digitale Vermüllung unserer Gehirne wehren. [Stop the daily apocalypse: How to fight back against the digital littering of our brains.] München: Droemer.
3. Urner, M. (2021). Raus aus der ewigen Dauerkrise. Mit dem Denken von morgen die Probleme von heute lösen. [Getting out of the eternal crisis: Solving today's problems with tomorrow's thinking.] München: Droemer.
4. Tierney, K., Bevc, C., & Kuligowski, E. (2006). Metaphors matter: Disaster myths, media frames, and their consequences in Hurricane Katrina. The Annals of the American Academy of Political and Social Science, 604(1), 57-81.
5. Rozin, P., & Royzman, E. B. (2001). Negativity bias, negativity dominance, and contagion. Personality and Social Psychology Review, 5(4), 296-320.
6. Cosmides, L., Tooby, J., Fiddick, L., & Bryant, G. A. (2005). Detecting cheaters. Trends in Cognitive Sciences, 9, 505-506.
7. Curry, A., Stroud, N. J., & McGregor, S. (2016). Solutions journalism and news engagement. Engaging News Project/Annette Strauss Institute for Civic Life at the University of Texas at Austin.
8. McIntyre, K. E. (2015). Constructive journalism: The effects of positive emotions and solutions information in news stories. Dissertation, University of North Carolina at Chapel Hill.
9. Fredrickson, B. L., & Branigan, C. (2005). Positive emotions broaden the scope of attention and thought-action repertoires. Cognition & Emotion, 19, 313-332.
10. Lyubomirsky, S., King, L., & Diener, E. (2005). The benefits of frequent positive affect: Does happiness lead to success? Psychological Bulletin, 131, 803-855.
11. Kahn, B. E., & Isen, A. M. (1993). The influence of positive affect on variety seeking among safe, enjoyable products. Journal of Consumer Research, 20, 257-270.
12. Newman, N., Fletcher, R., Robertson, C. T., Eddy, K., & Nielsen, R. K. (2022). Reuters Institute Digital News Report 2022. Available at https://reutersinstitute.politics.ox.ac.uk/digital-news-report/2022
13. Fredrickson, B. L. (2013). Positive emotions broaden and build. In Advances in Experimental Social Psychology (Vol. 47, pp. 1-53). Academic Press.
14. Klayman, J. (1995). Varieties of confirmation bias. Psychology of Learning and Motivation, 32, 385-418.
15. Knobloch-Westerwick, S., Mothes, C., & Polavin, N. (2020). Confirmation bias, ingroup bias, and negativity bias in selective exposure to political information. Communication Research, 47, 104-124.
16. Popper, K. R. (1963). Science as falsification. Conjectures and Refutations, 1, 33-39.
17. Kunda, Z., Fong, G. T., Sanitioso, R., & Reber, E. (1993). Directional questions direct self-conceptions. Journal of Experimental Social Psychology, 29, 63-86.
18. Cosmides, L. (1989). The logic of social exchange: Has natural selection shaped how humans reason? Studies with the Wason selection task. Cognition, 31, 187-276.
19. Westen, D., Blagov, P. S., Harenski, K., Kilts, C., & Hamann, S. (2006). Neural bases of motivated reasoning: An fMRI study of emotional constraints on partisan political judgment in the 2004 U.S. Presidential election. Journal of Cognitive Neuroscience, 18, 1947-1958.
20. Compton, J. (2013). Inoculation theory. In The SAGE Handbook of Persuasion: Developments in Theory and Practice, 2, 220-237.
21. Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124-1131.
22. Bundeskriminalamt (n.d.). Polizeiliche Kriminalstatistik. [Police crime statistics.] Available at https://www.bka.de/DE/AktuelleInformationen/StatistikenLagebilder/PolizeilicheKriminalstatistik/pks_node.html
23. Fazio, L. K., Brashier, N. M., Payne, B. K., & Marsh, E. J. (2015). Knowledge does not protect against illusory truth. Journal of Experimental Psychology: General, 144, 993-1002.
24. Montoya, R. M., Horton, R. S., Vevea, J. L., Citkowicz, M., & Lauber, E. A. (2017). A re-examination of the mere exposure effect: The influence of repeated exposure on recognition, familiarity, and liking. Psychological Bulletin, 143, 459-498.
25. Dunning, D. (2011). The Dunning-Kruger effect: On being ignorant of one's own ignorance. Advances in Experimental Social Psychology, 44, 247-296.