A well-known example of the severe consequences of misleading statistics is a scare regarding the contraceptive pill, reported in Gigerenzer et al. (2007). In October 1995, the Committee on Safety of Medicines warned GPs that third-generation contraceptives increase the risk of potentially life-threatening blood clots by 100%. As would be expected, this caused considerable distress, and many women stopped taking the pill, leading to unwanted pregnancies and an estimated 13,000 additional abortions (Furedi, 1999). The main issue here, however, is that although the risk did indeed increase by 100% in relative terms, the absolute increase was very small. The studies on which the warning was based reported an increase from 1 in 7,000 women suffering a blood clot when taking second-generation contraceptive pills to 2 in 7,000 women taking third-generation pills. Had people had a better understanding of statistics, and in particular of the difference between absolute and relative increases, this distress could have been avoided.
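The distinction can be made concrete with a short calculation using the figures above (1 in 7,000 versus 2 in 7,000); this is a minimal sketch, not part of the original studies:

```python
# Risk figures from the 1995 pill-scare studies cited above:
# second-generation pills: 1 in 7,000 women; third-generation: 2 in 7,000.
baseline_risk = 1 / 7000
new_risk = 2 / 7000

# Absolute increase: extra cases per woman taking the pill.
absolute_increase = new_risk - baseline_risk

# Relative increase: change expressed as a fraction of the baseline risk.
relative_increase = (new_risk - baseline_risk) / baseline_risk

print(f"Absolute increase: about 1 extra case per "
      f"{round(1 / absolute_increase):,} women")
print(f"Relative increase: {relative_increase:.0%}")
```

The same change in risk is "one extra case in 7,000 women" when reported absolutely, but "a 100% increase" when reported relatively, which is why the choice of framing mattered so much.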
Although this misleading presentation of a statistic was unlikely to have been intentional, in many instances statistics and research are deliberately misrepresented by corporations for their own gain, which is something the public needs to be made more aware of. For example, in order to gain readers, newspapers may choose to present statistics in the most shocking and attention-grabbing way possible (Gigerenzer et al., 2007). Furthermore, in some circumstances, companies can manipulate the way in which they present statistics in order to sell more products. An example of this is the toothpaste brand Colgate's claim that '80% of dentists recommend Colgate'. Although this was true, the wording suggests that 80% of dentists recommend Colgate toothpaste over the toothpaste made by other brands. However, a closer look at the study, which Colgate itself commissioned, shows that the dentists were allowed to recommend more than one brand of toothpaste.
The main aim of our project was to reduce the likelihood of people being taken in by misleading statistics and claims by improving their statistical literacy. Kahneman (2011) distinguishes between two types of thinking: system 1 and system 2. System 1 thinking is fast and automatic, relying on heuristics and biases; when an individual is thinking in this way, they are more susceptible to believing misleading statistics. System 2 thinking, on the other hand, is slower, more analytical and based on reason. Therefore, one of our aims was to encourage people to use system 2 thinking when considering statistics, by highlighting the consequences of relying on system 1 thinking and taking statistics at face value. We wanted to make people more aware of the ways in which the presentation of statistics can be manipulated to provide stronger support for claims, and to encourage them to consider the motives of people who manipulate statistics in this way. However, whilst raising this awareness, we were conscious not to encourage distrust of all scientific information, reminding people that there are still reputable sources they can go to. Therefore, we also aimed to encourage people to do their own research and to consider information from multiple sources before implementing any significant lifestyle changes, to ensure that they make the most informed decisions possible.
What we did and the techniques we used
We decided to target this tendency to use system 1 processing when viewing statistics and claims by designing posters that would themselves appeal to that way of thinking, and so help individuals realise its risks. Each poster was composed of a real past headline or claim that would be seen as shocking, controversial or surprising, designed to attract people's attention. For example, one claim was "Global temperatures have shown no significant increase since 1998". Climate change is often brought into debates, so this would be expected to grab people's attention regardless of their opinion. Once they looked at the poster, they would notice underneath, in a smaller font, a brief explanation that this was a real claim and how it was deemed to be misleading. The intention was for the explanation to bring the individual into system 2 thinking, prompt them to reflect on how the claim had caught their attention, and leave this experience as a reminder to engage in that way of thinking more often. An example of one of our posters is shown below.
We also designed Instagram posts with tips on how to avoid being misled, using the Financial Times article "Tim Harford's guide to statistics in a misleading age" (Harford, 2018) and "How to lie with statistics" (Huff, 1954) as sources. For example, one tip was to "Be aware when there are no numbers to back up statements such as 'twice as likely'". This was inspired by one of Harford's tips encouraging readers to put things into perspective and question whether a statistic is actually a big number. An example of one of the tips we posted is shown below.
We created seven different posters of real-life examples of how information can be presented in a misleading way, and seven different tips on how to avoid being misled, which we put up over a period of a few weeks. The decision to post frequently and for an extended period of time was based on the mere-exposure effect and the availability heuristic. The mere-exposure effect states that repeated exposure to a stimulus can enhance attitudes towards an object, even if people don't engage with it (Zajonc, 1968). The availability heuristic (Tversky & Kahneman, 1973) proposes that information encountered frequently is easier to retrieve and is consequently considered more important and more likely to be true. This can be seen in Tversky and Kahneman (1973), who investigated judgements of word frequency. They asked participants whether a word sampled at random from an English text is more likely to start with the letter 'k' or to have 'k' as its third letter. People are thought to answer such a question by comparing the ease with which words of each kind come to mind. It is easier to think of words that start with the letter 'k', so judgements of word frequency are mediated by availability, even though there are more words with 'k' as the third letter than the first (Mayzner & Tresselt, 1965). By posting over a three-week period, those who follow @dontbefooled_pfc should be exposed to enough stimuli to see the importance of being sceptical and looking deeper into what they are reading.
All the tips and posters followed a similar design in order to keep our message consistent. We aimed to target a broad population, as Kahneman (2011) does not suggest that any particular group is more susceptible to system 1 processing than others. We had two methods of distributing our posters in order to reach as many people as possible. The first was to put posters up around the University of Warwick campus, in places frequently visited by students, and in Leamington Spa, where they would be seen by the public. We stuck the posters up in threes to make them more noticeable. The second was to post them on Instagram, giving us the potential to reach its millions of users and making it easier for people to share them with others.
Every poster and tip had the tagline "Don't be fooled". Repeating a message can increase the believability and acceptance of a communication by increasing liking of an object through the mere-exposure effect (Pratkanis & Aronson, 2001). By repeating this tagline, in addition to posting frequently on the account, we hoped that the mere-exposure effect would encourage people to be critical about what they read.
Our project also aimed to use the central route to persuasion, part of the Elaboration Likelihood Model (Petty & Cacioppo, 1986). The model states that there are two routes, the central route and the peripheral route, through which a message can be processed, depending on the audience's ability to think about the message and their motivation to do so. Through the peripheral route, persuasion results from cues unrelated to the logical quality of the message, such as general impressions, the individual's mood, and emotional cues in the surrounding context. For example, companies may use celebrity endorsement to advertise their products: here the audience might be persuaded not by any need for the product, but because a celebrity they like is using it, so perhaps they should too. Through the central route, persuasion results from the audience being motivated to think about the message and consequently being persuaded by its content; for example, a voter deciding to vote for one political party over another based on the strength of its arguments and ideas. Petty and Cacioppo (1986) state that attitude changes resulting from the central route show greater persistence and better prediction of behaviour than attitude changes resulting from peripheral cues.
In order to achieve this central route of persuasion, we had to ensure the audience were motivated to understand our message. Our first Instagram post aimed to do this: we told our audience the intention of our future posts, so they knew what to expect once they followed the account. The real-life examples, which the audience may already have been aware of, together with tips on how to avoid being misled by these kinds of examples, were used to motivate the audience by making our message seem directly relevant to individuals' lives.
Measuring behaviour change
Considering the nature of our project, it is important to acknowledge the difficulty of measuring how many people were encouraged to change their behaviour and adopt a more critical mindset. However, our Instagram page gives a good indication of how many people engaged with our message. We gathered almost 100 followers, and each post received likes from a number of people, with the most successful receiving 24 likes. This suggests the posters were able to get people's attention, and might have continued to gather traction if given more time, hopefully making more people aware of the importance of statistical literacy and helping them to avoid being taken in by misleading statistics and claims.
Ben-Zvi, D., & Garfield, J. B. (2004). The challenge of developing statistical literacy, reasoning and thinking. Dordrecht, The Netherlands: Kluwer Academic Publishers.
Furedi, A. (1999). The public health implications of the 1995 ‘pill scare.’ Human Reproduction Update, 5, 621–626.
Gigerenzer, G., Gaissmaier, W., Kurz-Milcke, E., Schwartz, L. M., & Woloshin, S. (2007). Helping doctors and patients make sense of health statistics. Psychological Science in the Public Interest, 8(2), 53–96.
Harford, T. (2018, February 8). Tim Harford’s guide to statistics in a misleading age. The Financial Times. Retrieved from http://www.ft.com.
Huff, D. (1954). How to lie with statistics. New York, NY: W. W. Norton & Company.
Kahneman, D. (2011). Thinking, fast and slow. New York, NY: Farrar, Straus and Giroux.
Mayzner, M. S., & Tresselt, M. E. (1965). Tables of single-letter and digram frequency counts for various word-length and letter-position combinations. Psychonomic Monograph Supplements.
Petty, R. E., & Cacioppo, J. T. (1986). The elaboration likelihood model of persuasion. In Communication and persuasion (pp. 1–24). New York, NY: Springer.
Pratkanis, A. R., & Aronson, E. (2001). Age of propaganda: The everyday use and abuse of persuasion. New York: W.H. Freeman.
Tversky, A., & Kahneman, D. (1973). Availability: A heuristic for judging frequency and probability. Cognitive Psychology, 5(2), 207–232.
Zajonc, R. B. (1968). Attitudinal effects of mere exposure. Journal of Personality and Social Psychology, 9(2, Pt. 2), 1–27.