The Cold War.
Last year, a fellow psychology student and I lived with two other girls in south Leamington Spa. One of them was particularly uptight with money, and heavily resented paying for necessities such as gas and electricity. Because our house was very old and damp, we really noticed the cold: our laundry sometimes took three days to dry, and you could often see your own breath in the downstairs rooms. So we had a heating war, a cold war, some might say. Come January, a bill arrived for us, and it was enormous. We knew that our penny-pinching housemate was going to be furious, so we hatched a plan.
The two of us psychologists sat in our living room chatting away about our friend Jane, and how she had received a bill of £150! This was a lie; Jane had not been seduced by the temptress that was her boiler, unlike us. Our housemate laughed at how foolish Jane had been, and, once we knew that she had been successfully primed with our anchor, we revealed that our bill was £120. Five minutes of brief irritation and lots of "at least it's not as large as Jane's" later, our housemate had been appeased, with significantly less rage than we had anticipated when we originally received our damning email from SSE.
So why did this work? Anchoring is the tendency to be biased towards a starting value (the anchor) when making quantitative judgements. This is one of the ways in which our brain is lazy: we rely on our System 1 (fast, intuitive, reliant on heuristics) and ignore our logical System 2 (effortful thinking). Cognitive biases such as this tendency to favour a starting value are examples of System 1 at work. There is considerable experimental support for this bias, such as the work of Greenberg et al. (1986), who ran a mock jury study and found that participants who were asked first to consider a harsh verdict as a punishment produced stricter final decisions than those who had initially considered a more lenient punishment. Furthermore, during the actual Cold War, Plous (1989) carried out a survey in which he asked half the participants whether they thought there was a greater than 1% chance of nuclear war, and asked the other half whether they thought there was a smaller than 90% chance of nuclear war. He then asked for a quantitative estimate of the probability, and found that those primed with the higher number estimated a much higher chance of nuclear war happening.
Although we had promised after the initial bill to try to keep a lid on the heating, it was a cold winter and the next bill was also fairly sizeable. We knew that we couldn't use Jane as a scapegoat again, and it was I who drew the short straw and had to tell our housemate. However, the stars aligned for me: just after I had told her about the bill, my chair collapsed with me in it, and she laughed so much that she forgot about the bill. This was convenient, although far less elegant than our initial anchoring plan, which I would highly recommend to all psychology students. It really does work, and if this kind of knowledge isn't worth paying nine grand a year for, then I don't know what is.
Greenberg, J., Williams, K. D., & O'Brien, M. K. (1986). Considering the harshest verdict first: Biasing effects on mock juror verdicts. Personality and Social Psychology Bulletin, 12(1), 41-50.
Plous, S. (1989). Thinking the unthinkable: The effects of anchoring on likelihood estimates of nuclear war. Journal of Applied Social Psychology, 19(1), 67-91.