Behaviour Change

PROPAGANDA FOR CHANGE is a project created by the students of Behaviour Change (ps359) and Professor Thomas Hills @thomhills at the Psychology Department of the University of Warwick. This work was supported by funding from Warwick's Institute for Advanced Teaching and Learning.

Sunday, December 11, 2016

It's not all blue or red!

Ever wondered why your Facebook feed appears so disproportionately outraged, shocked and surprised after election results? For example, following the Brexit vote or the 2016 American Presidential election? I know I have.

It has recently come out in the news that Facebook's algorithms are programmed to deliver a personalised, tailored feed based on what the platform thinks your opinions are. This has led the organisation to be widely criticised over its coverage of political news.

This website set up by the Wall Street Journal could be the answer. It illustrates the two extreme ways the 2016 US Presidential campaign was reported: the red, conservative side against the blue, liberal side.

Indeed, this eye-opening website highlights the presence of a polarised in-group and out-group. Depending on which side your Facebook feed represented, you were probably only shown one side of the story. I know I was.

What are the consequences of such biased reporting and why?

Firstly, the theory of social proof, which argues that people work out what the correct behaviour is by looking to their in-group, seems relevant in this context. A study by Weaver, Garcia, Schwarz and Miller (2007) found that people judged an opinion to be more prevalent the more familiar it felt, even when it had been repeated to them just three times by the same source. In the context of Facebook news reporting, when social media users are repeatedly exposed to the same opinion, they are likely to overestimate the prevalence of that opinion in the general population. An alarming consequence of this is that Facebook users on either side might feel less inclined to go out and vote, since their preferred candidate appears well supported.

Furthermore, the issue raised by the Wall Street Journal is worrying in light of Janis' (1972) Groupthink theory. Groupthink is the idea that, even when not all group members actually agree, a strong desire for homogeneity and harmony leads the group to adopt irrational opinions. Indeed, the natural desire to minimise conflict leads to an absence of critical thinking. Janis (1972) highlights that some observable consequences of groupthink are uniformity pressures, self-censorship and close-mindedness. In the context of political campaigns, debate is essential. However, if Facebook narrows the groups down to red and blue, social media users are pushed to adhere to one side without the opportunity to weigh both sides of the argument before taking their stance.

The long-term effects of such biased reporting are severe. Indeed, this emphasis on in-groups and out-groups leaves little room for debate, compromise and tolerance, and greatly increases opinion polarisation.

If you were to take one thing away from this post, it should be: it's not all black or white. Or should I say, blue or red.

Janis, I. L. (1972). Victims of groupthink: A psychological study of foreign-policy decisions and fiascoes. Boston: Houghton Mifflin.

Weaver, K., Garcia, S. M., Schwarz, N., & Miller, D. T. (2007). Inferring the popularity of an opinion from its familiarity: A repetitive voice can sound like a chorus. Journal of Personality and Social Psychology, 92(5), 821-833.
