BEWARE: Russian Trolls!
Although the extent to which Russia managed to influence the 2016 presidential election is still debated, there is no doubt that Russian ‘trolls’ posted extremely persuasive, powerful content that added even more fury to Trump’s populist campaign. This blog post looks at the different tactics the ‘trolls’ used to convince people to vote for Trump.
The great thing about online content is that it makes individuals feel as though loads of people share their opinion, even if they belong to only a small community. Individuals overestimate how many people share their views – for example, Australian researchers found that those who surrounded themselves with like-minded climate change deniers thought that a quarter of the population were climate change deniers like them, whereas in reality the figure was closer to 7% (Marshall, 2015). Equally, advertisements of kids using guns, as shown in fig. 1, led gun supporters to believe that their views were a lot more prominent than they actually were, which might have encouraged them to speak out about their opinions!
Figure 1: Instagram account created by Russian ‘trolls’ (White, 2017)
Not only are Trump’s supporters exposed almost solely to targeted, right-wing media, but they also ‘contaminate’ those close to them – just as Christakis and Fowler (2007) found in their large social network study, if your friend, or a friend of a friend, is obese, you are more likely to be obese. Therefore, just by spending time with Trump supporters, you could become a supporter yourself through social ‘contamination’! The advertisements by Russian ‘trolls’ would have been seen by friends of Trump supporters, who would assume that such behaviour is the norm and follow the behaviour of their social circles.
There was a sense of self-threat shared by Trump supporters, which manifested itself in hate towards immigrants. According to self-categorisation theory, individuals strive to form close relationships with those similar to them and to establish differences from those dissimilar to them (Marshall, 2015). The Russian ads attempted to emphasize the need to strengthen the in-group by uniting Americans scared of immigrants on their ‘Stop All Invaders’ page. The simple, familiar messages, such as ‘Like and share if you want Burqua banned in America’ (fig. 2) and ‘Every man should stand for our borders! Join!’ (fig. 3), aimed to unite the in-group against the outside threat. It has been shown in many experiments that ‘self-threat appears to induce a state of social dependency’ (Pratkanis, 2007), thereby strengthening the need to belong – in this case, to groups that promote Trump’s values.
Figure 2: Content posted by Russian ‘trolls’ on Facebook (Parlapiano & Lee, 2018)
Figure 3: Content posted by Russian ‘trolls’ on Facebook (Shane, 2017)
The advertisements also aimed to demonise Hillary Clinton. As highlighted in research by Kanouse and Hanson (1972), negative information has a much stronger effect than positive information when making judgements. Therefore, by demonising Clinton rather than praising Trump, the trolls were able to have more of an impact: negative descriptions of people lead to much stronger negative evaluations than positive descriptions lead to positive evaluations (Hodges, 1974). A good example of this is the ad that compares Hillary to Satan (fig. 4). The use of a metaphor is very effective, as targeted individuals could learn to associate the negative connotations of ‘Satan’ with Hillary (Gilovich, 1981). Equally, it draws on the target audience’s pre-existing religious beliefs: one research paper found that messages presented to religious individuals were rated as more convincing when they carried a religious connotation (Cacioppo, Petty, & Sidera, 1982).
Figure 4: Content posted by Russian ‘trolls’ on Facebook (White, 2017)
It is tough to evaluate how much impact the ads really had, but it seems evident that they added to the fury and dissatisfaction of Trump’s supporters and those close to them, spreading the message and contributing to his winning the election.
Bibliography
Cacioppo, J., Petty, R., & Sidera, J. (1982). The effects of a salient self-schema on the evaluation of proattitudinal editorials: Top-down versus bottom-up message processing. Journal of Experimental Social Psychology, 18(4), 324-338. http://dx.doi.org/10.1016/0022-1031(82)90057-9
Christakis, N., & Fowler, J. (2007). The spread of obesity in a large social network over 32 years. New England Journal of Medicine, 357(4), 370-379. http://dx.doi.org/10.1056/nejmsa066082
Gilovich, T. (1981). Seeing the past in the present: The effect of associations to familiar events on judgments and decisions. Journal of Personality and Social Psychology, 40(5), 797-808. http://dx.doi.org/10.1037//0022-3514.40.5.797
Kanouse, D., & Hanson, R. (1972). Negativity in evaluations. In E. Jones (Ed.), Attribution: Perceiving the Causes of Behavior (1st ed.). Morristown: General Learning Press.
Marshall, G. (2015). Don't even think about it (2nd ed.).
London: Bloomsbury.
Pratkanis, A. (2007). The Science of Social Influence (1st
ed.). New York: Psychology Press.
Shane, S. (2017). These Are the Ads Russia Bought on
Facebook in 2016. Nytimes.com. Retrieved 20 March 2018, from
https://www.nytimes.com/2017/11/01/us/politics/russia-2016-election-facebook.html
Wagner, K. (2017). Facebook’s reliance on software
algorithms keeps getting the company into trouble. Recode. Retrieved 20 March
2018, from https://www.recode.net/2017/9/14/16310512/facebook-mark-zuckerberg-algorithm-ad-targeting-jews
White, J. (2017). These are the most bizarre Russian
election-meddling posts that have just been released. The Independent. Retrieved
21 March 2018, from http://www.independent.co.uk/news/world/americas/us-politics/facebook-russia-ads-posts-hillary-clinton-most-bizarre-posts-election-2016-a8032436.html