The Role of Social Media in Spreading Fake News

Bots swaying elections, Russian hackers and manipulated masses. What is Fake News, and what are social media networks doing to stop its spread?

1st December 2017

Fake news is a term we have heard a lot in the past two years, especially as Donald Trump campaigned, rose to power and took office in the USA. The term has since spread internationally and has been cited as an influence on local elections the world over, from the UK to Germany. Collins Dictionary recently crowned the phrase one of its ‘words of the year’ and defines Fake News as follows:

fake news, noun: false, often sensational, information disseminated under the guise of news reporting

Prior to social media, we knew Fake News as propaganda. While propaganda has been used since Roman times to alter public perception and skew outcomes, the rise of the internet and social media, combined with a lack of governing control, has given Fake News a power that propaganda never had.

To understand why Fake News is so powerful, it’s important to look at its different forms. The Telegraph UK defines the five main types of Fake News as follows:

  • Commercially-driven sensational content: Sensationalist content with little to no evidence; clickbait designed to drive traffic to sites and generate advertising income. These articles carry headlines such as “Trump actually related to Osama Bin Laden” or “Secret North Korean missile sites in the USA discovered”. The BBC reported that an entire village was getting rich in this fashion.
  • Nation state-sponsored misinformation: More traditional propaganda by a state or nation. Outlets in Russia, China and other states are often accused of producing content designed to swing public opinion and cause division.
  • Highly-partisan news sites: These sites are open about their extreme views and are usually marketed as ‘anti-mainstream’, independent of major political parties and big business. Examples include Breitbart News and outlets associated with Milo Yiannopoulos.
  • Social media itself: Twitter bots posting content, Facebook ads and YouTube videos. This content doesn’t link out to external sites; it is part of the social networks themselves.
  • Satire or parody: Light-hearted publications such as The Onion, Betoota Advocate and Daily Mash.

While sites such as The Onion and the Betoota Advocate are clearly satire, this wasn’t always clear to users: Facebook tested a “satire” tag in 2014 following backlash and user confusion on the platform. The tag doesn’t appear to have been kept long-term, likely because users became savvier and better able to differentiate satire from real news, but the episode shows how readily users trust what appears in their feeds and the effect that trust can have in large numbers.

In 2016 the US Presidential Election delivered a surprising result with the election of Donald Trump. At the time, Facebook CEO Mark Zuckerberg denied that Facebook had been complicit in helping spread Fake News and that Russian agents had bought advertisements as part of an attempt to sway the election outcome; however, he has since released a statement addressing the situation.

To put this in perspective, Mashable reported that in the final three months of the US election campaign, Fake News was shared and engaged with more heavily than news from outlets such as the BBC, the New York Times, the Washington Times and NBC News. Their analysis found that the 20 top-performing false election stories from hoax sites and hyperpartisan blogs generated 8,711,000 shares, reactions and comments on Facebook, while in the same period the 20 best-performing election stories from 19 major news websites generated a total of 7,367,000 shares, reactions and comments.

Oxford Internet Institute scholars Philip Howard, Bence Kollanyi and Samuel Woolley call Fake News “computational propaganda” and have cited it in their research as a possible factor in the surprise result of the 2016 US Presidential Election. They state that through the mobilisation of bots (mostly on Twitter), pro-Trump campaigners and programmers carefully adjusted the timing of content produced during the debates, strategically colonised pro-Clinton hashtags, and then disabled activities after Election Day. Their research also shows that the timing of this bot mobilisation aligned with shifts in the polls, especially in swing states, in the lead-up to polling day. NPR discusses this effect in further detail, claiming it was largely orchestrated by Russian agents, and similar effects were reported in the UK around #Brexit.
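
To make the idea of “computational propaganda” a little more concrete, the sketch below shows one very rough way such coordinated activity could be surfaced from post data: counting hourly volume on a targeted hashtag and flagging accounts that post at rates hard to sustain manually. This is purely an illustration under assumed data; the hashtag, records and thresholds are hypothetical, and it is not the Oxford team’s actual methodology.

    # Illustrative sketch only: the sample records, hashtag and thresholds are
    # hypothetical, and this is not the Oxford team's actual methodology.
    from collections import Counter
    from datetime import datetime

    posts = [
        # (timestamp, account handle, hashtag) -- made-up example records
        ("2016-10-09T21:05:00", "account_a", "#ImWithHer"),
        ("2016-10-09T21:06:00", "account_a", "#ImWithHer"),
        ("2016-10-09T21:06:30", "account_b", "#ImWithHer"),
    ]

    def hourly_volume(posts, hashtag):
        """Count posts per hour on one hashtag, so bursts timed around
        debates stand out against the normal baseline."""
        counts = Counter()
        for ts, _account, tag in posts:
            if tag.lower() == hashtag.lower():
                counts[datetime.fromisoformat(ts).strftime("%Y-%m-%d %H:00")] += 1
        return counts

    def high_frequency_accounts(posts, per_hour_threshold=30):
        """Flag accounts posting at rates hard to sustain manually.
        The threshold is an arbitrary illustration, not a research finding."""
        per_account_hour = Counter()
        for ts, account, _tag in posts:
            hour = datetime.fromisoformat(ts).strftime("%Y-%m-%d %H:00")
            per_account_hour[(account, hour)] += 1
        return {acct for (acct, _hr), n in per_account_hour.items()
                if n >= per_hour_threshold}

    print(hourly_volume(posts, "#ImWithHer"))   # e.g. Counter({'2016-10-09 21:00': 3})
    print(high_frequency_accounts(posts))       # empty set for this tiny sample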

Interestingly, their research on the recent German federal election, in which Angela Merkel’s centre-right CDU again won the most seats, showed that only 20% of news stories shared in the lead-up to the vote were deemed junk (Fake News). They attribute this lower level of Fake News both to higher education levels in Germany and to the public financing of several types of professional news organisations, which dominated the conversation.

While reach and influence do not directly correlate, and influence is difficult to measure, the impact of exposure to misinformation has been well documented.

Facebook and Twitter don’t generate junk news but they do serve it up to us. We must hold social media companies responsible for serving misinformation to voters, but also help them do better. They are the mandatory point of passage for this junk, which means they could also be the choke point for it. – Philip Howard and Bence Kollanyi (2017).

As for the allegations of Russia, China and other global powers using their might to sway elections and plant Fake News stories? A recent report by Freedom House states that over 30 world governments are mobilising keyboard armies to attack government dissidents and help sway public opinion. Pressure is mounting on social media networks to provide clearer data on government involvement in elections and the proliferation of Fake News; the UK recently ordered Facebook to hand over information pertaining to Russia’s involvement in its elections.

Facebook has been rolling out incremental changes, including partnerships with the Associated Press, Snopes, ABC News and FactCheck.org to check link accuracy, satire and disputed tags for links, and, most recently, the “More Info” button. Many, however, are critical that these changes are not enough. Journalists working closely on these changes within Facebook recently told The Guardian that they were “way too little, way too late”, that the fight against Fake News was failing and that their labour had been exploited for a Facebook PR campaign. Interestingly, Facebook has also just announced a tool that will let users see whether they have come into contact with content created by the Internet Research Agency, a Kremlin-linked ‘research agency’ commonly viewed as a Russian propaganda vehicle. The tool won’t be available until the end of the year or early next year, it focuses on only one content creator, and many feel it isn’t enough.

Twitter has made changes to replies and how they’re viewed in an effort to silence bots on the platform, as reported by Mashable. These changes seem to have been effective; however, Twitter could still be home to up to 48 million bots, according to CNBC. In response, a bot has even been created to track bots on the network.
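
As a rough illustration of how bot-tracking can work at the account level, the sketch below scores an account on a few public-profile signals (default avatar, follower/following imbalance, posting rate, account age). The field names, weights and thresholds are assumptions made for the example; this is not the actual bot mentioned above, nor Twitter’s own detection logic.

    # Illustrative sketch only: field names, weights and thresholds are
    # assumptions for the example, not Twitter's detection logic or the
    # actual bot-tracking bot mentioned above.
    def bot_score(account):
        """Return a rough 0-1 score from a few public-profile signals.
        `account` is a plain dict; in practice these fields would be
        pulled from an account's public profile."""
        score = 0.0
        if account.get("default_profile_image"):       # never set an avatar
            score += 0.25
        if account.get("followers", 0) < 10 and account.get("following", 0) > 1000:
            score += 0.25                               # follows far more than it is followed
        if account.get("tweets_per_day", 0) > 100:      # inhuman posting rate
            score += 0.35
        if account.get("account_age_days", 9999) < 30:  # very new account
            score += 0.15
        return min(score, 1.0)

    suspect = {"default_profile_image": True, "followers": 3,
               "following": 2400, "tweets_per_day": 180, "account_age_days": 12}
    print(bot_score(suspect))  # -> 1.0, i.e. flagged for closer review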

While users and critics are unhappy with the slow progress of the major networks, it’s not surprising that these corporate entities are reluctant to invest heavily in the exercise, especially if the changes could lower engagement on their platforms or require feed algorithms that produce weaker filter-bubble and echo-chamber effects. With this in mind, it doesn’t appear the flood of Fake News will be slowed anytime soon.

 

To keep up with the changes happening on social media each month, be sure to subscribe to our newsletter for our monthly Social Media Updates.

 


About the Author

Dan Stork

Community Manager

I am a passionate digital marketer and graphic designer, with experience including content production, social media strategy development and community management.