A lot of ink and pixels have been devoted to the allegations that fake news and misinformation on Facebook and other social media sites may have swayed the recent US election Trump-wards, and that this may have been the first (but probably not the last) "post-truth" election.
Much has been made of the fact that Oxford Dictionaries chose "post-truth" as their word of the year, as though this is in some way significant or even prophetic. Post-truth refers to circumstances where objective facts are less influential in shaping public opinion than emotional appeals, and you can see how this has become a buzz-word in a year of Brexit and President Trump.
But back to Facebook.
It has long been known that social media (and Facebook in particular) is rife with false news (either deliberately or accidentally so). What has changed more recently is the extent of the influence that social media has over the perceptions of the general public. Studies suggest that nearly two-thirds of American social media users - and, let's face it, that's most people these days - get the vast majority of their news directly from social media.
Facebook CEO Mark Zuckerberg insists, somewhat disingenuously, that 99% of the content on Facebook is "authentic" (as opposed to factual), and that he finds it "extremely unlikely" that false news on Facebook has been instrumental in influencing the American election one way or another. In fact, he has specifically denied that Facebook helped Trump win, calling such accusations a "crazy idea". But then he would say that, wouldn't he? Zuckerberg says that Facebook is working to identify and flag false news, although he is right to point out that this is a tricky line to walk and that Facebook staff should not be seen as, or be involuntarily put in the position of being, "arbiters of the truth". And certainly fact-checking everything that appears on the site would be impractical.
Part of the problem is that Facebook's News Feed service is specifically designed to show people the kind of news it thinks they want to see, creating a kind of "filter bubble" which merely serves to reinforce a person's views without exposing them to any alternative or contradictory viewpoints. What an individual ends up seeing (and believing) depends, to a large extent, on their friends and what they choose to share, which just exacerbates both confirmation bias and the so-called "backfire effect".
As a result, we have seen fake video of Democrats stuffing votes into ballot boxes, stories accusing the Clintons of murder, stories claiming that Barack Obama is a Muslim, false claims that popular black actor Denzel Washington has praised Trump, etc, etc. Hillary Clinton connected to a pedophilia and child sex trafficking ring, anyone?
And there is some evidence that, for whatever reason, these kinds of sensational fake stories are actually shared more on social media than other, more mundane, factual claims. One analysis by BuzzFeed News shows that the top fake news stories during the election generated significantly more engagement on Facebook (in terms of shares, reactions and comments) than the top real news stories from 19 major news outlets combined.
So, did Facebook influence the US election? We will probably never know. But it does seem likely that the inexorable rise of fake news has at least increased the political polarization and confusion in a fraught and hotly contested race.