3Qs: What a sham(e)—how to filter out fake news

The spread of fake online news has become a hot topic of conversation, particularly in the wake of the presidential election. According to a BuzzFeed News analysis, the top-performing fake election news stories posted on Facebook in the final three months of the presidential campaign generated more engagement than the top stories from major news outlets like NBC News and The New York Times. And the vast majority of those stories were identified as either pro-Donald Trump or anti-Hillary Clinton.

Facebook CEO Mark Zuckerberg insisted that the social networking giant could not have influenced the election, but the $350 billion company has since banned fake news sites from using its advertising network to generate revenue. Google, which recently highlighted a fake story claiming that Trump had won the election's popular vote, has taken similar steps.

We asked John Wihbey, assistant professor of journalism and new media at Northeastern, to weigh in on the effect of fake news on the election and on what people can do to avoid fake, misleading, and clickbait-y sites.

A new finding shows that fake news may be "more viral" than real news. In your opinion, how much of an effect did fake news stories have on the outcome of the presidential election?

Debate over this very serious issue is taking place in the context of a confusing and contentious period of post-election analysis and self-reflection. Emotions are running high. But in my view, we first need to separate out the election impact issue from the issue of fake news.

The probability that Facebook's lack of policing of fake news influenced thousands of key voters in swing states such as Pennsylvania, Wisconsin, and Michigan is exceedingly small. To suggest that it definitely changed the minds of tens of thousands of people defies what we know about belief formation, voter preference, and political socialization: It takes a lot of conditioning and media exposure before people shift ideology and general party/candidate preferences. It is possible it had an impact, but I think the chances that misinformation on social media swung the election are vanishingly small. I would be much more concerned about talk radio, for example, where misinformation was broadcast relentlessly for months on end. That's where you'd get powerful media effects that reinforce beliefs and motivate or depress voter turnout. But in some ways, that's nothing new in elections.

We need our communications ecosystem to be better and healthier overall for democracy. I think Facebook could do a lot more, as could many other institutions. What the public perhaps doesn't know is that the company's own data science team has been studying these issues for years now. You can read some really fascinating papers they have produced, where they find that misinformation "cascades" quite easily across the social graph, and that false messages frequently "run deeper" than viral content in general. They label these "rumor cascades." I have not spoken with the company, but I suspect it had largely seen this as an issue of academic interest and had not thought through what it might mean in a close election. They certainly will now. At one point, they seemed to be more engaged with the social science community broadly. I think they should redouble that engagement.

According to a 2016 Pew Research Center report on the modern news consumer, only 4 percent of web-using adults have a lot of confidence in the information they find on social media. What steps do you think Facebook, Twitter, and other platforms should take to improve trust in the accuracy of the news that is being reported on their sites?

I've seen that sort of polling from Pew and others, and I'm sure it has merit. Yet I'm not sure it tells the whole story. Of course, if you ask someone whether they have confidence in, or trust, information on social media, they may balk. But that's sort of like asking someone, "Do you believe everything you hear on the street?" They will say "no." But the power of social influence is a well-documented phenomenon both online and offline. News shared among friends can have a powerful effect. Again, I don't think people are switching candidate preferences over a couple of articles, per se, but information can reinforce a sense of collective belief and a perception of community consensus. We often process information culturally, through what we think are the common beliefs of our friends and family.

It's worth noting that Facebook seems to take a big public relations hit about every six months now for something it has done relating to the civic and political space. This issue of fake news and the election is one of the most serious. I would love to see Facebook reach out more to academia and to other groups that are concerned about civics and democracy. It should open up data (appropriately anonymized) to, for example, the network scientists at Northeastern to help study this problem. I'd also like to see the company keep a public, running log of content it has taken down and the stated reasons why. If we are going to expect Facebook to get into the content-policing game in a robust way, we need to be sure that the process is transparent and free speech is respected.

One other thing I'd like to see Facebook, and particularly Twitter, do is enforce their standards much more aggressively against trolling, harassment, and threats. The volume of discriminatory harassment of women and minorities, and of anti-Semitic discourse, on these platforms is just appalling. Creating a more civil environment would begin to foster the conditions for more trust.

If social media users want to avoid fake information and false, misleading, and clickbait-y sites, what should they do?

I would strongly recommend examining any suspect stories or claims through the lens of highly credible sites such as PolitiFact.com and FactCheck.org. And I would try to consume news primarily on professionally produced journalistic platforms. I know the news media has its problems, but on the whole, professional journalists work extremely hard to perform this vetting and sorting function to the best of their ability. They sort fact from fiction and try to provide accurate interpretations of issues and events. They succeed perhaps 95 percent of the time, but we hear a lot about the errors. As a society, our trust in the press is very low right now, but more than ever, we need to begin rebuilding that trust and those bonds between communities and credible information sources.
