Experts weigh in on whether a for-profit journalism initiative can do what big tech can’t.
A battle between fact and fake news is currently playing out on social media and search engines. Foreign and stateside actors alike are using disinformation to sway American politics and to swiftly sow discord and even violence — continued Russian interference in elections, deaths in India and Myanmar, hateful rhetoric in America.
Fake news — difficult to spot, even among educated people — is one of the biggest problems of our time. And, like most big problems, it has proven difficult to solve.
But numerous organizations are popping up to try to combat it. Their attempts generally fall into two buckets: 1) making people more news-literate, or 2) making news more trustworthy, by either weeding out fake news or providing information about a story or site’s reliability.
A recent entrant, NewsGuard, seeks to do both (though mostly the latter). One thing that makes NewsGuard stand out from many other non-tech journalism initiatives is that it’s for-profit — it has received $6 million in venture funding from its founders and other investors — so it doesn’t need to rely on philanthropic donations. It has also enjoyed plenty of positive press (though not on the far-right conspiracy site Breitbart).
How NewsGuard works
Founded by big-name journalism veterans Steven Brill (founder of The American Lawyer, Court TV, the Yale Journalism Initiative) and Gordon Crovitz (former publisher of the Wall Street Journal), the startup engages a team of 25 trained journalists to determine whether a publication is a reliable news source. Using nine weighted criteria — including “does not repeatedly publish false content,” “regularly corrects or clarifies errors,” and “clearly labels advertising” — NewsGuard’s journalists regularly analyze and rate more than 2,000 of the biggest news and information websites.
News organizations are ranked on a scale of 1 to 100 points. A site needs more than 60 points to be rated with a green check for “trustworthy,” meaning that it “generally maintains basic standards of accuracy and accountability.” Sites that don’t meet that level are given a red exclamation point to signify that they’re unreliable. If you download the NewsGuard plugin (available on Chrome, Safari, Firefox, and Microsoft’s Edge on desktop, and on Edge on mobile), icons appear alongside articles on social media or in search results, where readers can click to see exactly why a site qualifies as a reliable news source or not. For those so inclined, the journalists include reported write-ups of each organization.
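The rating mechanics described above — nine weighted criteria summing to 100 points, with a green check for anything over 60 — can be sketched as a simple pass/fail scorer. This is an illustrative sketch only: the point weights below are hypothetical placeholders, and the criterion names beyond the three quoted in this article are paraphrased assumptions, not NewsGuard's published values.

```python
# Sketch of a NewsGuard-style weighted rating. Weights are hypothetical
# placeholders that sum to 100, not NewsGuard's actual point values.
CRITERIA_WEIGHTS = {
    "does_not_repeatedly_publish_false_content": 25,
    "gathers_and_presents_information_responsibly": 15,
    "regularly_corrects_or_clarifies_errors": 12,
    "handles_news_vs_opinion_responsibly": 10,
    "avoids_deceptive_headlines": 10,
    "discloses_ownership_and_financing": 8,
    "clearly_labels_advertising": 8,
    "reveals_whos_in_charge": 6,
    "names_content_creators": 6,
}  # nine criteria, weights sum to 100

PASS_THRESHOLD = 60  # a site needs more than 60 points for a green check


def rate_site(criteria_met):
    """Return (score, 'green' or 'red') given the set of criteria a site satisfies."""
    score = sum(w for name, w in CRITERIA_WEIGHTS.items() if name in criteria_met)
    label = "green" if score > PASS_THRESHOLD else "red"
    return score, label


# A site meeting every criterion scores 100; failing a criterion simply
# subtracts that criterion's weight from the total.
perfect = rate_site(set(CRITERIA_WEIGHTS))
```

Under this model, a site like the article's Fox News example can fail two criteria and still land above the threshold, while a site failing the heavily weighted accuracy criteria drops below it.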
The New York Times — and Recode! — currently have green checkmarks for each criterion (100 points). The Fox News website passed (75) but was dinged for not regularly correcting errors and not disclosing its ownership and financing. Breitbart (57) didn’t make the cut because it doesn’t gather and present information responsibly, nor does it disclose its ownership on its website, among other issues.
Of the sites NewsGuard has analyzed, 16 percent have a red untrustworthy rating, while the vast majority have green ratings. The average score is 78.9 points.
It’s a low bar, given that sites that have numerous editorial and ethical problems can still get a passing mark, but it does give readers a rough sense of what not to read.
“We’ve done a lot of testing of our rating system,” NewsGuard general manager Matt Skibinski told Recode in an email. “While we’re still tweaking components of it, we think it’s about right in terms of where the red/green distinction falls.”
But can it actually work in preventing the spread of fake news?
A Gallup poll commissioned by NewsGuard and funded by the Knight Foundation found that 62 percent of readers who downloaded the NewsGuard browser extension trusted the ratings more once they learned they were written by experienced journalists. Twenty-nine percent said it made no difference, while 8 percent said it decreased their trust (numbers are rounded).
The poll also found that 54 percent of people who saw a red rating said they’d be less likely to read the content, while 63 percent said they’d be less likely to share. The unreliable ratings made a truculent 7 and 5 percent say they were more likely to read and share, respectively. The rest said it had no effect on them.
This poll seems to conflict with another recent Gallup poll that shows that the majority of the population perceives bias in the media — and by extension among journalists themselves.
NewsGuard’s method of greenlighting entire publications (rather than articles individually) means it’s possible some fake news is still getting through. However, that same method is precisely what makes NewsGuard scalable. It’s feasible for a team of 25 journalists to check in on the top 2,000-plus websites every few months; fact-checking their individual articles, which is something Facebook and others have attempted to do, would take an exponentially bigger team.
Evaluating whole sites fits with findings in a recent Knight Foundation report, which said that the same sites were responsible for a majority of misinformation on Twitter before and after the 2016 presidential election. (Twitter takes issue with this study because it uses Twitter’s API, which the company says doesn’t include actions taken against malicious accounts like locking or suspending.) The idea would be that by flagging unreliable sites, NewsGuard could stop people from reaching a great deal of fake news.
What stands in NewsGuard’s way
One thing that stands in NewsGuard’s way is its present distribution.
Currently, NewsGuard requires users to download a browser plugin on any major desktop browser or to turn on the function if they happen to use Edge, the only mobile browser on which it’s available. But if you’ve even heard of NewsGuard (let alone downloaded the extension), chances are you’re already a pretty savvy news consumer; people who consume fake news aren’t usually people who also consume fact-checks.
As Laura Hazard Owen, the deputy editor of the Nieman Journalism Lab, put it, “If you’re a person who has the NewsGuard extension on your browser, you are just not the problem.”
About 60,000 users have downloaded the NewsGuard extension so far, more than half of them in the last month. That includes hundreds of libraries, meaning the tool could reach far more people than its download count suggests. NewsGuard plans to make money by licensing its software to social media and search companies; it also has a separate product, BrandGuard, that it wants to offer to companies hoping not to have their ads placed next to fake news content.
If it’s going to have a chance at success, NewsGuard knows it needs to be part of the platforms on which people are sharing fake news in the first place: Facebook and Twitter feeds and Google searches.
“We’re asking the platforms to help to solve the problem that they probably inadvertently created,” NewsGuard co-founder and co-CEO Steve Brill told Bloomberg this summer. “And we’re asking them to pay significantly less than they’re paying their public relations firms or their lobbyists to talk about how hard the problem is to solve.”
Although Facebook has been loath to consider itself a media company or to decide on its own what is and isn’t news, it has been happy to have third parties shoulder that burden. Facebook recently parted ways with Snopes but has 39 other third-party fact-checkers it is working with globally, four of which — FactCheck.org, Lead Stories, PolitiFact, and the Associated Press (whose partnership is “currently in discussions,” according to Facebook) — are US-based.
So far, neither Facebook nor the other major news distribution platforms, Google and Twitter, have signed on with NewsGuard, but NewsGuard says it is in continued talks with "all of the major tech companies."
Gordon Crovitz, NewsGuard’s other founder/CEO, told Recode, “We’re actually very optimistic that there are going to be technology companies beyond Microsoft that do license our ratings and our write-ups.”
Crovitz doesn’t think tech companies could build a product like NewsGuard themselves.
“It’s a very open journalistic approach, and I think that does not come naturally to the Silicon Valley companies — it’s not their business,” he said. “We are journalists applying a journalistic approach to a journalistic problem.”
NewsGuard’s success is also dependent on how people perceive journalists. Facebook itself used to employ journalist news curators who would determine whether trending stories were legitimate. Facebook laid off those curators after they came under fire for having a liberal bias.
Why would perceptions of NewsGuard's journalists be any different?
“We operate in full transparency. Our nutrition labels explain why every site gets green or red. We publish the process. We publish the bios of the people who do the work. We engage with publishers,” Crovitz said. “I think giving readers context about each website will help establish trust.”
What the experts say
Jonathan Anzalone, assistant director and lecturer at Stony Brook University’s Center for News Literacy, says that tools like NewsGuard can help, but he thinks they’re not enough to fix the fake news epidemic.
“Our problems are much larger than identifying a bogus website or a piece of propaganda,” he told Recode. “It’s like we just invented fire, and we’re trying to learn how to deal with it. It will take more than a few tools, as valuable as they may be.”
Instead, he recommends “large-scale educational intervention” and “constant reinforcement in class and at home that we have to be critical of the news.”
“I would hope a red or green mark is a starting point and not a stopping point,” he added.
Nieman Lab’s Owen also thinks the problem of fake news goes beyond identifying good news sources.
“People don’t only share fake news because they think it’s real — they have all sorts of reasons for sharing biased coverage,” she said. “The human psychology that drives people to share stuff doesn’t fit within a browser extension.”
In order for NewsGuard to work, Owen said, “You have to be that specific kind of person who is open to these ratings and cares about them and is going to listen to them.”
Instead, she said, it would be best if platforms didn’t host fake news in the first place.
Ethan Porter, an assistant professor at George Washington University’s School of Media and Public Affairs, said, “I’m skeptical that a green check alone would do much, but I won’t rule it out.”
From his research, Porter believes that fact-checking could help reverse the damage of fake news by improving the “accuracy of respondents’ factual beliefs.” In other words, empirical evidence trumps fake news. NewsGuard includes empirical information about news sites but not about individual articles or claims.
Crucially, however, Porter’s research shows that fact-checking can make a reader more factually accurate but that it won’t affect their political ideologies or partisanship.
NewsGuard is one of many necessary efforts to combat fake news, according to Nancy Watzman, director of strategic initiatives at the journalism consulting firm Dot Connector Studio and outreach editor for the Knight Commission on Trust, Media, and Democracy.
“I think everyone is still trying to figure out what will be the best strategy,” Watzman told Recode. “But it is super important to know where information is coming from.”
Other efforts she thinks could work include teaching people how to think about news and what good news-reading habits look like. "There is increasing analytic thinking, and at the same time, there's good hygiene," she said. "It's like putting on a seat belt or brushing your teeth — making these things a habit."
NewsGuard, for its part, doesn’t think it’s the only solution either.
“We’re not going to solve every issue,” Crovitz said. Instead, he sees NewsGuard as “amplifiers” of fact-checkers’ work, which it cites in its reports. “We are another layer of protection in addition to whatever else the Silicon Valley companies are doing.”
Where NewsGuard stands
So far, NewsGuard has had tiny wins.
It has prompted 500 news websites to update their standards, which presumably will lead to less fake news in the first place. It's currently recruiting teams similar in size to its US one for four locations in Europe, which it plans to launch in April. It has signed on with Microsoft's Edge browser — one of the smallest mobile browsers by market share — but will need to be on more widely used Microsoft products like Bing in order to scale.
NewsGuard’s use of experienced journalists will give it credibility among some, but skepticism from others.
The experts we spoke with didn’t seem to think that NewsGuard on its own could be effective at halting fake news, but rather thought of it as one tool to be used in conjunction with government action, media literacy campaigns, and efforts by the tech platforms that host fake news.
In the meantime, NewsGuard will have to overcome some big distribution hurdles before we get to find out if it can work on its own.