Monday, March 12, 2018

From The spread of true and false news online by Soroush Vosoughi, Deb Roy, and Sinan Aral. I must admit to deep skepticism when I first saw the headlines about this research. True and false are astonishingly challenging epistemic and philosophical issues. The whole debate over fake news is essentially political posturing with little relationship to truth.

The authors of the paper acknowledge this. Their workaround is logical but flawed.
We sampled all rumor cascades investigated by six independent fact-checking organizations (snopes.com, politifact.com, factcheck.org, truthorfiction.com, hoax-slayer.com, and urbanlegends.about.com) by parsing the title, body, and verdict (true, false, or mixed) of each rumor investigation reported on their websites and automatically collecting the cascades corresponding to those rumors on Twitter.
They are using six fact-checking sites as their arbiters of truth. I am not familiar with the last three, but snopes.com, politifact.com, and factcheck.org all have highly variable track records. They do often perform relatively solid work. However, they are all terribly susceptible to mainstream media memes. There are egregious examples where they contort themselves in order not to offend left-leaning sensibilities and, just as frequently, go out of their way to nit-pick minor right-leaning points of view.

I think it might have been Politifact that first got my dander up when they, in all seriousness, fact-checked an opinion. Later, I came across an instance where they rated the same claim false when made by a Republican but true when a Democrat had made it three years earlier. I outlined the extensive reasons to view the fact-checking sites as highly unreliable in this post, Quis custodiet ipsos custodes? - Fact checking the fact checkers. Additional posts on the issues of fact-checking and truth are here and here.

Further, there was this research which investigated the cross-reliability of Politifact and Fact Checker. If we are dealing with demonstrable, settled facts, one would expect a high degree of agreement between the two sites when they both investigate the same claim. In fact, the researcher found a dangerously low level of agreement.
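The cross-reliability point is worth making concrete. Raw percent agreement between two raters can look impressive simply because most claims fall into one category; Cohen's kappa corrects for the agreement you would expect by chance. The sketch below is my own illustration with hypothetical verdicts, not data from the study in question:

```python
# A minimal sketch of chance-corrected inter-rater agreement (Cohen's kappa).
# The verdicts below are hypothetical, not data from the research discussed.

from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters over the same items."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of items where the verdicts match.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a = Counter(rater_a)
    counts_b = Counter(rater_b)
    # Expected agreement if each rater labeled items independently at random
    # according to their own marginal frequencies.
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical verdicts from two fact-checkers on the same ten claims.
checker_1 = ["false"] * 8 + ["true", "true"]
checker_2 = ["false"] * 8 + ["false", "true"]

print(cohens_kappa(checker_1, checker_2))  # kappa ~ 0.62
```

Here the two checkers agree on 90% of claims, yet kappa is only about 0.62, because most of that agreement is what you would expect by chance when nearly everything is rated "false." This is why a headline agreement figure between fact-checkers can overstate how reliably they converge on contested claims.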

Simply deferring to fact checking sites as your default for determining what is true or false is both negligent and unreliable.

On this basis alone, the researchers rather disqualify their own work. In all other regards, they seem to be working hard to be rigorous. From the Abstract:
We investigated the differential diffusion of all of the verified true and false news stories distributed on Twitter from 2006 to 2017. The data comprise ~126,000 stories tweeted by ~3 million people more than 4.5 million times. We classified news as true or false using information from six independent fact-checking organizations that exhibited 95 to 98% agreement on the classifications. Falsehood diffused significantly farther, faster, deeper, and more broadly than the truth in all categories of information, and the effects were more pronounced for false political news than for false news about terrorism, natural disasters, science, urban legends, or financial information. We found that false news was more novel than true news, which suggests that people were more likely to share novel information. Whereas false stories inspired fear, disgust, and surprise in replies, true stories inspired anticipation, sadness, joy, and trust. Contrary to conventional wisdom, robots accelerated the spread of true and false news at the same rate, implying that false news spreads more than the truth because humans, not robots, are more likely to spread it.
If they could only fix that foundational issue of how to determine validity of claims, this would be very interesting research. As is, though, it is simply more fake news.
