A useful discussion in Bounded Distrust by Scott Alexander. He is exploring how we can home in on true information when we know there are so many sources of error, bias, and deception in the generation and distribution of knowledge. Some good insight.
It doesn't matter at all that FOX is biased. You could argue that "FOX wants to fan fear of Islamic terrorism, so it's in their self-interest to make up cases of Islamic terrorism that don't exist". Or "FOX is against gun control, so if it was a white gun owner who did this shooting they would want to change the identity so it sounded like a Saudi terrorist". But those sound like crazy conspiracy theories. Even FOX's worst enemies don't accuse them of doing things like this.

It's not quite that this would be *worse* than anything FOX has ever done. I assume FOX helped spread the story that Saddam Hussein was connected to 9-11 and had WMDs, just like everyone else. That's probably a bigger lie (in some sense) than one extra mass shooting in a country with dozens of them, or changing the name and ethnicity of a perpetrator. Certainly it did more damage. But that's not the point. The point is, there are rules to the "being a biased media source" game. There are lines you can cross, and all that will happen is a bunch of people who complain about you all the time anyway will complain about you more. And there are other lines you don't cross, or else you'll be the center of a giant scandal and maybe get shut down. I don't want to claim those lines are objectively reasonable. But we all know where they are. And so we all trust a report on FOX about a mass shooting, even if we hate FOX in general.

In a world where FOX was the only news source available, this kind of thing would become really important. People would need to understand that FOX was biased while also basically being able to accept most things that it said. If people went too far overboard and stopped trusting FOX just because it was biased, they might end up in a state of total paralysis, unable to confirm really basic facts about the world.
He's not picking on Fox; he's simply using them as a stand-in for any purveyor of news media. They all have their biases; we get to know those biases and we adjust accordingly.
Alexander then goes on to extend his argument to an arena in which we invest greater trust than we invest in journalism.
As in journalism, so in science.
According to this news site, some Swedish researchers were trying to gather crime statistics. They collated a bunch of things about different crimes and - without it being a particular focus of their study - one of the pieces of information was immigration status, and they found that immigrants were responsible for a disproportionately high amount of some crimes in Sweden.

The Swedish establishment brought scientific misconduct cases against the researchers (one of whom is himself "of immigrant background"). The first count was not asking permission to include ethnicity statistics in their research (even though the statistics were publicly accessible, apparently Swedish researchers have to get permission to use publicly accessible data). The second count was not being able to justify how their research would “reduce exclusion and improve integration.”

While these accusations are probably true on their own terms, I think any researcher who found that immigrants were great would not have the technicalities of their research subjected to this level of scrutiny, and that the permissioning system evolved partly out of a desire to be able to crush researchers in exactly these kinds of situations. I think this is a pretty common scenario, and part of a whole structure of norms and regulations that makes sure experts only produce research that favors one side of the political spectrum. So I think the outrage is justified, this is exactly what people mean when they accuse experts of being biased, and those accusations are completely true.
The point being that in all walks of life there are biases, and our problems arise not primarily from the existence of those biases but from our inability to understand, interpret, and screen for them.
I am reading The Bias that Divides Us by Keith E. Stanovich which deals with similar matters.
In further writings on the idea that some beliefs can become convictions, Robert Abelson (1986; see also Abelson and Prentice 1989) makes the distinction between what he calls “testable beliefs” and “distal beliefs.” Testable beliefs are closely tied to the real world and the words we use to describe that world (e.g., Roses are red). They can be verified by observations—sometimes easily made personal observations, but other times observations that rely on the expertise of others and the more sophisticated methods of science. In contrast, distal beliefs cannot be directly verified by experience, nor can they be easily confirmed by turning to experts or scientific consensus. For example, you may think that pharmaceutical companies make excessive profits, or that your state should spend more on mental health and less on green initiatives. Certainly, economic statistics and public policy facts might condition distal beliefs such as these (either strengthening or weakening your attachment to them), but they cannot verify your distal beliefs in the same manner that they can verify your testable ones. Many distal beliefs embody our values. When they do, they are apt to become convictions because they will lead to emotional commitment and ego preoccupation, as argued by Abelson (1988). Distal beliefs often derive from our general worldviews or, in politics, from our ideologies.

Myside bias centers on distal beliefs, not testable ones.
There's the rub: the distinction between testable and distal beliefs, and the transition between them. A transition that starts with definitions and measures, passes via the usefully true into the arena of the balance of probabilities, and arrives, if possible, at known truth, defined as that which is always true within useful confidence levels.