Thursday, December 19, 2019

Beliefs that function as badges of group membership are inherently biased towards implausibility and absurdity precisely because out-group members have no incentive to hold such beliefs

From Socially Adaptive Belief by Daniel Williams:
Abstract. I outline and defend the hypothesis that human belief formation is sensitive to social rewards and punishments, such that beliefs are sometimes formed based on unconscious expectations of their likely effects on other agents—agents who frequently reward us when we hold ungrounded beliefs and punish us when we hold reasonable ones. After clarifying this phenomenon and distinguishing it from other sources of bias in the psychological literature, I argue that the hypothesis is plausible on theoretical grounds: in a species with substantial social scrutiny of beliefs, forming beliefs in a way that is sensitive to their likely effects on other agents leads to practical success. I then show how the hypothesis accommodates and unifies a range of psychological phenomena, including confabulation and rationalisation, positive illusions, and identity protective cognition.
The passage that especially caught my eye, cited from another source, is:
As per the discussion in Section 3 above, Kahan points out that engaging in IPC (Identity Protective Cognition) is often perfectly (practically) rational: given that individuals have a negligible impact on the phenomena that they form beliefs about in this area, they have little practical incentive to believe what is true; given the high levels of social scrutiny of such beliefs, they have a strong practical incentive to believe what signals their group membership. As Kahan (2017b, p.1) puts it,
“Far from evincing irrationality, this pattern of reasoning [i.e. IPC] promotes the interests of individual members of the public, who have a bigger personal stake in fitting in with important affinity groups than in forming correct perceptions of scientific evidence” (my emphasis).
Although Kahan’s research is on a very specific topic, the basic logic of IPC generalises to any case in which beliefs that are not best licensed by the available evidence become strongly associated with desirable coalitions of various kinds. Under such conditions, an individual’s group attachments clash with the aim of truth and thereby undermine the link between practical success and epistemic rationality. This clash between group identity and epistemic rationality has long been recognised. Writing of his experiences in the Spanish civil war, for example, Orwell (1968, p.252) famously noted that “everyone believes in the atrocities of the enemy and disbelieves in those of his own side, without ever bothering to examine the evidence.” This observation was experimentally vindicated in the 1950s when one of the earliest studies on motivated cognition demonstrated that students at Dartmouth and Princeton overwhelmingly reported more infractions by the other side in a penalty-filled football match between their universities (Hastorf and Cantril 1954).

Kahan’s research raises the question of how beliefs become strongly associated with certain coalitions to begin with. There are likely many routes by which this occurs, including deliberate efforts by those with vested interests in creating the association. One interesting suggestion in this area, however, is that beliefs that function as badges of group membership are inherently biased towards implausibility and absurdity precisely because out-group members have no incentive to hold such beliefs, thereby ensuring that they function more effectively to differentiate in-group members from outsiders (Tooby 2017; see also Simler and Hanson 2017, p.279).

The relationship between beliefs and loyalty identified by IPC plausibly extends beyond the examples just described. In totalitarian regimes, for example, people are harshly punished if evidence comes to light that they do not subscribe to the regimes’ myths, generating a powerful incentive to seek out and process information in ways that jettison the truth in favour of beliefs that signal their loyalty. As Hannah Arendt (1953, p.474) remarked, “The ideal subject of totalitarian rule is not the convinced Nazi or the dedicated communist, but people for whom the distinction between fact and fiction, true and false no longer exists.” Ordinary life is replete with more prosaic examples of this conflict between loyalty and epistemic rationality. We often expect our friends and family to take our side in factual disputes involving others, for example, even though our side invariably constitutes a self-serving interpretation of those facts (Simler and Hanson 2017, p.75).
Think of some of the more extreme, empirically unsupported claims about AGW, gender wage inequality, income inequality, rape culture, Black Lives Matter, the SPLC, etc., and the discussion takes on particular salience.
