I am deeply involved in the science and art of decision-making and do a lot of research in this area, including the prevalence of logical fallacies, psychological biases, and other cognitive kinks in our capacity to identify patterns, assess the quality of information, estimate risk, and make effective decisions that move us toward our desired goals. It is a fascinating field.
An observable phenomenon of the past decade has been the proliferation of books focused on the various mechanisms by which we reduce the quality of our decisions. It almost feels like there is a conspiracy not only to reduce our confidence in the effectiveness of our decision-making but to advance the notion that we cannot make good decisions at all. There are many possible reasons for this radical proposition; I suspect one significant contributor is a certain defensiveness on the part of experts.
As the internet has increased people's capacity to gather information and learn on their own, it has undermined the status and economic livelihood of experts. I suspect that the messaging that people are not to be trusted to make decisions on their own is an uncoordinated and unconscious effort to shore up the position of experts. Something along the lines of "You may have read twelve articles on the internet on this topic, but don't try it at home; call the expert."
Regardless of the cause, I think the pendulum has swung too far. Yes, people should be realistic and cautious about the quality of their thinking and decision-making, and yes, there is variance between people in that quality, but most people are still perfectly capable. Experts are useful and necessary, but not to everyone, and not all the time.
Mercier makes this point in his research. From the abstract:
A long tradition of scholarship, from ancient Greece to Marxism or some contemporary social psychology, portrays humans as strongly gullible—wont to accept harmful messages by being unduly deferent. However, if humans are reasonably well adapted, they should not be strongly gullible: they should be vigilant toward communicated information. Evidence from experimental psychology reveals that humans are equipped with well-functioning mechanisms of epistemic vigilance. They check the plausibility of messages against their background beliefs, calibrate their trust as a function of the source’s competence and benevolence, and critically evaluate arguments offered to them. Even if humans are equipped with well-functioning mechanisms of epistemic vigilance, an adaptive lag might render them gullible in the face of new challenges, from clever marketing to omnipresent propaganda. I review evidence from different cultural domains often taken as proof of strong gullibility: religion, demagoguery, propaganda, political campaigns, advertising, erroneous medical beliefs, and rumors. Converging evidence reveals that communication is much less influential than often believed—that religious proselytizing, propaganda, advertising, and so forth are generally not very effective at changing people’s minds. Beliefs that lead to costly behavior are even less likely to be accepted. Finally, it is also argued that most cases of acceptance of misguided communicated information do not stem from undue deference, but from a fit between the communicated information and the audience’s preexisting beliefs.

We are all free agents who can make our own decisions.