Recent examples include "The power of storytelling versus the power of the story," "The data doesn't support our argument so we will bait and switch our terms," "Literary cognitive dissonance or oversight?," "Cognitive pollution galore," and "It simply fails to remove the plausibility of that hypothesis." That's all in the past week or so.
The most recent exposé is Nick Brown Smelled Bull by Vinnie Rotondaro. The article concerns positive psychology and a theory positing that people with a positivity ratio greater than 2.9013 are destined to flourish, while those below that ratio are destined to fail.
The theory was well credentialed. Now cited in academic journals over 350 times, it was first put forth in a 2005 paper by Barbara Fredrickson, a luminary of the positive psychology movement, and Marcial Losada, a Chilean management consultant, and published in the American Psychologist, the flagship peer-reviewed journal of the largest organization of psychologists in the U.S.

The debunking of the research on which the theory was constructed is laid out in The Complex Dynamics of Wishful Thinking: The Critical Positivity Ratio by Nicholas J. L. Brown, Alan D. Sokal, and Harris L. Friedman. The abstract goes to the heart of the issue.
We examine critically the claims made by Fredrickson and Losada (2005) concerning the construct known as the “positivity ratio.” We find no theoretical or empirical justification for the use of differential equations drawn from fluid dynamics, a subfield of physics, to describe changes in human emotions over time; furthermore, we demonstrate that the purported application of these equations contains numerous fundamental conceptual and mathematical errors. The lack of relevance of these equations and their incorrect application lead us to conclude that Fredrickson and Losada’s claim to have demonstrated the existence of a critical minimum positivity ratio of 2.9013 is entirely unfounded. More generally, we urge future researchers to exercise caution in the use of advanced mathematical tools such as nonlinear dynamics and in particular to verify that the elementary conditions for their valid application have been met.

So a fairy-tale theory with no supporting evidence is cited 350 times in the field as a foundational paper.
I know these researchers are specialized practitioners in their fields, and that they have spent far more time thinking about their subject and their experiments than I have, so when I see something that appears to be so slipshod, I have to ask: what am I missing? All too often, it turns out, I am not missing anything. Our system of peer review and public critique of ideas is simply not functioning as it should.
Richard Feynman spoke of this issue, in which people generate incorrect information (cognitive pollution) to bolster their own belief sets, increase their status, or attract grant funding, all regardless of scientific integrity. His term was Cargo Cult Science. From Cargo Cult Science by Richard Feynman:
I think the educational and psychological studies I mentioned are examples of what I would like to call cargo cult science. In the South Seas there is a cargo cult of people. During the war they saw airplanes land with lots of good materials, and they want the same thing to happen now. So they've arranged to make things like runways, to put fires along the sides of the runways, to make a wooden hut for a man to sit in, with two wooden pieces on his head like headphones and bars of bamboo sticking out like antennas--he's the controller--and they wait for the airplanes to land. They're doing everything right. The form is perfect. It looks exactly the way it looked before. But it doesn't work. No airplanes land. So I call these things cargo cult science, because they follow all the apparent precepts and forms of scientific investigation, but they're missing something essential, because the planes don't land.
Now it behooves me, of course, to tell you what they're missing. But it would be just about as difficult to explain to the South Sea islanders how they have to arrange things so that they get some wealth in their system. It is not something simple like telling them how to improve the shapes of the earphones. But there is one feature I notice that is generally missing in cargo cult science. That is the idea that we all hope you have learned in studying science in school--we never say explicitly what this is, but just hope that you catch on by all the examples of scientific investigation. It is interesting, therefore, to bring it out now and speak of it explicitly. It's a kind of scientific integrity, a principle of scientific thought that corresponds to a kind of utter honesty--a kind of leaning over backwards. For example, if you're doing an experiment, you should report everything that you think might make it invalid--not only what you think is right about it: other causes that could possibly explain your results; and things you thought of that you've eliminated by some other experiment, and how they worked--to make sure the other fellow can tell they have been eliminated.
Details that could throw doubt on your interpretation must be given, if you know them. You must do the best you can--if you know anything at all wrong, or possibly wrong--to explain it. If you make a theory, for example, and advertise it, or put it out, then you must also put down all the facts that disagree with it, as well as those that agree with it. There is also a more subtle problem. When you have put a lot of ideas together to make an elaborate theory, you want to make sure, when explaining what it fits, that those things it fits are not just the things that gave you the idea for the theory; but that the finished theory makes something else come out right, in addition.