Sunday, September 16, 2012

Subject to well-designed large-scale replications, those promising signs attenuate and often evaporate altogether

From Response: Weighing the Evidence by Charles Murray.

Murray homes in on the overwhelming tendency to focus on whether a study produces the result one is interested in, rather than on what ought to be the first order of business: to what degree does the study follow rigorous rules of design, execution, and evaluation that would make its outcomes reliable rather than merely indicative? Too many studies are whimsies, essentially cognitive vaporware - Sturm und Drang but no reliable substance.
Toward the end of his career, sociologist Peter Rossi, a dedicated progressive and the nation’s leading expert on social program evaluation from the 1960s through the 1980s, summarized his encyclopedic knowledge of the evaluation literature with his “metallic laws.” Rossi’s iron law was that “the expected value of any net impact assessment of any large scale social program is zero.” His stainless steel law was that “the better designed the impact assessment of a social program, the more likely is the resulting estimate of net impact to be zero.” To me, the experience of early childhood intervention programs follows the familiar, discouraging pattern that led him to formulate his laws: small-scale experimental efforts staffed by highly motivated people show effects. When they are subject to well-designed large-scale replications, those promising signs attenuate and often evaporate altogether.
