Friday, January 31, 2014

Cognitive Pollution arising from a Spinozan system of assent

There are two separate points of interest here. From Bias, Assent, and the Psychological Plausibility of Rational Irrationality by Bryan Caplan.

One of the things I believe is critical, and that we currently do somewhat poorly, is integrating information across fields of study as well as between past and present. We rush forward and rarely integrate. I am constantly discovering a key study in an area to which I have devoted a great deal of thought and research, only to find something absolutely pertinent, full of great information, sitting in some remotely affiliated field of study with almost no natural connection to the topic, or published twenty years ago. Or both.

That is what has happened to Caplan. Caplan has just published The Myth of the Rational Voter, and the paper to which he alludes, Mental Contamination and Mental Correction: Unwanted Influences on Judgments and Evaluations by Timothy D. Wilson and Nancy Brekke, is highly pertinent. But he only came across it post-publication. And it is from twenty years ago.

Better late than never. There is so much cognitive gold out there waiting to be discovered.

The second point of interest is the argument advanced by Wilson and Brekke. One of my recurrent themes is the bane of what I have been calling cognitive pollution: acquired knowledge that has never been examined and yet is wrong or misunderstood in some material way. Without examination, that knowledge simply complicates all later decisions that might depend on it in some fashion.

Wilson and Brekke use the term mental contamination for a slightly different phenomenon. Here is the descriptive passage from their paper.
As noted by Gilbert (1991, 1993), there is a long tradition in philosophy and psychology, originating with Descartes, that assumes that belief formation is a two-step process: First people comprehend a proposition (e.g., "Jason is dishonest") and then freely decide whether to accept it as true (e.g., whether it fits with other information they know about Jason). Thus, there is no danger to encountering potentially false information because people can always weed out the truth from the fiction, discarding those propositions that do not hold up under scrutiny. Gilbert (1991, 1993) argued persuasively, however, that human belief formation operates much more like a system advocated by Spinoza. According to this view, people initially accept as true every proposition they comprehend and then decide whether to "unbelieve" it or not. Thus, in the example just provided, people assume that Jason is dishonest as soon as they hear this proposition, reversing this opinion if it is inconsistent with the facts.

Under many circumstances, the Cartesian and Spinozan systems end up at the same state of belief (e.g., that Jason is honest because, on reflection, people know that there is no evidence that he is dishonest). Because the second, verification stage requires mental effort, however, there are conditions under which the two systems result in very different states of belief. If people are tired or otherwise occupied, they may never move beyond the first stage of the process. In the Cartesian system, the person would remain in a state of suspended belief (e.g., "Is Jason dishonest? I will reserve judgment until I have time to analyze the facts"). In the Spinozan system, the person remains in the initial stage of acceptance, believing the initial proposition. Gilbert has provided evidence, in several intriguing experiments, for the Spinozan view: When people's cognitive capacity is taxed, they have difficulty rejecting false propositions (see Gilbert, 1991, 1993).
This jumps out at me in part because it reflects a common experience: arguing with people who are convinced of a particular proposition which they have never examined and which is simply factually not true, and who, when confronted with a discussion that forces examination, simply throw up their hands. Just yesterday I had someone tell me, paraphrasing, "Your data might be right but I don't have the energy or motivation to argue. We'll have to agree to disagree."

In that particular conversation, the argument had been advanced that "I have always attributed this to the fact that men coming home from the war discovered women quite competently taking their place in factories and workplaces. This meant that women had to be shamed into a retreat from the larger world. Women were, by and large, so glad to have their men back home that they were complicit in the shaming and the retreat." To which I responded with BLS data showing that women in fact had not retreated from the workforce: there was a multi-decadal increase in the female labor force participation rate (LFPR) from 1900 through the 1990s, and that LFPR increased during the decade of the 1940s, including the war years. There was a two-decade drop after the war, from 1946 to 1960, for one age cohort (20-24), reflecting the baby boom, but all other age cohorts increased their LFPR year by year through the nineties, and even the 20-24 cohort resumed its increase from 1960 onwards. So the central argument that women had retreated from the labor force was demonstrably wrong. But, as quoted above, that didn't matter, because the refutation contradicted a preferred, but unexamined, belief.

Here is the model Wilson and Brekke discuss regarding mental contamination and mental correction. If the default mode is acceptance of a proposition without examination, if it requires cognitive effort to examine and assess the accepted proposition, and if the proposition is pleasingly consistent with other existing unexamined assumptions, then, given the number of steps and the effort involved, it is easy to see why there is so much cognitive pollution out there.

Thursday, January 30, 2014

An idea so crazy it just might work!

Really just posting because it is such a great headline. Small, New University Does Something Radical -- Only Hires Professors Who Want To Teach And Only Admits Students Who Want To Learn by George Leef, which Glenn Reynolds at Instapundit tags as "An idea so crazy it just might work!"

This is interesting though. I was recently talking with a person about the future sales and marketing needs of institutions of higher learning.
Naturally, UMR isn’t for everyone. The attrition rate for the first class (only 57 students) was nearly 25 percent. Too much work (most students report that they devote at least 35 hours per week to their studies, outside of class) and too little fun for quite a few.

At that point, most college administrators would have started thinking, “How can we change the school to retain more students?” Instead, Chancellor Lehmkuhle decided to improve the school’s marketing to the kind of student who’d be a good fit for the serious intellectual environment. That’s apparently working and UMR has grown to 475 students.

Wednesday, January 29, 2014

Humanity and its cultural constructs are more enigmatic than much of the natural world

From Are There 'Laws' in Social Science? by Ross Pomeroy. I woke up this morning trying to arrive at definitions of when we know something to be true (also definitions of when we know something to be useful). I don't know where he gets it, but I like Pomeroy's quoted definition of a scientific law.
A scientific law is "a statement based on repeated experimental observations that describes some aspect of the world. A scientific law always applies under the same conditions, and implies that there is a causal relationship involving its elements."
A scientific law is both testable and usable for forecasting, two critical elements often absent from the domain of the social sciences.

At the end of the article.
The reason why social science and its purveyors often gets such a bad rap has less to do with the rigor of their methods and more to do with the perplexity of their subject matter. Humanity and its cultural constructs are more enigmatic than much of the natural world. Even Feynman recognized this. "Social problems are very much harder than scientific ones," he noted. Social science itself may be an enterprise doomed, not necessarily to fail, just to never fully succeed. Utilizing science to study something inherently unscientific is a tricky business.
I'd argue that it gets a bad rap because its subject is indeed much more complex and yet it uses much less rigorous methods despite the greater need for them. The hard sciences have disgracefully high levels of research retraction and non-replication, but that is essentially a consequence and cost of exploring the frontier of knowledge. As bad as that situation is, the social sciences are far worse: few studies, rarely rigorous, and fixed opinions either unsupported at all or supported only with inconclusive and badly designed studies.

Mexico as an explanation of charter schools

This is interesting. From Pritchett on Private vs. Government Schools by Bryan Caplan.

The source for Caplan's blog post is a new book, The Rebirth of Education by Lant Pritchett. Two observations from Pritchett regarding education experience in the global context.
This isn't to say that across the board, private schooling is better than that available in government-run schools; in general, the evidence that private schools outperform government schools in well-functioning systems is weak. In the United States, where there has been the opportunity to do the most rigorous experimental studies, most researchers agree that the private sector edge in learning is nothing like a full effect size [1 standard deviation], almost certainly not even a tenth of an effect size, and some legitimately dispute whether the private sector causal impact is even positive.

[snip]

Broader than just the success of specific interventions inside government schools is the observation that even in low-performing government systems one finds excellent schools, but also, even nearby and even operating under apparently exactly the same conditions, terrible schools... The problem is not that government schools cannot succeed, for in nearly all developing countries some of the very best schools are government schools. The problem is, as the LEAPS study authors emphasize, "when government schools fail, they fail completely"...
Caplan then points out the experience of Mexico.
Case in point: In Mexico, "essentially all of the weakest-performing schools - those more than 100 points [2 standard deviations] below the average - are government schools."
I wonder if this data doesn't suggest the answer to the paradoxical experience of charter schools in the US. Charter schools work within the confines of the public school board but are given significant autonomy. They usually have to take all that apply or use a lottery system, i.e. they are not allowed to select students.

Charter schools have been the great hope for cracking the demonstrable failure of many, usually urban, school systems. There has been much good will and many good-faith efforts, and there are some very distinctive successes. But if you take all the charter schools together as a population, rather than cherry-picking the successes, the results are much more mixed. But parents still love them (and governments and teachers unions still hate them). Why?

My speculation had been along two lines of thought. 1) Parents love them because of the opportunity for greater involvement and influence than is usually possible in a bureaucratic and anonymous city system, and 2) charters are still operated within the broad guidelines and requirements of the school board - they have some latitude but not complete freedom, consequently there are likely some set limits to how much improvement might be possible.

Pritchett's work suggests an alternate or additional explanation. Perhaps the issue is not so much about average performance levels as about standard deviations. I would wager that charter schools, like the private schools in Mexico, have a smaller standard deviation than the public schools. While the charter schools might not have much of an advantage in terms of the average results they achieve, what they secure against is the risk of catastrophic failure (as exemplified in No books, no clue at city's worst school by Susan Edelman or the systemic teacher cheating scandals in Atlanta, Washington, D.C., Philadelphia and elsewhere).

So if your goal is to ensure that your child gets a good education, the average scores aren't all that different between public, charter and private schools. But if your goal is to ensure that your child avoids getting no education, then your choices are private and charter. Hence the sustained support for charters even though the academic results are only marginally better. Private and charter schools simply have a smaller measured standard deviation, and that in itself can be immensely valuable.
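The value of a smaller standard deviation is easy to illustrate with a toy simulation. All the numbers below are hypothetical, chosen only to mirror Pritchett's 2-standard-deviation failure threshold: two school populations with the same average score, one with twice the spread of the other.

```python
import random

# Illustrative only: hypothetical score distributions with the same mean
# but different spreads, showing how a smaller standard deviation
# slashes the risk of catastrophic failure even when averages are equal.
random.seed(0)

MEAN = 500        # hypothetical average school score
FAILURE = 300     # "catastrophic failure" cutoff: 2 SD below the mean for the wide case
N = 100_000       # simulated schools per scenario

def failure_rate(sd):
    """Fraction of simulated schools scoring below the failure cutoff."""
    draws = (random.gauss(MEAN, sd) for _ in range(N))
    return sum(1 for x in draws if x < FAILURE) / N

print(f"Wide spread   (SD=100): {failure_rate(100):.1%} fail catastrophically")
print(f"Narrow spread (SD=50):  {failure_rate(50):.3%} fail catastrophically")
```

With identical means, the wide-spread population fails catastrophically roughly 2% of the time (the familiar 2-sigma tail), while the narrow-spread population essentially never does. That asymmetry, not the averages, is what a parent choosing a school is buying.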

The corollary question is: why do private and charter schools have a smaller standard deviation? I am guessing that it is a combination of greater accountability, greater transparency, greater competition and greater customer choice (a form of accountability). These are standard marketplace disciplines which are frequently absent from government systems. The further implication is that public schools don't necessarily need charter schools so much as they need greater transparency, accountability, competition and choice.



The quintessentially unscientific attitude of regarding those who question the ideology as enemies to be suppressed

In Praise of Passivity by Michael Huemer - Well worth reading.

Huemer opens with a powerful example of his thesis.
In 1799, America’s first President, George Washington, fell ill with what is now thought to have been an infection of the epiglottis in his throat, a rare but serious condition that can lead to blockage of the airway and eventual suffocation. His good friend and personal physician attended him, along with two consulting physicians. Medicines and poultices were tried, along with five separate episodes of bloodletting that together removed over half of Washington’s blood. As one contemporaneous account explained, “The proper remedies were administered, but without producing their healing effects.” The former President died shortly thereafter. Needless to say, his treatment either had no effect or actually hastened the end.

Washington’s doctors were respected experts, and they applied standard medical procedures. Why were they unable to help him? Put simply, they could not help because they had no idea what they were doing. The human body is an extremely complex mechanism. To repair it generally requires a detailed and precise understanding of that mechanism and of the nature of the disorder afflicting it–knowledge that no one at the time possessed. Without such understanding, almost any significant intervention in the body will be harmful.
And concludes just as powerfully.
Popular wisdom often praises those who get involved in politics, who vote in democratic elections, fight for a cause they believe in, and try to make the world a better place. We tend to assume that such individuals are moved by high ideals and that, when they change the world, it is usually for the better.

The clear evidence of human ignorance and irrationality in the political arena poses a challenge to the popular wisdom. Lacking awareness of basic facts of their political systems, to say nothing of the more sophisticated knowledge that would be needed to reliably resolve controversial political issues, most citizens can do no more than guess when they enter the voting booth. Far from being a civic duty, the attempt to influence public policy through such arbitrary guesses is unjust and socially irresponsible. Nor have we any good reason to think political activists or political leaders to be any more reliable in arriving at correct positions on controversial issues; those who are most politically active are often the most ideologically biased, and may therefore be even less reliable than the average person at identifying political truths. In most cases, therefore, political activists and leaders act irresponsibly and unjustly when they attempt to impose their solutions to social problems on the rest of society.

Perhaps the most dramatic example is that of Karl Marx, who famously commented that “The philosophers have only interpreted the world, in various ways; the point, however, is to change it.”[24, p. 145] Marx’s greatest legacy is the practical demonstration, through twentieth-century history, of the consequences of changing a world that one does not understand. This is not the place to detail his misunderstandings, which have been discussed at great length by others. Let it suffice to say that despite the seriousness with which generations of intellectuals around the world have studied his works, Karl Marx’s understanding of human beings and of society was minimal. His influence on the twentieth century world, however, was unparalleled–and, as most observers now recognize, almost unbelievably malignant. This is no mere accident. When one lacks a precise and detailed understanding of a complex system, any attempt to radically improve that system is more likely to disrupt the things that are working well than it is to repair the system’s imperfections. Marx’s failure to improve society should have been about as surprising as the failure of George Washington’s doctors to cure his infection by draining his blood.

Perhaps, one may hope, human beings will one day attain a scientific understanding of society comparable to the modern scientific understanding of most aspects of the natural world. On that day, we may find ways of restructuring society to the benefit of all. But we cannot now predict what that understanding will look like, nor should we attempt to implement the policies that we guess will one day be proven to be beneficial. In the meantime, we can anticipate many pretenders to scientific accounts of society, after the style of Marxism. These will be theories resting on dubious premises that only certain political ideologues find convincing. These ideologues may, as in the case of the Marxists, adopt the quintessentially unscientific attitude of regarding those who question the ideology as enemies to be suppressed.

Political leaders, voters, and activists are well-advised to follow the dictum, often applied to medicine, to “first, do no harm.” A plausible rule of thumb, to guard us against doing harm as a result of overconfident ideological beliefs, is that one should not forcibly impose requirements or restrictions on others unless the value of those requirements or restrictions is essentially uncontroversial among the community of experts in conditions of free and open debate. Of course, even an expert consensus may be wrong, but this rule of thumb may be the best that such fallible beings as ourselves can devise.
And there is substance and value in virtually every paragraph in between.

Tuesday, January 28, 2014

The bear was identified as an honest American species

An incident recounted in a book review, Dropping the Screwdriver by Alex Goodall in the December 2013/January 2014 edition of the Literary Review. The subject book is a history of the many brushes with operational disaster during the nuclear cold war, Command and Control by Eric Schlosser. The reviewer's incident:
I can vividly remember, as an undergraduate in the late 1990s, my introduction to the history of nuclear geopolitics. While working my way through a pile of texts on the Cuban Missile Crisis, I began compiling a list of misunderstandings and mistakes that could have led to accidental nuclear conflagration had things turned out differently. Although not the most serious incident, one that sticks in my mind involved a black bear that stumbled onto an air defence command post in Duluth, Minnesota. A guard saw a shadowy figure attempting to climb the security fence, shot it, then activated an intruder alarm. Due to the wrong alarm being activated at nearby Volk Field Air Base, this caused an order to be issued to scramble nuclear-armed F-106A interceptors to repel a Soviet attack. Fortunately, at the last moment, the bear was identified as an honest American species and the order rescinded.

So much for the presumption of innocence

I am researching the variant approaches in different fields (medical, judicial, etc.) in determining what constitutes sufficient evidence to draw a conclusion. In that process I came across this rather disturbing piece, Burden of Proof as a Legal Fiction: One Year Later by J. Bennett Allen.

The substance of the article is anchored on the following graph, which maps three things: 1) the beyond-a-reasonable-doubt standard, 2) the assessed quality of evidence by jurors and judges, and 3) the conviction rates.

Yikes! Even where the evidence strongly favors the defense, 30% of the time both the judge and jury conclude that the accused is guilty. It rises to a 60% conviction rate when the evidence is balanced between defense and prosecution. So much for the presumption of innocence. One has to fervently hope there is some methodological fault in the study.

The total of stranger homicides could thus be anywhere from 29 to 211

Numbers are always a proxy for reality and you have to be careful not to lose sight of that distinction. All too often journalists fall prey to simplistic readings. From “Of the 334 murders in New York City in 2013, it appears only 29 victims did not know their killer” by Eugene Volokh.
So report some news outlets (e.g., CBS New York), quoting a New York Post story, which likewise makes a similar claim. But the key is in a sentence a bit lower down: “Not all the murder cases have been solved, though, so the number could go higher.”

Could it ever! According to the New York Post story that the CBS story cites, “Police solved 152 homicides in 2013,” out of a total of 334. That means 182 homicides weren’t accounted for, and the total of stranger homicides could thus be anywhere from 29 to 211 (29+182).

I suspect that stranger homicides are more common among the 182 unsolved homicides than they are among the 152 solved ones — in a non-stranger homicide, the killers tend to be easier to identify, precisely because they come from a pool of the victim’s family members, friends, and acquaintances (though note that, as the New York Post mentions, “[a]n acquaintance can include a rival gang member”). The total number of stranger homicides in New York City is thus likely to be a good deal higher than 29, and perhaps closer to 211.
Reading the headline, one is tempted to conclude that 91% (305/334) of murder victims knew their murderer. The reality is that the headline should have been that At Least 37% (123/334) of Victims Knew Their Murderer — 152 cases solved, minus the 29 stranger homicides. That's a lot less dramatic than the original headline.
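Volokh's bounds follow from simple subtraction. A quick arithmetic sketch, using the figures from the Post story:

```python
# Figures from the New York Post story cited above.
total_murders   = 334
solved          = 152
stranger_solved = 29   # solved cases where the victim did not know the killer

unsolved     = total_murders - solved            # 182 cases unaccounted for
known_solved = solved - stranger_solved          # 123 victims known to have known their killer

# Every unsolved case could be either kind, so the true stranger count
# is only bounded, not known.
stranger_min = stranger_solved                   # all unsolved were non-stranger
stranger_max = stranger_solved + unsolved        # all unsolved were stranger

print(f"Stranger homicides: between {stranger_min} and {stranger_max}")
print(f"Victims known to have known their killer: "
      f"{known_solved}/{total_murders} = {known_solved / total_murders:.0%}")
```

The data only establish a floor of 123 non-stranger homicides out of 334, which is where the roughly one-in-three figure comes from; everything beyond that is inference about the 182 unsolved cases.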

This has greater consequence than simple journalistic sloppiness. If it were true that 91% of murder victims knew their murderer, then it would imply that the city is really safe and you just have to pick your friends carefully (which would be true anyway). Knowing only that at least 37% of murder victims knew their murderer reinforces that you have to be vigilant and cannot let your guard down with friends or strangers.

Is this simply an issue of low journalistic standards, or is there a reason that the paper and TV want to convey a greater degree of security than actually exists? These days? Who knows.

UPDATE: On reflection, I don't think I elucidated the train of thought enough. The media elected to cast the story in a way that gave the impression that 91% of the time, murder victims know their murderer, whereas the actual numbers tell us, at best, that at least 37% of the time, murder victims knew their murderer. We should keep in mind that homicides are a vanishingly small cause of death in the US.

There is probably at least a 70% chance that this misrepresentation is a function of journalist and editorial innumeracy. It happens a lot. But take the alternate hypothesis: that it was deliberately misrepresented. What would explain that? Why would a journalist or editor wish to create the impression that danger comes from one's nearest and dearest rather than from strangers?

Here is my Just So story explanation. If your assailant is most often known to you, then your mitigating strategy is to better select and monitor your friends, family and acquaintances. This creates an illusion of control. In theory, it also creates a powerful community self-policing incentive. If the danger is from people you know, then keep an eye on them, watch out for their well-being. A nice story.

On the other hand, if the danger you face is largely anonymous and random, then you have different mitigating strategies which primarily entail aspects of self-defense such as physical training in a martial art, security systems and guns. Danger from strangers cultivates self-reliance.

Given the well documented political affiliations of journalists, it is easy to see that they are much more likely to wish to push the story that cultivates community rather than guns and self-reliance. So maybe that is the explanation. Sounds too nuanced to me though. I'll go with the 70% probability that it is simply innumeracy.

The need to talk about careful collection of all types of data

From Anecdotes and Simple Observations are Dangerous; Words and Narratives are Not by Heather Lanthorn.
An excellent piece on this (though there are plenty of manuals on qualitative data collection and analysis) is by Lincoln and Guba. They suggest that ‘conventional’ rigor addresses internal validity (which they take as ‘truth value’), external validity, consistency/replicability and neutrality. (The extent to which quantitative research in the social sciences fulfils all these criteria is another debate for another time.) They highlight the concept of ‘trustworthiness’ – capturing credibility, transferability, dependability and confirmability – as a counterpart to rigor in the quantitative social sciences. It’s a paper worth reading.
Plus purposeful pursuit of disconfirming evidence.

The author is arguing that qualitative evidence should be incorporated more systematically in pursuit of knowledge and truth. She doesn't shrink from the historical shortcomings and challenges associated with qualitative data.
Regardless of what types of data are being collected, representativeness is important to being able to accommodate messiness and heterogeneity. If a research team uses stratification along several criteria to select its sample for quantitative data collection (or intends to look at specific heterogeneities/sub-groups for the analysis), it boggles my mind why those same criteria are not used to select participants for qualitative data. Why does representativeness so often get reduced to four focus groups among men and four among women? Equally puzzling, qualitative data are too often collected only in the ‘treated’ groups. Why does the counterfactual go out the window when we are discussing open-ended interview or textual data?

Similarly, qualitative work has a counterpart to statistical power and sample size considerations: saturation. Generally, when the researcher starts hearing the same answers over and over, saturation is ‘reached.’ A predetermined number of interviews or focus groups does not guarantee saturation. Research budgets and timetables that take qualitative work seriously should start to accommodate that reality. In addition, Lincoln and Guba suggest that length of engagement – with observations over time also enhancing representativeness – is critical to credibility. The nature of qualitative work, with more emphasis on simultaneous and iterative data collection and analysis can make use of that time to follow up on leads and insights revealed over the study period.
While the author sets out to advocate a particular position (include more qualitative information in experiments), her article is actually a reasonably good critique of all data collection, whether quantitative or qualitative, and of how knowledge is shortchanged and misled by poorly designed experiments.
In general, if we want to talk about creating credible, causal narratives that can be believed and that can inform decision-making at one or more levels, we need to talk about (a) careful collection of all types of data and (b) getting better at rigorously analysing and incorporating qualitative data into the overall ‘narrative’ for triangulating towards the ‘hard’ truth, not equating qualitative data with anecdotes.

Monday, January 27, 2014

Yet international migrants' total share of world population has been relatively constant at 2 to 3 per cent for decades

From Entrance Strategies a review by Eric Kaufmann of a couple of books dealing with culture and emigration. From the December 2013/January 2014 edition of the Literary Review.
The cultural cords - language or religion, for instance - that bind people together over time and place are selected by each generation. The symbolic menu is potentially infinite. Yet choices are constrained by the dishes chosen by previous generations. It would take a hard, multigenerational slog for an elite to convince British people to adopt Taoism or the German language as symbols of their national identity - it's much easier to stick with Christianity and English. Immigration of enough determined German Taoists, though, especially if they were resistant to English charms, could bring change. It's happened before. As Eugene Kulischer has shown, in AD 900 Berlin had no Germans, Moscow no Russians, Budapest no Hungarians, Madrid was Moorish and Constantinople had few Turks. More recently, Israel, Lebanon and Kosovo furnish examples of how migration can drive political change.

The collision between immigration and national identity is defining our epoch. This will only accelerate in the decades to come. The developing world produces 97 per cent of world population growth and is in the early stages of its demographic transition. The rich world is ageing and declining in native population. The demographic difference will peak in 2050 as economies converge: poverty and excess births in one region; wealth and birth dearth in another. Economic theory would suggest population should flow from the poor tropics to the temperate zones. Yet international migrants' total share of world population has been relatively constant at 2 to 3 per cent for decades.