Tuesday, July 29, 2014

Sins of information omission


From Kids Count 25th Edition, 2014 Data Book from The Annie E. Casey Foundation. I am looking for empirical data about what works and what doesn't with regard to children and reading. With reports such as these you always have to be careful because, as well-intentioned as they are, they are usually fundamentally dishonest. They have an agenda they are pushing and they bend the data to support that agenda.

Because they are basically altruistic, they tend to look at the benefits of any given action and never the costs, an approach which makes for good press but which doesn't work in the real world of constraints and limits.

I approach the report with some due skepticism, prepared to find the gold amongst the dross. I am somewhat taken aback to have the concern so immediately confirmed. I turn to page 26, Education, to look for any information related to reading.

The opening three paragraphs are:
High-quality preschool matters, which is good news for the 50,000 low-income New Jersey children who benefit each year from a state-funded effort. In 1999, the state began enrolling 3- and 4-year-olds in high-quality preschool across the state's highest poverty districts. The program now serves about 80 percent of preschool-aged children in those districts.

A recent evaluation found that by fifth grade, children who attended the state program for two years were, on average, nearly a year ahead of students who had not enrolled in the program. These positive effects were considerably larger than those found in programs with less funding. Small classes, well-trained teachers, a curriculum with high standards and support services for children and families contributed to this program's success.

Advocates for Children of New Jersey played a key role in bringing early care and learning advocates together to develop a mixed-delivery system that improved the quality of community-based child care centers, while utilizing some public school classrooms. The organization led a coalition of early childhood stakeholders who successfully forced the state to require that preschool teachers have a bachelor's degree and receive the resources to acquire the necessary education. Those benefits to teachers are giving children a good start.
Sounds pretty good.

But is it? My sense of concern is triggered on two counts. First, this result is quite inconsistent with the repeated studies of Head Start, the federal program with a structure and goals similar to New Jersey's. Several times in the past fifteen years, Head Start reviews have come back with the finding that while there are positive improvements while a child is enrolled in the program, those improvements have completely disappeared within two years. Ten billion dollars spent a year with no gain. But perhaps New Jersey is implementing its program more rigorously.

That's possible, but that brings us to the second trigger. The description has lots of words but is missing three crucial elements: 1) How much does it cost? 2) What is the goal? and 3) How effective is the program in achieving the goal? No measurements appear.

They are claiming that there are measurable sustained benefits accruing to children five years after the pre-school investment. If true, that is a real accomplishment. But what about the costs and goals?

How much does the Abbott preschool program cost? Googling turns up a number of documents that seem to indicate that the current cost to the state is roughly $12,000 per student. With some 50,000 enrolled, therefore, the program costs $600 million a year.
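The cost arithmetic above can be sketched in a few lines (the per-student and enrollment figures are the rough estimates cited in the text, not official budget numbers):

```python
# Back-of-the-envelope cost estimate for the Abbott preschool program.
# Both inputs are the approximate figures cited above, not official budget lines.
cost_per_student = 12_000   # rough annual cost to the state, per student
enrollment = 50_000         # approximate number of enrolled students

annual_cost = cost_per_student * enrollment
print(f"Estimated annual cost: ${annual_cost:,}")  # Estimated annual cost: $600,000,000
```

The point of writing it out is only to show how sensitive the headline number is: move either input by a modest amount and the total shifts by tens of millions.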

What is their goal? I haven't found a succinct and firm statement of goals. Most of the formulations center on "close the achievement gap." There are two problems with this formulation. Close can mean either narrow the gap or eliminate it. Under the first reading, any improvement, no matter how small, would signal achievement of the goal. Since that would be fairly trivial, I am going to work with the second interpretation: that the Abbott program will eliminate the performance gap between the advantaged and the disadvantaged.

How effective is the program in closing the gap? You have to go to the Abbott Preschool Program Longitudinal Effects Study: Fifth Grade Follow-Up to find that answer, and you have to read pretty closely even then. Two caveats, though. First, the program enrolls only 80% of the targeted disadvantaged population. Would the achieved results be the same if the entire population were enrolled? Second, they did not measure the same population between preschool and fifth grade. They were able to positively identify only about 65% of the initial group five years later. They assume that the missing 35% had the same performance attributes as the located 65%, but it is easy to conjecture why that might not be true. So what are the results?
The gains from two years [of participation in the Abbott Program] are equivalent to 20 to 40 percent of the achievement gap.
For $600 million, New Jersey is buying an average closure of 30% of the achievement gap for 52% of its disadvantaged students (65% x 80%).
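The effectiveness figures combine the same way (the 20–40% range and the 65%/80% coverage shares come from the text above; taking the midpoint of the range is my simplifying assumption):

```python
# Rough effectiveness calculation from the figures in the fifth-grade follow-up.
# Taking the midpoint of the reported 20-40% range is a simplifying assumption.
gap_closure_low, gap_closure_high = 0.20, 0.40
avg_gap_closure = (gap_closure_low + gap_closure_high) / 2  # midpoint: 0.30

enrolled_share = 0.80   # share of targeted disadvantaged children enrolled
measured_share = 0.65   # share of participants positively identified at fifth grade

effective_coverage = enrolled_share * measured_share        # 0.52
print(f"Average gap closure: {avg_gap_closure:.0%}")
print(f"Coverage actually measured: {effective_coverage:.0%}")
```

Note that the 52% figure is the slice of the disadvantaged population for which the 30% closure was actually observed, not the slice that benefited; the two questions come apart, which is the point of the next paragraph.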

So there are (at least) three questions. 1) How much of the improvement would still exist if you were able to measure 100% of participants instead of only 65%, and all students, not just the 80% enrolled? 2) Is it worth $600 million to close the gap by only 30% by fifth grade? And 3) Will there still be measurable benefits by graduation, and if so, how much?

The last is the real rub. If you invest $24,000 in a child over two years, you will only obtain a positive benefit if that child is more likely to graduate, more likely to attain higher education, more likely to be employed, and more likely to be employed in a higher-compensated profession than they would otherwise have been. That there is still some measurable benefit at fifth grade is a good thing, not achieved in any other similarly scaled program of which I am aware.

But we still don't know if it will make a difference in the long run.

All of which is to say that the three paragraph summary in the report sins by omitting critical information that provides a substantially different interpretation of whether and to what degree the program has been successful.
