Our results suggest that in the first two years of its implementation the program had no effects on math and reading scores.

Interesting, but that is pretty consistent with most field observation results I have seen. Technology qua technology makes no difference. It depends on who uses it, when they use it, how they use it, for what purpose they use it, and so on.
What caught my eye was an almost throwaway line, my emphasis added:
The zero effect could be explained by the fact that the program did not involve compulsory teacher training and that laptops in class are mainly used to search for information on the internet.

Untrained teachers: sure, I understand that effect.
The implication of the computers being used primarily to search for online information is more interesting. It seems to suggest that better access to more information does not, in fact, improve either reading or math scores. Reading is primarily a learned skill, thinking quantitatively is a learned skill, and test taking is a learned skill. Nonetheless, I would have been fairly confident that better access to more information would have shown up in the capability scores as at least a marginal contributor to outcomes.
But it does not. Access to information has no measurable predictive value for performance outcomes.
It could be a fluke, but it is worth keeping that implication in mind.