Continuing the idea from a recent post, which attempted to get some sense of how well states are performing academically given their various demographic profiles, on sagacious advice I included math and reading results for the fourth and eighth grades from the NAEP of 2003 and 2007. Instead of simply looking at point changes (both the math and reading tests are scored on a 500-point scale), I gauged average state scores in terms of standard deviations, assuming a normal distribution. I also looked at the same for whites exclusively.
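The standardization step can be sketched in a few lines. The state averages below are hypothetical placeholders, not actual NAEP figures; the point is just the conversion from scale points to standard deviations:

```python
# Sketch of expressing state averages in standard deviations rather than
# raw scale points. The scores here are made up for illustration.
import statistics

# Hypothetical state averages on the 500-point NAEP scale
scores = {"Massachusetts": 252, "Minnesota": 247, "Texas": 242, "Mississippi": 228}

mean = statistics.mean(scores.values())
sd = statistics.stdev(scores.values())

# Each state's average as standard deviations from the cross-state mean
z = {state: (s - mean) / sd for state, s in scores.items()}
for state, val in sorted(z.items(), key=lambda kv: -kv[1]):
    print(f"{state}: {val:+.2f}")
```

Scores expressed this way are directly comparable across tests and years, which raw point changes are not.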
For all students, 'improvement' (how a state's fourth graders fare in comparison to its eighth graders, with relatively better eighth-grade performance taken as indicative of teaching effectiveness over time, and relatively poorer performance as 'negative improvement', or deterioration) is pretty consistent. The improvement of the '03 cohort correlates with that of the '07 cohort at .57.
The four-year time interval also allows fourth graders in '03 to be tracked in '07 when the same kids are in eighth grade. It's not a perfect trace, as some families move across state lines and others elect to enter private school at some point during those years. But it still provides a nice proxy.
The improvement of the eighth grade class of '07 relative to how they did as fourth graders in '03 correlates with the improvement of the fourth and eighth grade classes of '03 at a solid .71. It would be optimal to have data from 1999 to see how the eighth grade class of '03 improved over time, but those data are not available; using the '03 numbers of eighth graders still provides a good estimate, given the firm relationships mentioned previously. With this consistency, the class of '07 seems an appropriate measure to use in ranking improvement by state over time.
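The improvement measure and the cross-cohort consistency check might look something like the following sketch. Each state's improvement is its eighth-grade z-score minus its fourth-grade z-score, and consistency is a Pearson correlation between the two cohorts' improvement vectors. All the state averages here are hypothetical:

```python
# Sketch of the 'improvement' measure: eighth-grade z-score minus
# fourth-grade z-score, compared across cohorts via Pearson correlation.
# All numbers are hypothetical placeholders, not actual NAEP data.
import statistics

def zscores(vals):
    m, s = statistics.mean(vals), statistics.stdev(vals)
    return [(v - m) / s for v in vals]

def improvement(fourth, eighth):
    # Positive = a state did relatively better in eighth grade than fourth
    return [e - f for f, e in zip(zscores(fourth), zscores(eighth))]

def pearson(x, y):
    mx, my = statistics.mean(x), statistics.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / (statistics.stdev(x) * statistics.stdev(y) * (len(x) - 1))

# Hypothetical averages for five states: the '03 cohort (4th and 8th in '03)
# and the tracked cohort (4th in '03, 8th in '07)
fourth_03 = [236, 242, 229, 247, 233]
eighth_03 = [277, 280, 269, 290, 271]
fourth_07 = [239, 244, 231, 250, 235]
eighth_07 = [280, 284, 271, 294, 275]

imp_03 = improvement(fourth_03, eighth_03)
imp_07 = improvement(fourth_03, eighth_07)
print(f"cross-cohort improvement correlation: {pearson(imp_03, imp_07):.2f}")
```

Because improvement is a difference of z-scores, it nets out a state's absolute level of performance and isolates relative movement between the two grades.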
There does not appear to be a trade-off in improving math scores at the cost of reading scores, or vice versa. The improvements in the two subject areas correlate positively, at .64 in '07 and .61 in '03.
Curiously, the same relationships hold when only white students are considered, but they are moderately weaker. The correlation of the '03 and '07 cohorts is .53 (compared to .57 for all students), while the improvement of the class of '07 relative to the '03 eighth graders correlates at .67 (versus .71 for all). The improvement of the all-races class of '07 doesn't correlate with a state's racial composition at any level of statistical significance (the p-value is .27).
The improvement of the '07 class of eighth graders (all races), by state, in standard deviations, follows. Keep in mind that it is a state's improvement in performance relative to other states for its eighth graders in '07 compared to its fourth graders in '03 that forms the rankings. We're essentially looking at a state's rate of self-improvement over the middle four years of schooling:
1. District of Columbia: .66
The upper Midwest (excepting Michigan) does pretty well, although the trend is not overwhelming. The overseas schools that teach military brats also look good. Not much of a pattern emerges, though. The public school populations of DC and Massachusetts are about as demographically and economically dissimilar as it gets, yet they occupy the top two spots.
This lends some credence to the idea that the differences are, broadly, a result of what goes on in the classroom. The differences in improvement, pretty consistent over at least the last eight years, are not meaningfully tied to demographic composition or affluence (whereas absolute performance very clearly is).
Whatever the driver, three putatively 'crucial' attributes--the student-teacher ratio, expenditures per student, and average teacher salaries (even after making cost-of-living adjustments for the latter two)--do not correlate with improvement in any meaningful way (.09, .05, .03, respectively, all without statistical significance). To the extent that a minuscule relationship does exist, the trends for all three are in the expected directions (lower student-teacher ratio and more money for students and teachers all weakly correlate with greater improvement).
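A quick way to see why correlations that small fail to reach significance: convert r to a t-statistic, where anything with |t| below roughly 2 is not significant at the 5% level for a sample of this size. The sample size of 51 (50 states plus DC) is my assumption, not something stated above:

```python
# Rough significance check: convert a Pearson r to a t-statistic.
# n = 51 (50 states plus DC) is an assumed sample size.
import math

def t_stat(r, n):
    return r * math.sqrt(n - 2) / math.sqrt(1 - r * r)

n = 51
for label, r in [("student-teacher ratio", 0.09),
                 ("spending per student", 0.05),
                 ("teacher salaries", 0.03)]:
    t = t_stat(r, n)
    print(f"{label}: r = {r:.2f}, t = {t:.2f} (|t| < 2, not significant)")
```

For n = 51, a correlation would need to be around .28 or larger before it cleared the conventional significance threshold, so all three of these fall well short.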
It would be interesting to line up the results against metrics for difficulty in obtaining a teaching certificate (something like this, but quantifiable) and the vigor with which teacher performance is evaluated. John Stossel, in his special Stupid in America, for example, singled out New York City's school system (New York is near the bottom of the list) as being particularly absurd, refusing to fire even the most incompetent or disturbed educators (if memory serves, only two had been let go over the time period he looked at). Massachusetts, in contrast, has come under fire recently for its rigorous licensing requirements, for which over half of aspiring black and Hispanic teachers fail to make the grade.
Is there a known source for that kind of information? Do other quantifiable attributes that might relate to better educational methods come to mind?