A quick note on the super awesome state report card put out yesterday by Michelle Rhee and the folks at StudentsFirst:
With no states receiving an A, two states receiving B-minuses and 11 states branded with an F, StudentsFirst would seem to be building a reputation as a harsh grader.
Ms. Rhee said that the relatively weak showing reflected how recently statehouses had begun to address issues like tenure and performance evaluations. “We didn’t say in any way that we want to show people how bad it is,” she said in a telephone interview. “We wanted to show the progress that is being made, but in places where progress is slower to come, be very clear with leaders of that state what they could do to push the agenda forward and create a better environment in which educators, parents and kids can operate.”
The two highest-ranking states, Florida and Louisiana, received B-minus ratings. The states that were given F’s included Alabama, California, Iowa and New Hampshire. New Jersey and New York received D grades, and Connecticut a D-plus. The ratings, which focused purely on state laws and policies, did not take into account student test scores. [emphasis mine]

Considering how much stock SF puts in evaluating teachers by using standardized tests, you would think they'd be all for evaluating state policies by using standardized tests (which is much more in line with what the tests were designed to do in the first place).
But this brings up an interesting question: is there a correlation between getting a "good grade" from StudentsFirst and having students who perform well on national exams? Because that would be the ultimate test of whether SF's report has any merit whatsoever: if the policies they like don't lead to high-performing students, what's the point?
Here, then, is a comparison of the ranking SF gave each state (and D.C.) from 1 to 51, plotted against the ranking of each state on the latest National Assessment of Educational Progress (NAEP) for 8th Grade mathematics:
I can't label every data point as a state (long story), but I marked a few. See how Vermont, Montana, and North Dakota are in the top ten for math scores, but at the bottom of SF's rankings? See how Louisiana, Florida, and DC are at the top of SF's rankings, but at the bottom on NAEP scores? See how the top three states on 8th grade math - Massachusetts, Minnesota, and New Jersey (yeah!) - are spread out among SF's rankings?
Yes, there are states like Colorado, Utah, and West Virginia where the SF ranking is close to the NAEP ranking. But you'd expect to find a few of those even if there wasn't a correlation. Which is the point:
There is no relationship between SF's state rankings and rankings on national standardized tests.
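If you want to check this sort of claim yourself, the standard tool is Spearman's rank correlation: it's 1 when two rankings agree perfectly, -1 when they're reversed, and near 0 when they're unrelated. Here's a minimal sketch in Python; the rankings below are made up for illustration and are not the actual StudentsFirst or NAEP numbers:

```python
def spearman_rho(xs, ys):
    # Spearman's rank correlation for two rankings with no ties,
    # i.e. both inputs are permutations of 1..n.
    n = len(xs)
    d2 = sum((x - y) ** 2 for x, y in zip(xs, ys))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Hypothetical rankings for five states (1 = best) -- NOT real data
sf_rank   = [1, 2, 3, 4, 5]   # made-up StudentsFirst ranking
naep_rank = [4, 1, 5, 2, 3]   # made-up NAEP 8th-grade math ranking

print(round(spearman_rho(sf_rank, naep_rank), 2))  # → -0.1 (near zero: no relationship)
```

A rho near zero on the real 51-state data is exactly what the scatterplot above shows visually.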
Why, then, would anyone think that the prescriptions SF and Rhee are pushing would make any difference in getting students to "achieve"? Why are they pushing policies that have nothing to do with student outcomes?
Why is StudentsFirst pushing policies that have nothing to do with student achievement?
Uh, can I get back to you on that?
ADDING: It seems ridiculous to treat SF's "scores" for the states as any sort of precise measure. But, just for kicks, let's look at raw scores on the 8th Grade NAEP math test against SF's "GPA" for each state:
There's no relationship between the two. And I feel silly that I'm treating their "GPA" like it's a quantifiable measure of something meaningful. And even sillier that I put that trend line in there.
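For what it's worth, the trend-line exercise amounts to computing a Pearson correlation between the "GPA" and the raw scale scores. A sketch, again with invented numbers rather than the real data:

```python
def pearson_r(xs, ys):
    # Pearson product-moment correlation coefficient.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

gpa   = [2.7, 2.5, 1.0, 0.7, 1.3]   # hypothetical SF "GPA"s -- NOT real data
score = [273, 289, 301, 284, 295]   # hypothetical NAEP scale scores -- NOT real data

print(round(pearson_r(gpa, score), 2))
```

An r close to zero on the real data is what makes that trend line (and the "GPA" itself) so silly to take seriously.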
ADDING MORE: Here's Doug Henwood traveling down the same path, getting the same results I did.
UPDATE: Seems like everyone is hitting on this theme:
Laura Clawson at DKos
Jon Pelto, of course.
Leonie Haimson, naturally.
G.F. Brandenburg, who knows Rhee as well as anyone.
And, of course, Diane Ravitch.
And I hear there are more in the works...