There's been some good reporting on this: start with Dana Goldstein and Stephanie Simon, two journalists who regularly stop and think for a minute before they write. Would that all education writers followed their example; they might realize that...
- Effect sizes matter. The study shows a difference of .13 standard deviations in high school math and .06 standard deviations in middle school math. That is equivalent to moving a student from the 27th percentile to the 30th. The study uses the "x months of learning" conversion to say that's 2.6 months of learning; I find that to be misleading at best. Having four TFA teachers in a row isn't going to mean that a student will take calculus in their senior year instead of pre-calculus: these effects aren't necessarily cumulative.
What really should be reported is how many more questions the TFA students got right on the final tests. And it can't be many when an effect size is that small. Statistically significant? Sure. But practically significant? Come on. And I'm not even going to deal with the scaling problems here...
In addition, TFAers come from elite schools, which means they generally have good test-taking abilities. Are they able to impart those abilities to their students? And is that "real" learning, or simply learning how to beat the system?
In any case, the results here seem like a great candidate for a treatment from the Mountain-Out-Of-A-Molehill-INATOR: there's just not that much here to get excited about.
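The percentile conversion in the effect-size point above is easy to sanity-check with nothing but the standard normal distribution. This is a sketch, not anything from the study itself, and it assumes test scores are normally distributed:

```python
# Sanity check of the percentile shift implied by the reported effect
# sizes, assuming (standard) normally distributed test scores.
from statistics import NormalDist

nd = NormalDist()                  # standard normal: mean 0, sd 1

z_27 = nd.inv_cdf(0.27)            # z-score of a student at the 27th percentile
pct_hs = nd.cdf(z_27 + 0.13)       # after a 0.13 SD shift (high school math)
pct_ms = nd.cdf(z_27 + 0.06)       # after a 0.06 SD shift (middle school math)

print(f"high school:   {pct_hs:.1%}")   # roughly the 31st percentile
print(f"middle school: {pct_ms:.1%}")   # roughly the 29th percentile
```

Under that normality assumption, the high school effect moves a 27th-percentile student only to about the 31st percentile, and the middle school effect only to about the 29th; that's in the same ballpark as the study's "27th to 30th" framing, and it gives a concrete sense of just how small these shifts are.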
- Some randomness isn't full randomness. OK, so the kids are randomly assigned -- but the teachers weren't. The TFAers aren't going up against the entire population of "regular" math teachers; they are matched only with those teachers who:
- Teach the same subject,
- In the same school,
- In schools to which TFAers are assigned.
That is a very, very limited control group from which to draw broad conclusions about the effectiveness of TFA. There is plenty of reason to suspect that many districts do not distribute teachers across their schools equally; and that's not even addressing the distribution between districts.
So the students may be randomly assigned, but the teachers in the control group most certainly are not. That is a big hit to the generalizability of this study. And speaking of those teachers...
- Heterogeneity of the control group. The study tries to account for some differences between the control group teachers, but it is a very limited description. That's not a criticism; it's simply pointing out the limitations of the study. Colleges are reported dichotomously as "selective" or "not selective," as if there isn't a whole world of difference between programs in "not selective" colleges (for example: state schools with scholars as faculty vs. crappy, on-line, for-profit "universities"). Interestingly, TFA teachers were more likely to have degrees in secondary math education than comparison teachers (18.8% vs. 15.9%).
I also found this telling: 70 percent of both the TFAers and the comparison teachers had 20 or fewer days of student teaching in math. That says as much, to me, about the quality of training of the comparison teachers as it does about the TFA group.
Again, I just don't have the time to look at this deeply (besides, there are people who vet these things really well, like NEPC -- I'll wait for their update). But even a cursory look says that the policy implications of this study are limited.
Were I a principal at one of these schools, would I want to give TFAers preference in making my hiring decisions? Would I want the power to be able to fire my veterans and bring in TFAers? Based on the small effect size, I'd be far more concerned about how changing personnel policies to give TFA preference would affect the qualifications of the rest of my hiring pool and the morale of my current staff.
So maybe there's another question worth asking: is staffing the schools that TFA serves with inexperienced, barely trained neophytes who provide only marginally more value than current staff really the best we can do for the deserving students who attend these schools?
In any case, my advice to those who are leaping to praise TFA on the basis of this study is to calm down. TFA is not working miracles, it's not a viable long-term solution, and its role in staffing charter schools is causing some very serious issues for urban school districts. Further, it's morphing into a political organization, and its role in big money urban renewal deals can't possibly be justified by a study that is so lacking in generalizability.
TFA is very good at tooting its own horn. But this is a little tune with a limited range; they're going to have to do better to justify the outsized place they've taken in the education "reform" debate.
ADDING: JVH is on the case:
And so on. My fav part?