You can bet that whenever there is a slow news week, an article will turn up in the New York Times on some new educational “breakthrough” that will finally help all children learn math. This week’s entry into the “miraculous math” derby is… drumroll please! A Better Way to Teach Math by David Bornstein. It always helps to look at the credentials of the author of an article like this: Bornstein is one of the Times’ third-string writers; he covers everything from health care to bullying to how you can turn your rice husks into electrical power. Here he wears the hat of an expert on mathematics education.
Anybody who has seen the materials published by JumpMath will find nothing unfamiliar: it’s the same tired old stuff, repackaged into teeny-tiny units that are easily taught by teachers, and quickly and forgettably digested by curious children. You can download the fraction materials and see that there is nothing magical about this method: it uses the same models we’ve all seen before, including the ubiquitous pie divided into equal pieces. It re-hashes the same old rule-based procedural drill-and-kill, and carries on the passive math tradition that has not led to widespread numeracy since it was introduced a century ago. So why is it “working”?
There are several explanations for the so-called “improvements” that children make using JumpMath, all of which come back to the old adage that correlation does not imply causation. The first is that any program with a reputation that precedes it benefits from something called the “Hawthorne Effect”: the program worked because the teachers were told it would work, and the children were no doubt aware that they were doing something that was supposed to be “different,” so they simply worked harder. In essence, it wasn’t the program itself that caused the jump in scores; it was the fact that the class was using any new program at all. A one-year jump in test scores is not a “trend” by any means, just as a one-day jump in the stock market is not a rally.
Second, this program requires teachers to undergo quite a bit of training to implement it. They have to read the materials carefully, and no doubt sit through hours of outside training to deliver it properly. If these teachers are getting ongoing support to make the program work, then its success is more than likely due to that support, not to the materials themselves. Again, any program whose implementation includes extensive teacher training is bound to improve scores in the short run.
Finally, the study does not take into account any confounding variables. Perhaps this program requires more instructional time in school? Maybe the homework assignments take longer than the old program’s? If a teacher who spent 2 hours per week on math suddenly spent twice that much time on instruction, we should expect to see scores rise. The same is true of homework: perhaps this program demands more of it, so that instead of doing 20 minutes each night, students are now doing 40 minutes or more. Thus, it wasn’t the methodology of the program or the uniqueness of the materials; it was merely that students spent more time on instruction and practice.
Now, there is no doubt that some of the things mentioned in the article are true: math anxiety is a real issue that holds children back from high levels of achievement, and practice definitely boosts performance. Having teachers who are more confident in their own mathematical ability will certainly help the students with whom they work. But is this curriculum in and of itself a miracle? Sadly, no, because those of us who have been involved in math education know the simple truth: there is no such thing as a “miracle curriculum.” Teachers teach students, not curricula; a great teacher can easily overcome a lousy curriculum, and a lousy teacher can easily foul up a brilliant one.
So let’s just get a grip. Oh, and let’s see how many days it will be before the Times runs yet another “math miracle” article.