

September 19, 2015

Another wrong (about randomisation) educationalist


Jo Boaler, author, Stanford professor (click here), and founder of the educational website YouCubed (click here), is visiting the UK to persuade schools to take up her maths teaching ideas (click here). She objects to rote learning of times tables. According to the Times Educational Supplement (click here), she has said:

“Governments saying everybody has to memorise their times tables to 12 times 12 is absolutely disastrous.”

Blimey!  But she has her reasons. She believes forcing weak children to learn tables makes them anxious about maths in general.

“What we know now is that when you give things to kids like a timed multiplication test, about a third of them develop anxiety. For those kids the working memory which holds maths facts is blocked and they can’t access it.”

“Some kids aren’t fast memorisers, and they decide from an early age that they can’t do maths because of the timed maths tests.”

Even bright kids are harmed:

“Other kids may be OK but see maths as a shallow subject which is about recall of facts and disengage. So [tables cause] huge damage”.

It all sounds plausible, her webpages cite whole libraries of academic papers, and she is obviously a charismatic educationalist. Judging by her Twitter feed @joboaler, many teachers adore her.

But others argue that learning tables is a vital early step in getting comfortable with mathematics. Click here for one.  They also have theories and academic papers in support.

The research cited by each side is impenetrable to anyone not already committed to the argument its author is advocating, and I’m certainly not qualified to judge it.

But I am qualified to say that in an area like maths teaching, where factors like innate ability, teacher enthusiasm and parental engagement almost certainly influence results, the only reliable way to judge who is right is a randomised controlled trial. But Jo Boaler cites none, and Google can’t find any.

I am also qualified to state that Jo Boaler doesn’t understand the limitations of observational data and the need for randomised trials in education. In a paper (click here) critiquing the US National Mathematics Advisory Panel, which had advocated teaching tables in 2008, she wrote:

“When comparing teaching approaches to consider which is more effective, random or equal assignment may be thought of as presenting a research ideal. If students are assigned to random or equal groups and given different treatments, and one treatment results in better outcomes, then researchers have a strong case for making causal statements. Experiments such as these have emanated from medical research, and they lend themselves to the controlled conditions of laboratories. However, when researching learning in complicated places such as schools, such models become highly impractical and, some would say, implausible.”

“Researchers in mathematics education do not need to assign students to groups in quasi-experimental studies, taking control of their education, as they can employ statistical methods to control for differences in student characteristics. Using logistic regression analysis, for example, researchers can control for factors such as prior mathematics achievement, gender, and socioeconomic status. It could be argued that researchers cannot control for every variable that may affect a student in a population, but they can control for all those known to be reasonable […]”

That’s wrong, Jo Boaler. Other educationalists have expressed similar sentiments, and they are wrong too. Not just a bit wrong, but absolutely 100% wrong. The exact opposite of correct. Score gamma triple minus in the “education intervention evaluation exam”. Go to the back of the class, Jo Boaler.

It is the very complexity of education, the many unknown factors that influence outcomes, that justifies randomisation.  “Prior mathematics achievement, gender, and socioeconomic status” aren’t the problem. Jo Boaler is right about that; we can measure them and, at least in principle, control for them using logistic regression analysis.  But, by definition, no amount of fancy statistics can ever control for unknown factors; they are unknown factors. The only way to have any assurance that they are more or less equal between the two groups under study is to allocate the students at random.
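
To make the problem concrete, here is a minimal simulation sketch. Nothing in it comes from Boaler’s studies: the scenario, the “parental engagement” variable and every number are invented, and ordinary least squares stands in for logistic regression purely to keep the code short. A hidden factor raises test scores and also makes children more likely to receive the new teaching method when families choose for themselves; adjusting for the measured covariate leaves the bias in place, while random allocation removes it.

    # Toy simulation, not real data: a teaching method with zero true effect
    # looks effective in an observational study because an unmeasured factor
    # ("parental engagement") drives both uptake of the method and test scores.
    # Adjusting for the measured covariate (prior attainment) does not fix this;
    # random allocation does.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 10_000
    true_effect = 0.0                   # the new method genuinely does nothing

    prior = rng.normal(size=n)          # measured: prior attainment
    engagement = rng.normal(size=n)     # unmeasured confounder

    def estimated_effect(treated):
        """Regress scores on treatment plus the measured covariate only."""
        score = true_effect * treated + prior + engagement + rng.normal(size=n)
        X = np.column_stack([np.ones(n), treated, prior])
        beta, *_ = np.linalg.lstsq(X, score, rcond=None)
        return beta[1]                  # coefficient on 'treated'

    # Observational study: engaged parents seek out the new method.
    observational = (engagement + rng.normal(size=n) > 0).astype(float)
    # Randomised trial: a coin toss decides who gets it.
    randomised = rng.integers(0, 2, size=n).astype(float)

    print("apparent effect, observational:", round(estimated_effect(observational), 2))
    print("apparent effect, randomised:   ", round(estimated_effect(randomised), 2))
    # The first estimate comes out well above zero despite a true effect of zero;
    # the second sits close to zero.

The particular numbers are made up; the point is that the bias comes from a variable the analyst never measured, so no amount of statistical adjustment could have removed it, whereas a coin toss balances it automatically.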

This misunderstanding about randomisation causes trouble in two ways. If Jo Boaler is wrong, her ideas are condemning thousands, maybe millions, of children never to learn their tables, and to a lifetime of innumeracy.

But what if she is right? In that case her failure to test her ideas in well-conducted randomised trials allows governments all over the world to go on forcing children to learn their tables by rote, condemning even more of them to a lifetime of fear of mathematics.

What a pity no-one taught Jo Boaler how to evaluate educational interventions properly.

Jim Thornton

One Comment
  1. September 20, 2015 3:20 am

    Perhaps Boaler’s delusion about easily controlling for confounding factors in educational studies is what led her to select quite nearly the worst-performing schools in all of the UK for her famous/notorious “Phoenix Park” experiment, upon which much of her early work is based, and, again, the very bottom of the barrel among schools in California for her equally notorious “Railside” experiment, upon which the bulk of the remainder of her work depends. If you can control for any and all variables, then … why NOT pick complete basket cases as control and treatment groups for your studies? This is like picking kids from the cancer ward to study the dietary effects of limiting carbs for middle-aged men … don’t worry, we can always use logistic regression to control for the effects of cancer and age … Can we? And … are these the only factors?

    This particular blind spot seems peculiar to a certain type of educational thinker: Kamii and Dominick’s poorly designed but unfortunately influential study, purporting to show that teaching algorithms to children in the early years harms their understanding of arithmetic, drew its test groups from a certain spectacularly low-performing elementary school.

    A related cautionary note needs to be sounded when it comes to interactions between factors. Not long ago the OECD published a summary of its correlation studies on school autonomy and accountability (by the measures defined in that study). It found that increased autonomy, by itself, correlates with a weak negative effect on student achievement. Increased accountability, on the other hand, correlates with a weak positive effect. Surprisingly, however, increases in BOTH variables simultaneously correlate with a STRONG positive effect. I have not seen any commonly used regression analysis that is calibrated to pick up on such multi-variable effects, in which one variable reverses the direction of action of another (a toy illustration follows this comment).

    Another problem with conflating variables is Liebig’s Law of the Minimum. One may have a marvellously powerful factor to test, but if the test group is subject to the action of a factor that prevents the effect from manifesting, any results obtained will be useless.

    The following piece of juvenile humour may serve to illustrate:

    A scientist places a healthy bullfrog on the floor of his lab. “Jump, froggy, jump!”, he instructs. The frog jumps. The scientist measures how far and writes on his clipboard “A healthy frog can jump 2 meters.”

    The scientist then cuts off one of the frog’s front legs (c’mon, you heard me say “juvenile”. You should have seen that coming!), places the frog on the floor, “Jump froggy, jump!”. And the frog does. The scientist writes, “A frog with only 3 legs can jump 1 meter.”

    The scientist cuts off another leg (I’m guessing you saw it coming this time). “Jump, froggy, jump!” He writes, “A frog with only 2 legs can jump 0.4 meters.”

    Another leg. “Jump, froggy, jump!” … “A frog with only 1 leg can jump 0.15 meters.”

    Finally, the last leg. “Jump, froggy, jump!” The limbless amphibian just lies there on the floor. The scientist writes on his clipboard, “A frog with no legs is deaf.”
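
A toy simulation of the sign-reversal pattern the comment above attributes to the OECD findings (every variable name and coefficient here is invented, not the OECD’s data): a regression fitted with main effects only, the kind most routinely run, averages the reversal away, and it only appears if the analyst already knows to include an autonomy-times-accountability term.

    # Invented numbers only: autonomy alone hurts slightly, accountability alone
    # helps slightly, and the two together help a lot. A main-effects-only
    # regression hides the reversal; adding the interaction term reveals it.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 20_000
    autonomy = rng.integers(0, 2, size=n).astype(float)
    accountability = rng.integers(0, 2, size=n).astype(float)

    # Assumed "truth": autonomy is -0.3 on its own but +0.4 alongside accountability.
    score = (-0.3 * autonomy + 0.1 * accountability
             + 0.7 * autonomy * accountability + rng.normal(size=n))

    def fitted(X):
        beta, *_ = np.linalg.lstsq(X, score, rcond=None)
        return np.round(beta, 2)

    ones = np.ones(n)
    main_only = np.column_stack([ones, autonomy, accountability])
    with_interaction = np.column_stack(
        [ones, autonomy, accountability, autonomy * accountability])

    print("main effects only:", fitted(main_only))         # autonomy looks near zero
    print("with interaction: ", fitted(with_interaction))  # the reversal is visible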
