Helping out @toryeducation

This morning @toryeducation asked me to say “something useful” about a piece of research they sent a link to. I was surprised to find the piece is about physics lectures for 850 undergrads at a Canadian university. My blogging and tweeting are almost exclusively about secondary/FE education – not least because Higher Education policy is not a remit of the Department for Education – so this seemed a bizarre piece to be asked to comment on.

Still, I like education research. I currently spend my days commenting on and synthesizing it, so I cast my eyes over the research; my views are below. I hope that @toryeducation is satisfied, but I remain surprised that they thought this paper ‘not trivial’. Perhaps they do not have a subscription to the full magazine and so had not fully read it. Hopefully the comments below will give a better understanding of the piece.

Improved Learning in a Large-Enrollment Physics Class – Deslauriers, Schelew & Wieman (it’s behind a paywall for most people, sadly; luckily I have a subscription)

“Sight Check” – i.e. What clues do I have for its validity before I start reading

– This paper is about university physics lectures attended by large numbers of students. The likely relevance to secondary education policy is nil.
– It is in Science magazine. Science only peer reviews some articles, not all. This piece was in the ‘report’ and not ‘research’ section. I am not certain it circumvented peer review, but I anticipate this to be the case. Update: Report sections are peer-reviewed. This is good to know. Unfortunately, because reports are only 2,000 words, there is still a lot of information not available to the reader.
– There are two ‘associated letters’ highlighted in the sidebar of the full article (or below the article in the abstract-only version in the link above). When clicking on these, one is shown to be a letter from a Professor of Education at the University of Birmingham (UK), and the second is a multi-author letter from Biology departments at several US universities. Both highlight significant design flaws: high attrition rates, treating the experiment as a randomized control trial when it was a quasi-experiment, the use of a single teacher in one condition but two teachers in the other, and an absence of validity and reliability checks. While one can always pick holes in scientific research, having so many problems makes me sceptical as I go into reading this article. The authors will need to do a really good job of explaining these issues (spoiler: they don’t).
– Finally, the report is only two pages long. To accept research as useful for policy I would usually want more detail than a two-page report can give.

The Actual Report

Putting aside my concerns for now, let’s see what the paper says:

– “In our studies of two full sessions of an advanced quantum mechanics class taught either by traditional or by interactive learning style, students in the interactive section showed improved learning, but both sections, interactive and traditional, showed similar retention of learning 6 to 18 months later (10).” < So the argument is that interactive learning produces a greater ‘immediate’ improvement, but traditional lecturing and interactive learning show similar levels of retention later on.
– “The control group was lectured by a motivated faculty member with high student evaluations and many years of experience teaching this course. The experimental group was taught by a postdoctoral fellow using instruction based on research on learning” < How do we know that any variation isn’t down to the fact that two different people delivered the groups? Given they are quite different – in level of responsibility, experience, possibly age – there are confounding variables here that are not adequately dealt with in the later data analysis.
– In the experimental group, the students were “making and testing predictions and arguments about the relevant topics, solving problems, and critiquing their own reasoning and that of others”… “as the students work through these tasks, they receive feedback from fellow students” < A lot of this sounds like the assessment for learning already embedded in most schools throughout the 2000s, so there is no relevance to education policy except carrying on!
– “We incorporate multiple “best instructional practices,” but we believe the educational benefit does not come primarily from any particular practice but rather from the integration into the overall deliberate practice framework” < This is a bit worrying: it sounds like the instructor just did whatever they thought was best practice. Without more detail there is nothing here that could be repeated in education policy in the UK (HE or otherwise).
– (!) Something useful < I had suspected there would be some use of clickers in this article, and they are there. Clicker use is common in the US and is something I think could be used fruitfully in the UK, although we have a horrible history of ICT implementation. Hence the only ‘useful’ thing for the UK so far in this piece is that clickers are something individual schools might wish to invest in (particularly for 16–19 year-olds).
– Figure 1 shows the test was out of 12 < The validity of checking knowledge with only 12 response items is likely to be very low.
– “The result is likely to generalize to a variety of postsecondary courses” < The authors’ final statement confirms this is a postsecondary piece. I therefore maintain my claim that this research is largely irrelevant to UK education policy.
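The point about the 12-item test can be quantified with a quick back-of-the-envelope calculation. This is a sketch of my own, not from the paper: the 75% ‘true score’ is an illustrative assumption, and I model each item as an independent pass/fail trial, which is a simplification.

```python
# Sketch (my own, not from the paper): how coarse a 12-item test is.
import math

ITEMS = 12  # the test in Figure 1 is marked out of 12


def score_granularity(items: int) -> float:
    """Smallest possible change in a student's percentage score."""
    return 100.0 / items


def binomial_se(p: float, items: int) -> float:
    """Standard error (in percentage points) of one student's observed
    score, treating each item as an independent Bernoulli trial with
    true success probability p (an assumed, simplified model)."""
    return 100.0 * math.sqrt(p * (1 - p) / items)


# A student whose 'true' ability is 75% (illustrative assumption):
print(round(score_granularity(ITEMS), 1))  # each item moves the score ~8.3 points
print(round(binomial_se(0.75, ITEMS), 1))  # SE of an individual score: 12.5 points
```

On this simple model, a single question is worth more than eight percentage points and the noise on an individual score is of the same order as the group differences typically reported, which is why a 12-item instrument leaves so little room for confidence.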

 What can I say that is “useful”?

  1. The authors do not think this is generalisable to teaching in schools. There is no evidence to suggest that it is generalisable to UK school policy. I therefore reassert that it is irrelevant.
  2. There is no ‘method’ tested here shown to be ‘better’ for learning; however, it would appear that (a) the use of clickers and (b) more ‘working together’ on physics problems influence short-term learning as measured by a 12-mark multiple-choice test.
  3. The research design is not well explained even within the 2,000-word limit, the experimental condition is vague and unrepeatable, and there are no published validity or reliability checks. Without further information I would not use the piece to inform UK HE policy either.
