At the turn of the 20th century, if a person lay dying in bed, few options were open to the doctor charged with saving their life. Though stethoscopes captured some information about the heart and the internal organs, that information was extremely limited. The introduction of the electrocardiograph (ECG) in 1904 meant the heart could suddenly be monitored easily and in depth. Doctors could continually record the rhythm patterns behind arrests or murmurs and, in doing so, gained a clearer picture of the interventions required for different types of heart stoppage. A myocardial infarction (no blood getting to the heart) requires a very different pattern of treatment from cardiac tamponade (fluid pressing on the heart), but where doctors previously intervened based only on what they could see and hear, their interventions were far more hit and miss; and a miss, for a cardiac arrest patient, usually means death.
Subsequent technologies are equally remarkable in what they have enabled doctors to understand. Before the introduction of obstetric ultrasound in 1965, no one saw a baby before its birth. The Thalidomide scandal might never have occurred had its commercial use arrived just one decade earlier. MRI scanning now means surgeons routinely operate on the brain, an organ once considered entirely unfathomable.
But why does this matter for education? At present the classroom teacher is akin to the turn-of-the-century doctor. Though teachers can use their senses to see whether students are becoming bored or perplexed – just as doctors could diagnose measles or coughs by sense alone – teachers are largely limited to what can be seen, heard or intuited. Sure, experience develops intuition – like any teacher, I now have a weird knack for knowing a fight will break out ten seconds before it actually does – but I also believe the research suggesting teachers only ever see 5–10% of what happens in their classroom. Other tasks divert our attention: taking registers, visitors at the door, dealing with new students, closing windows, fiddling with the computer. Attention can rarely be fully attuned to a class, and when it is, our senses can only register the expressions of 30 children for short periods before we are once again drawn back to the particular needs of the most demanding child.
Having more information about our students – how they learn and where they are struggling – would help us diagnose misconceptions and intervene in a timely way. At school, the Rosetta Stone computer programme is used with students arriving from abroad with little English. The programme gathers information about their progress and adjusts tasks so they sit just above a student's current ability. EAL mentors can enforce a specified percentage of correct answers before students move to the next task, and by using the detailed dashboard of progress in various skills (reading, speaking, spelling, etc.) we can give personalised 1:1 follow-up or small-group lessons focused on each student's greatest areas of need. Rosetta Stone acts as the 'eyes of the whole classroom' while we focus on individuals as required.
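The mastery-gate idea above – only advance a student once their score clears a mentor-set threshold, and target follow-up at their weakest skills – can be sketched in a few lines. This is a purely illustrative sketch, not Rosetta Stone's actual logic; all function names and the 80% default threshold are my own assumptions.

```python
def ready_to_advance(scores, threshold=80.0):
    """True once the most recent attempt meets the required % correct.

    `scores` is a list of percentage scores on the current task;
    `threshold` is the mentor-set gate (hypothetical default of 80%).
    """
    return bool(scores) and scores[-1] >= threshold


def weakest_skills(dashboard, n=2):
    """Pick the n lowest-scoring skills for 1:1 or small-group follow-up.

    `dashboard` maps skill name -> % score, as in the dashboard of
    reading, speaking, spelling, etc. described above.
    """
    return sorted(dashboard, key=dashboard.get)[:n]
```

A mentor could then call `ready_to_advance([60, 85])` to see whether a student has earned the next task, and `weakest_skills({"reading": 90, "speaking": 40, "spelling": 70})` to plan that week's small-group session.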
Beyond the use of 'computer games' in school, there is also potential in analysing the data trails learners leave of their past behaviours. This isn't advocating snooping on Facebook as a new form of pedagogy; it's about thinking through what technology can do to enhance teaching. Let's take a much more mundane example. Imagine: every student in a class has a multiple-choice pad – on an iPad, mobile phone, clicker, whatever! – and at certain points throughout the lesson students answer multiple-choice questions using it. Over the lesson the teacher can see whether the percentage of correct answers is going up (a good sign!), or whether patterns in understanding are emerging (Table 1 is acing the test; Table 3, not so much), or – and this is my favourite – once a question is answered, the technology could show a tick or cross, enabling the teacher to re-arrange students so that those with correct answers discuss their reasoning with the 'incorrect' students, and then run a second task to see if things have now sunk in.
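The clicker workflow above boils down to three small computations: overall percentage correct, percentage correct per table, and pairing each 'incorrect' student with a 'correct' one for peer discussion. Here is a minimal sketch of those three steps – all names (`Response`, `pair_for_peer_discussion`, etc.) are hypothetical, not any real product's API.

```python
from dataclasses import dataclass


@dataclass
class Response:
    """One student's answer to one multiple-choice question."""
    student: str
    table: int
    correct: bool


def percent_correct(responses):
    """Overall % correct for one question round ('is it going up?')."""
    if not responses:
        return 0.0
    return 100.0 * sum(r.correct for r in responses) / len(responses)


def percent_correct_by_table(responses):
    """% correct per table, to spot 'Table 1 is acing it; Table 3 is not'."""
    totals = {}
    for r in responses:
        got, n = totals.get(r.table, (0, 0))
        totals[r.table] = (got + r.correct, n + 1)
    return {table: 100.0 * got / n for table, (got, n) in totals.items()}


def pair_for_peer_discussion(responses):
    """Pair each 'incorrect' student with a 'correct' one for the re-test.

    Leftover students (when the groups are unequal) are simply dropped by
    zip here; a real classroom version would fold them into trios.
    """
    correct = [r.student for r in responses if r.correct]
    incorrect = [r.student for r in responses if not r.correct]
    return list(zip(correct, incorrect))
```

After the second task, running `percent_correct` again on the new round of responses gives the teacher a quick check on whether the peer discussion sank in.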
This is a tiny example of what becomes possible as technology provides instant, detailed data on what is going on in learners' minds. Below is a slideshare showing far more exciting ideas.
Inevitably I hear the doomsdayers tutting, so let me forestall them a little: I KNOW THIS IS NOT NEW. In my first year of teaching we had weird multiple-choice contraptions purporting to do something similar, and they were rubbish: difficult to use and taking ages to set up. BUT at that time my mobile phone could barely muster a non-mono ringtone and the iPad was something akin to 'futuristic magic'. The technology has to be right, but a clunky first generation is no reason to abandon the idea as we move forward.
When digital technologies entered medicine, many practitioners yelped, fearing they would undermine natural instinct and professional skill. When I look at the slideshare I can see so much potential, but it also makes me anxious: it's human nature to be sceptical of change. Yet any feeling I have that 'intuition is enough' disappears when I remember that doctors at the turn of the 20th century were probably very good at what they did, and managed a great deal without machines – but heck, am I grateful those machines are there now. Medical technology helps doctors make better-informed decisions. The machines do not make the decision; occasionally doctors use professional judgement and intuition to override the data, but having accurate information doesn't half help point action in the right direction. Our fear of being shown that experience alone is inadequate is not a good enough reason to throw away the possible future offered by Learning Analytics.