There's a growing recognition in the UK that teaching needs to be a research-literate profession. When teachers use evidence from education research, it helps them innovate and overcome barriers to progression and attainment.
At the same time, neuroscientists have raised concerns over the spread of ‘neuromyths’ – misconceptions about the mind and brain that are often used to justify ineffective approaches to teaching.
To address this, we partnered with the Education Endowment Foundation (EEF) on an Education and Neuroscience funding programme. We wanted educators and neuroscientists to work together to develop evidence-based classroom interventions, or to test existing tools and programmes that could then be scaled up.
The programme ran between 2014 and 2019, and we supported six projects. The projects have been externally evaluated to look at their impact (impact evaluation), implementation and feasibility (process evaluation).
One of our key lessons was the importance of involving teachers in all phases of an educational neuroscience intervention. Only one of the six projects did this, and it was the most successful.
Spaced Learning was co-designed and delivered by teachers working for the Hallam Teaching School Alliance (TSA). It aimed to improve GCSE outcomes by applying the principle of spaced learning: information is more easily learnt when it is repeated on multiple occasions, with time passing between the repetitions.
The project involved a small randomised controlled trial (RCT) led by the project's evaluator, the Centre for Evidence and Social Innovation (CESI) at Queen's University Belfast (QUB). The project tested different approaches to delivering spaced learning in science lessons. It provided some evidence that the most promising approach to integrating spaced learning was to use both 10-minute and 24-hour spaces between teaching science content.
Both teachers and pupils enjoyed and engaged with the programme. Because the programme was co-designed by teachers, it fitted into teachers’ normal practice and didn't interfere with their teaching. Most teachers delivered the intervention as prescribed and didn’t need support beyond the initial training.
Since the project finished, Hallam TSA and QUB CESI have continued working together on the programme, now called SMART Spaces. The programme is currently being trialled at a bigger scale, with over 14,000 pupils participating.
The other five projects were developed and delivered by teams of academics. Of these, one showed positive results.
Stop and Think: Learning Counterintuitive Concepts developed a computer-based learning activity that used methods to improve pupils' ability to reason about counterintuitive concepts. Negative numbers are one example: children might mistakenly think that -5 is larger than -1. Year 3 (aged 7-8) and Year 5 pupils (aged 9-10) were trained to inhibit their initial response and give a slower, more reflective answer.
Pupils who participated in the programme made the equivalent of +1 additional month's progress in maths and +2 additional months' progress in science, on average, compared to children in the lessons-as-usual control group. It should be noted that the maths result was not statistically significant.
But although teachers mostly stayed true to the intervention design, they did report problems. These included issues with the software, low-quality animation, some content being too easy and repetitive (leading to low pupil engagement), and difficulty fitting the intervention into a busy timetable. For these reasons, the majority of teachers did not endorse rolling out the intervention to other schools.
So even though the project was successful, in the sense that it was implemented with fidelity and showed positive outcomes, closer collaboration with teachers is needed to make sure an intervention is feasible and endorsed by them before it is rolled out to other schools.
The EEF are now working with the Stop and Think project team to make changes based on teacher feedback and potentially test the approach in more schools.
The other projects – Fit to Study, Teensleep and Sci-napse: Engaging the Brain’s Reward System – all faced issues during implementation. And GraphoGame Rime showed no measurable effects when compared to business as usual. This was a valuable finding, because it shows that schools should be cautious about claims made for this particular intervention and should not expect to see large effects.
A key lesson from the history of RCTs is to embrace zero or negative findings in the same way we embrace positive ones.
Overall, the results of the Education and Neuroscience funding programme highlight the need for genuine research-practice partnerships, where teachers can provide a reality-check about their classrooms when interventions are designed. Prescriptive interventions designed by researchers alone run the risk of facing implementation issues, high student attrition rates and lack of teacher support for further roll-out.
As with all things, however, there is a balance to be struck. If interventions are not sufficiently different from usual practice, they may not make a difference to student outcomes. The key is to ensure that the programme is feasible for teachers to deliver and that there is enough training and support to enable them to adapt as necessary.
Our example from Spaced Learning illustrates what is possible when teachers and researchers work together to co-design an intervention. An educational idea or intervention may be great in principle, but as a recent EEF guidance report suggests, 'what really matters is how it manifests itself in the day-to-day work of people in schools.'