This week saw the launch of the RSA and EEF’s new ‘Learning About Culture’ programme: a two-and-a-half-year investigation into the role that cultural learning plays in improving outcomes for children. Its intention is to develop more evidence of what works and to support schools and cultural organisations in using that evidence to improve their own practice. It will engage 400 schools across the country in five projects using music, drama and illustration. The launch presents an opportunity to pause and reflect on cultural education in terms of epistemology (the philosophy of knowledge, enquiry and explanation) and ontology (how we as individuals believe the world to be).
In the arts, context is everything
The arts by their very nature are contextual and contingent, with meanings and values that are not universal and absolute but constructed by each individual. For me, this understanding of the world means there isn’t one universal truth about the arts out there waiting to be discovered; instead, the truth is different for each person depending on their context. As a result, a methodology for researching the impact of the arts would have to be qualitative, constructivist and inductive. Its purpose would be to see what’s out there and generate a hypothesis, rather than to test one.
What evidence exists?
This isn’t enough for policy makers who call for more evidence. Those of us engaged in the Creative Partnerships programme will recall the rigour of an evaluation methodology that drew upon ethnographic research tools (e.g. the in-depth interview) and that ensured findings were valid by triangulating the viewpoints of children, teachers and artists. Countless reports from this programme demonstrate the overwhelming power of art and culture to effect positive outcomes for young people. Since each project is different - with different schools, art forms, children, artists and teachers - those findings aren’t expected to generalise to the whole population, and for policy makers herein lies the problem.
Is the evidence robust?
Durham University’s 2015 review of the impact of arts education concluded that ‘though there are promising leads… there just isn’t enough robust evidence to be able to demonstrate a causal link between arts education and academic attainment’. For the authors, robust evidence has to compare an intervention group with a control group, and use randomised and sizeable samples to give greater confidence that findings can be generalised.
In response, the Cultural Learning Alliance heroically set about marshalling the existing evidence to counter these conclusions, in a report that drew only upon studies with large sample sizes. The findings indicated a clear correlation between cultural education and academic attainment.
Evidence based practice
The RSA and EEF’s new programme looks to fill this perceived evidence ‘gap’ with a research design that uses large sample sizes and control groups: the randomised controlled trial (RCT). Teachers will then draw on the findings to make better decisions about how to prioritise cultural education and how to design effective programmes. The evidence an RCT provides should give teachers confidence that an intervention found effective elsewhere is likely to be equally successful in their own context.
The problem with randomised controlled trials
RCTs are commonly used in medical research and are well suited to that field of study, where cause and effect are more easily determined. Writing in the Guardian, Marc Smith argued, however, that ‘Education outcomes, on the other hand, are not always as clear-cut and we are not always sure of what needs to be measured (or, indeed, what should be measured).’ He points to the plethora of ‘confounding variables’ in educational contexts.
Advocates of the RCT suggest the approach is robust because it is credible and scientific, but the contextual and contingent nature of the arts and education means ‘to suggest that RCTs are free of the issues that have undermined other types of research is to keep heads well and truly buried in the sand’ (ibid).
Using mixed methods
In its defence, it should be said that the design of the RSA and EEF’s research project does use mixed methods. The RSA, for example, will conduct qualitative deep-dive studies to explore what makes some schools successful in delivering an arts-rich curriculum. A toolkit will be produced to help educators use the evidence to design their own cultural education programmes.
In addition, the University of Manchester has designed an intriguing system called SPECTRUM, which will allow the programme to evaluate the impact of the arts on non-cognitive or ‘soft’ skills (and this is arguably the most promising vein of enquiry to tap). There is much to be excited about, not least that 9,000 children will engage in rich cultural education programmes over the next two years.
Tracing cause and effect
Above all, I want to see that the campaign for cultural education is not harmed by an emphasis on randomised controlled trials as the only route to robust evidence. As Smith concludes: ‘It is not… simply a matter of one intervention for one group, one for another and then measure the outcomes. Learners are not patients and their outcomes cannot always be measured in such a straightforward way.’