A very useful research report has recently been published by the Department for Education https://www.education.gov.uk/publications/RSG/AllPublications/Page1/DFE-RR157.
Researchers from the universities of Warwick and East Anglia looked in depth at the recommendations contained in 20 Serious Case Reviews. They found that these reviews contained more than 900 recommendations, an average of 47 (yes, 47) per review! And most recommendations were concerned with issues of procedures and training. The researchers also found that there was a "proliferation of tasks to be followed through" resulting from "breaking down recommendations into achievable actions".
The researchers conclude:
"Local Safeguarding Children Boards need to take responsibility for curbing this self-perpetuating cycle of a proliferation of recommendations and tasks and allow themselves to consider other ways of learning from serious case reviews. Recommendations may not be the best way to learn from these cases." (page 2)These are very welcome words.
Back in February I criticised a Serious Case Review for making what seemed to me to be silly recommendations http://chrismillsblog.blogspot.com/2011/02/tragedy-of-alex-sutherland.html . Hospital staff in Manchester had not recognised that an alcoholic woman they were treating had the care of a child. So the SCR recommended that all patients attending hospitals in the area should be asked routinely about the dependants they are responsible for.
I call this 'knee-jerk proceduralism' - if something wasn't done in a case that went badly wrong, then make it a procedural requirement that it is always done in all new cases. The logic is simple but it is seriously flawed. And it doesn't take much imagination to see where this approach takes us. As time goes by there are more and more things which have to be done to comply with the procedures, which leaves less and less time to provide the service.
The key to learning in child protection - as in other safety critical spheres of activity - is analysis and understanding of what goes wrong and why. Sheila Fish, Eileen Munro and Sue Bairstow are wholly on the right lines in recommending what they call a 'systems approach' to undertaking SCRs - http://www.scie.org.uk/publications/reports/report19.asp . In particular they argue that the report should always go beyond the basic facts of a case to try to understand the differing views that different workers had at the time, with the aim of identifying "... underlying patterns of factors in the work environment that support good practice or create unsafe conditions in which poor practice is more likely".
But I remain sceptical that the findings of individual SCRs are likely to be sufficient in themselves to produce a critical understanding. Although case material is an essential building block of an analytical approach to safety, it is only through the aggregation of a number of cases that a full understanding arises. What we need are SCRs which produce findings in ways that can be more easily aggregated at a national level, so that over time a robust understanding of the causes of error emerges.
And meanwhile let's get away from mountains of recommendations, which are more likely to obscure safe practice than to inform it.