It was a shocking example of human error. Naturally people were shaken. Naturally they asked how such a thing could have happened. Naturally they expected the authorities to take action to prevent a recurrence.
I remember the accident very well but I do not remember there being calls for a national debate on ways to improve air safety. I do not remember calls for a government-appointed investigation into human errors committed by pilots. I do not remember local MPs putting down an Early Day Motion in the House of Commons to debate the issue. People seemed content to leave matters to the Air Accident Investigation Branch and to the Civil Aviation Authority.
Move forward twenty-four years and a few miles to the west, to the city of Coventry. It is another time, another place and another tragedy. A small Polish boy – Daniel Pelka – dies of neglect that has gone unrecognised by his teachers and health and social-care professionals. It is another shocking example of human error. Naturally people are shaken. Naturally they ask how such a thing could have happened. Naturally they expect the authorities to take action to prevent a recurrence.
But the difference with this tragedy is that there are calls for “… a national debate into how to deliver a more joined-up approach between child protection agencies to prevent child deaths”. There are demands for a government-appointed investigation into “systematic failings” and ways of improving the system of information-sharing between agencies and social care practices. Local MPs have put down an Early Day Motion in the House of Commons to debate the issue. (http://www.coventrytelegraph.net/news/coventry-news/coventry-mps-call-national-debate-6324038)
Not only that, but a campaign group has gathered tens of thousands of signatures on a petition calling for a significant change in the law – one that could see teachers and health and welfare professionals sent to gaol for failing to report child abuse and neglect. (http://www.bbc.co.uk/news/uk-england-24099797)
The difference between the two events – Kegworth and Daniel - continues to puzzle me. When it comes to air crashes we seem to be prepared to leave safety improvements to the experts. When the tragedy concerns child protection many of us have our own clear opinions about what went wrong and how it could be put right.
Perhaps it is because aeroplanes are technical sorts of things that most of us cannot even imagine controlling? We can’t say how we could have done it differently because we can’t understand how it was done in the first place.
Or perhaps it’s because we are all experts to some extent in child care – as parents or children, siblings, neighbours or friends – and so we all have experience of it one way or another?
Or perhaps it is because the media are more inclined to stoke up blame campaigns when it comes to the likes of social workers and teachers than they are when it comes to airline pilots? Who knows?
Or perhaps it’s just because we have allowed child protection to become a political football, while air safety never has.
The truth is that there really is very little difference between the two tragedies. In both cases professionals were trying to do their best in difficult circumstances. In both cases they misjudged the situation and so took decisions that turned out to be mistaken. In both cases communications were imperfect and misunderstandings arose. In both cases people were reluctant to suggest that a mistake may have been made – at Kegworth passengers and cabin crew could see smoke and flames coming from the left engine, but said nothing when the captain announced he had shut down the right engine.
I don’t think that we should have a ‘national debate’ about how to change systems and laws following the death of Daniel Pelka. We had exactly that kind of debate about what to do after Victoria Climbié died and it resulted in lots of strange laws and initiatives that ten years later seem to have been neither relevant nor effective nor wise. It is oh-so-easy to get out the old cigarette packet and start listing ‘reforms’ on the back – yes we’ll introduce this law or that, bring in some new regulations, create some new organisations and rename everything in sight. It might satisfy some politicians and spin-doctors but it won’t keep any child safer.
The reason that such grand designs are useless is that we – the great British public and our political leaders – know only broadly what happened to Daniel, and we do not know why it happened. If we start our reforms from that point it is almost certain that we will reform the wrong thing, tackle the wrong problem and propose the wrong solutions.
In recent months I’ve heard lots of people complain about the use of the phrase ‘lessons will be learned’. I have some sympathy with them because to be frank I do not know myself what these lessons look like – and I have yet to meet someone who does. It is no good just saying that professionals mustn’t commit human error or threatening them with imprisonment or other punishment if they do. We need to get behind the human error to understand the root causes.
Following Kegworth and similar accidents, aviation professionals began to think much more clearly about human error. Because human error is inevitable, they realised that they had to learn on a daily basis about how errors are made and how they are best avoided or mitigated. They realised that an essential part of doing their jobs was understanding their own capacity to get it wrong.
Accordingly they developed training and practices influenced by the psychology of human error, now referred to as human factors. Today this kind of training is mandatory. Pilots, cabin crew and ground staff all regularly undertake courses in human factors. In their day-to-day work they practise skills and procedures that help them reduce error, such as debriefing after a significant piece of work to understand what went right, what went wrong and why. That helps them to understand what to do to avoid similar mistakes in the future and to get it right more often.
And national systems for collecting data about error have been established. Did you read recently about two pilots who both fell asleep at the same time during a transatlantic flight? How do we know about it? Not because the Daily Mail (whose lurid account I am loath to recommend) investigated it but because one – or probably both – of the pilots reported it.
The correct approach to safety is to create the kind of culture in which every employee is encouraged to report, examine, discuss and understand human error in the workplace. There need to be arrangements – such as the CHIRP near-miss reporting system in aviation – which allow professionals to share their own experiences of getting it wrong so that colleagues can learn before a disaster occurs. Every employee needs to be free to talk openly about her or his errors. That way everybody learns.
Academic studies, public enquiries, Serious Case Reviews, Parliamentary debates and public heart-searching do not result in understanding why things go wrong in child protection. Indeed, given a prevailing culture in which the knee-jerk response is to blame a few bad apples and promise that it will never happen again, there is really very little learning going on at all. Until this situation changes, child protection will never get safer. We will simply lurch from one disaster to the next.
So here is my ‘grand design’. Let’s have no more grand designs, but let’s resolve to create the kind of culture in child protection where real, genuine and sustained learning about how to reduce and mitigate human error takes place in every workplace, in every team and in every staffroom, every day. Let’s create a culture where employees who are prepared to talk openly and frankly about their errors are not seen as targets for discipline or sanction, but as primary sources of learning that can be the basis of much improved and safer services.