Tuesday, 20 November 2012

Organisational and Professional Learning

Michael Gove outlines two failures in child protection in the UK (numbers 7 and 8 in my list) concerning learning. He says:
  • We are not transparent about the mistakes that were made when things go wrong
  • We do not learn properly from what went wrong to improve matters in the future
A failure to learn is fundamental. Organisations which do not learn are doomed to fail. If we cannot learn how to protect abused and neglected children better, then we should despair.

Gove does not tell us what he thinks inhibits learning, but by saying that we are not transparent about mistakes he drops a heavy hint. Lack of transparency is usually due to a surfeit of fear. If people fear that their errors will have dire consequences for them personally they will not confess them, they will try to hide them. Their errors will remain secret. No corrections will be made. Errors will continue to happen. Safety will be compromised. Children will continue to die.

Back in the 1980s the airlines in Europe and the USA realised this. They realised that a jet engine technician who had lost a spanner on the job was a valuable asset if s/he confessed to the error, but a dire liability if s/he didn't. What was better, to threaten erring technicians with severe punishment, and to risk losing airliners, or to deal with most errors non-punitively and promote openness? It was a no-brainer.

Of course, in practice it may not be quite that simple, but the basic principle holds - unjust blame inhibits safety.

Professor Sidney Dekker has written extensively on these issues [1]. His book, entitled Just Culture: balancing safety and accountability, is essential reading for anyone working in a safety-critical industry, like child protection. It deals with how to promote openness while not tolerating everything. That takes a lot of thought, but it is relatively easy to see how it might work out. A 'just culture' is not a 'no-blame culture'. Sabotage, recklessness and seeking personal gain at the expense of organisational and professional goals (such as the best interests of a child) should still be disciplinary matters. But errors which are committed in good faith, no matter how dire the consequences, should not attract punishment.


To achieve transparency the people at the top have to work very hard to promulgate a 'just culture'. It is easy to say that genuine errors will not be punished, but a great deal harder to do when the tabloid press is baying for blood. Politicians, civil servants and senior children's services managers need to work hard at developing a just culture and demonstrating that they embrace it. They need to have in place clear agreements not to wilt at the first sign of pressure. Promises must be kept.


Having a just culture is the start, not the end, of developing a learning culture.  We need to be clear about the nature and scope of learning. 

It is a fundamental mistake to prescribe learning too closely. We can learn to hit targets, achieve performance objectives, implement procedures or pass Ofsted inspections - but none of that matters if we are still not doing the right things to protect children. Real learning is about a deeper understanding of what we are dealing with (child abuse and neglect) and an ever-expanding knowledge of how to deal with it more effectively.
 
Chris Argyris and Donald Schön [2], who were professors at Harvard and MIT respectively, call this second type of learning 'double-loop' learning. They define double-loop learning as follows:
When the error detected and corrected permits the organization to carry on its present policies or achieve its present objectives, then that error-and-correction process is single-loop learning. Single-loop learning is like a thermostat that learns when it is too hot or too cold and turns the heat on or off. The thermostat can perform this task because it can receive information (the temperature of the room) and take corrective action. Double-loop learning occurs when error is detected and corrected in ways that involve the modification of an organization's underlying norms, policies and objectives. [3]
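Argyris and Schön's thermostat analogy can be sketched in code. This is a minimal illustration, not from their text: the class names and the 'occupant complaints' signal are my own inventions. Single-loop learning corrects behaviour against a fixed norm (the setpoint); double-loop learning questions whether the norm itself is right.

```python
class Thermostat:
    """Single-loop learner: detects and corrects error against a fixed norm."""

    def __init__(self, setpoint):
        self.setpoint = setpoint  # the governing norm, taken as given

    def act(self, temperature):
        # Single loop: compare the reading to the norm and correct.
        if temperature < self.setpoint:
            return "heat on"
        return "heat off"


class ReflectiveThermostat(Thermostat):
    """Double-loop learner: also reviews whether the norm itself is right."""

    def review_norm(self, occupant_complaints):
        # Double loop: persistent complaints despite 'correct' behaviour
        # trigger modification of the underlying norm (the setpoint).
        if occupant_complaints.count("too cold") > 2:
            self.setpoint += 1
        elif occupant_complaints.count("too hot") > 2:
            self.setpoint -= 1
```

The single loop keeps the room at the target temperature; only the double loop ever asks whether the target was the right one - which is exactly the distinction being drawn in the quotation above.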
Double-loop learning has a consequence which may be unwelcome to some policy-makers and managers: they have to give up some control of the service to the learning process, which is often mediated, not by those who direct, but by those who do the work. Workplace learning in airline safety, for example, has uncovered commercial pressures, long hours and tiredness, the careless exercise of authority and the arrogance of command as key human factors that increase risk. Some airline executives may not like this type of conclusion but they have been driven to accept it.

In child protection double-loop learning might result in challenging some of the assumptions of how services are currently designed and delivered. Are top-down bureaucracies suitable organisations for the task? To what extent should government, or the courts, seek to influence professional practice? Can services be delivered by over-worked and underpaid individuals? How can children and young people have more say and more control? To what extent should the civil and criminal law play a part? How far can professional boundaries be crossed?

Another fundamental mistake is to assume that there are just a few ways in which learning can come about. Recent discussions in England have been excessively influenced by the identification of the Serious Case Review (SCR) as the main tool of learning. But the SCR is probably not a very effective tool. It is cumbersome, slow and may obscure vital information. The learning from an SCR is hard to circulate and publish, and an SCR takes a long time to prepare. Despite its ubiquitous presence since 1991, the SCR has not prevented the same kinds of mistakes from happening again and again [4].

The primary need is for what I call 'workplace' learning. That is learning that takes place day-in and day-out at the point at which the service is delivered, usually conducted and mediated by front-line staff. The aim should be that tomorrow all those delivering the service are better placed to do so - if only a little bit - than they were today. Experience from manufacturing industry is that the cumulative effects of regular and frequent workplace learning can be dramatic [5]. I am convinced that similar dramatic results can be achieved in child protection, if only we take workplace learning seriously.

There are three tools suitable for workplace learning which seem to me to be relatively easy to understand and fairly straightforward to implement initially, although the trick comes in sustaining them in the longer term. They are:
  • Debriefing
  • Kaizen
  • Critical Incident Reports
Debriefing is the simple idea that every significant piece of work concludes with a short review by all those who have been involved in it. The idea is to identify what went well and what went badly. How can the work be done better next time? Do any important issues need to be escalated for management to address? You will find that the flight crew of your holiday airliner carry out a debriefing at the end of every flight. The other side of the coin, of course, is briefing, which takes place before significant pieces of work and in which the messages from previous debriefings are taken forward.

Kaizen is the Japanese idea of 'continuous improvement'. Now common in manufacturing industries across the world, Kaizen places a responsibility on all employees, not just for doing the work but for improving the way in which the work is done. The emphasis is on identifying small, cumulative changes which over time amount to significant improvements in service or manufacturing processes. Employees who recognise opportunities for improvement in a process communicate their ideas to managers, who investigate the suggestion and implement it if it is feasible. It is vital that management take all suggestions for improvement seriously and respect the fact that front-line workers will usually have a better understanding of business and professional processes than they do.
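The Kaizen cycle described above - a worker submits a suggestion, management investigates it and reaches an explicit decision - can be sketched as a simple data structure. This is purely illustrative; the class and field names are my own, not part of any standard Kaizen system.

```python
from dataclasses import dataclass


@dataclass
class Suggestion:
    author: str
    idea: str
    status: str = "open"  # open -> implemented or declined


class KaizenLog:
    """A log of improvement suggestions: every entry must get a decision."""

    def __init__(self):
        self.suggestions = []

    def submit(self, author, idea):
        s = Suggestion(author, idea)
        self.suggestions.append(s)
        return s

    def investigate(self, suggestion, feasible):
        # Every suggestion is taken seriously: each gets an explicit outcome.
        suggestion.status = "implemented" if feasible else "declined"

    def outstanding(self):
        # Suggestions still awaiting a management response.
        return [s for s in self.suggestions if s.status == "open"]
```

The point of the `outstanding` list is the discipline it imposes: a suggestion cannot quietly disappear, which is one way of demonstrating that management takes front-line ideas seriously.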

Critical Incident Reporting dates from the 1950s when it was developed to improve safety in aviation. It has subsequently extended into other transport industries and into medicine. The idea is that when a near-miss (a 'critical incident') occurs it is reported on an anonymous basis to an independent reviewer for analysis and publication. Aggregate summaries of critical incidents are provided regularly to inform safety management and to identify weaknesses in organisational defences.  As Professor James Reason puts it: ‘Without a detailed analysis of mishaps, incidents, near misses, and “free lessons,” we have no way of uncovering recurrent error traps or of knowing where the “edge” is until we fall over it.’ [6]
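The reporting mechanism described above - anonymous reports collected centrally, then aggregated to reveal recurrent weaknesses - can be sketched as follows. This is an assumed, minimal model of such a register, not a description of any real aviation or medical system; all names are my own.

```python
from collections import Counter


class IncidentRegister:
    """Collects anonymous critical-incident (near-miss) reports and
    summarises recurring categories, so weaknesses in organisational
    defences become visible before an accident occurs."""

    def __init__(self):
        self.reports = []

    def report(self, category, description):
        # Deliberately no reporter identity: anonymity protects openness.
        self.reports.append({"category": category, "description": description})

    def summary(self):
        # Aggregate counts reveal recurrent 'error traps' (Reason's phrase).
        return Counter(r["category"] for r in self.reports)
```

The design choice that matters is in `report`: because no identity is ever stored, the register cannot be used punitively, which is what makes honest reporting possible in the first place.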

I believe the introduction of these three techniques, with accompanying culture changes, would make a significant difference to learning in child protection. They would result in streams of information flowing from the bottom-up, instead of the top-down flows which have traditionally predominated. They would provide the raw material of learning - relevant data - and they would give all those involved in providing services the opportunity to contribute constructively to bringing about change and, ultimately, better, safer services.


[1] Dekker, S. (2007) Just Culture: balancing safety and accountability, Aldershot: Ashgate
[2] Argyris, C. and Schön, D. (1974) Theory in practice: Increasing professional effectiveness, San Francisco: Jossey-Bass
[3] Argyris, C. and Schön, D. (1978) Organizational learning: A theory of action perspective, Reading, Mass: Addison Wesley, pp 2-3
[4] Care and Social Services Inspectorate Wales Annual Report 2008-2009 - see http://chrismillsblog.blogspot.co.uk/2009/12/serious-case-reviews-poor-tool-for.html
[5] Imai, M. (1986) Kaizen: The Key to Japan's Competitive Success, New York: Random House
[6] Reason, J. (2000) 'Human error: models and management', British Medical Journal, volume 320, 18 March 2000