In January, on a beautifully clear summer day, I was at Milford Sound, the stunning fiord on the west coast of New Zealand’s South Island.
As we sailed up the sound, the captain of our sightseeing ship told us that Milford Sound Airport is one of the busiest in New Zealand, although it can be used only by small light aircraft. Every few minutes one such aircraft passed overhead, making its winding passage up the sound and then swinging sharply right between the towering sides of the fiord. Before the final approach to the tiny runway, a tight one-hundred-and-eighty-degree turn had to be executed. Flying into Milford Sound is not for the faint-hearted!
Watching these small planes landing at Milford Sound, and
knowing that several hundred do so safely every day, made me reflect on why an
apparently unsafe form of travel – the airplane – has now become so safe. It is
not just better technology and more experienced people, but an attitude of mind
that creates a safety culture. Part of that is an insatiable curiosity about
what can go wrong and how it can be avoided. If you have to fly into Milford
Sound every day, you can’t afford to close your eyes and hope for the best. You
have to anticipate the worst and prepare for it.
All of which brings me to the issue of Critical Incident Reporting. Back in the 1940s an American air
force colonel called Flanagan [1] came up with an apparently simple idea. To
improve aviation safety, he argued, it was not good enough just to understand
what caused particular accidents.
Rather, we need to know about situations in which accidents might have occurred but didn’t. We need data about the errors that are made in normal practice, which do not result in a fatal outcome and which often do not come to light. In short, we need to study near misses.
Professor James Reason articulates the need for such an
approach rather well: ‘Without a detailed analysis of mishaps, incidents, near
misses, and “free lessons,” we have no way of uncovering recurrent error traps
or of knowing where the “edge” is until we fall over it.’ [2]
Critical Incident Reporting is one means by which we can
obtain data for studies of near misses. Professionals – pilots, ships’ officers,
train drivers, doctors, social workers, or indeed any workers whose tasks are
safety critical – are provided with a simple means of reporting a critical
incident or near miss. The reports are submitted confidentially and the results
aggregated and reported in such a way that nobody can tell who in particular
was involved. Then the original report is destroyed to ensure continuing
confidentiality. That way people will tell the truth and will report incidents that might otherwise never come to light.
Critical incident reporting has long been established in Britain in aviation and shipping [3] and is strongly advocated in medicine, particularly anaesthesia and intensive care [4][5]. Back in 1990 I co-authored an article [6] on the
applicability of the technique to child protection social work. We argued that
reports into child abuse disasters reveal that factors contributing to the
death of a child commonly occur in normal practice. These include poor
communication, professional disagreements, vacillation, uncertainty in response
to aggressive families, pressure of work and burdens being inappropriately placed
on inexperienced staff. We concluded that it was only possible to understand
the causes and effects of such malfunctions by carefully documenting how they
occurred in normal practice.
If properly implemented, the benefits of Critical Incident Reporting are obvious and profound. Sadly, it is all too easy to get the implementation wrong. A crucial mistake is to be careless about the arrangements to ensure confidentiality; if people believe that they may be identified from a report, they will not wish to participate.
An example of poor implementation appears to have occurred in the NHS in Scotland, where copies of reports were apparently retained in a filing system. Although heavily redacted, these reports had to be made available to the media under the Freedom of Information Act, raising the possibility that individuals or particular units could be identified.
In contrast, CHIRP, the aviation and maritime critical
incident system, ensures confidentiality by being an independent charity rather
than a branch of government or of a particular airline or shipping company. And
reports of incidents are quickly destroyed so that no-one can trace the identities
of those involved.
The CHIRP website promises:
“CHIRP always protects the identity of our reporters. We are a confidential programme and, as such, we only keep reporters’ personal details for as long as we need to keep in contact with them. When a report is closed off, all original correspondence is sent back to the reporter and all notes are shredded. The reporter’s personal information never gets input in to our database.”
I believe that a Critical Incident programme for child
protection could work well in Britain. It would need to be set up as an
independent charity, but could receive donations from public bodies as well as
from the general public. Independence and confidentiality are crucial but
should not be difficult to achieve. And it is likely that such an initiative would not only improve safety greatly but also reduce costs, as agencies learn more about how to avoid costly and unnecessary errors.
End Notes
[1] Flanagan, J. C. “The critical incident technique.” Psychological Bulletin 1954; 51: 327–358.
[2] Reason, J. “Human error: models and management.” British Medical Journal 2000; 320: 768–770.
[3] CHIRP – the Confidential Human Factors Incident Reporting Programme, https://chirp.co.uk
[4] Cooper, J. B., Newbower, R. S., Long, C. D. and McPeek, B. “Preventable anesthesia mishaps: a study of human factors.” Anesthesiology 1978; 49: 399–406.
[5] Mahajan, R. P. “Critical incident reporting and learning.” British Journal of Anaesthesia 2010; 105(1): 69–75.
[6] Mills, C. and Vine, P. “Critical Incident Reporting – an Approach to Reviewing the Investigation and Management of Child Abuse.” British Journal of Social Work 1990; 20: 215–220.