Tuesday, November 15, 2005

Governance in the Health Industry

Qualityworld

Keep the patients safe

Hardly a month goes by without media reports of another avoidable fatality in the NHS. With medication error accounting for an estimated 25 per cent of worldwide inadvertent medical harm, it would be easy to point the finger of blame. But, argues Sir Liam Donaldson, England's chief medical officer, often it is the system which is at fault, not the individual

Why is it that there have been 23 incidents reported in the medical literature since the mid-1980s in which patients receiving maintenance treatment for leukaemia (or similar conditions) have either died in hospital or been very seriously injured? The relatively simple reason is that an injection intended to be put into a vein is mixed up with one intended for intrathecal (spinal) injection. The result: paralysis and, usually, death.
But these incidents are extremely rare. So much so that when this was used as an early example of the sort of catastrophe that can happen in healthcare, a typical reaction was: 'Well, these things can't be avoided. Accidents like this rarely happen and they can't be prevented, so why bother to concentrate your attention on them?'
In one of these cases, instead of a traditional Royal College or health service investigator being asked to come and look at it, Professor Brian Toft, who was from outside the health field and experienced at investigating other sorts of accidents and disasters (such as in the rail industry), was brought in to look into what had gone wrong. The published report found that it was not a single event that had led to this particular tragedy; approximately 40 failures had occurred along the way.
Human error per se does not usually kill patients, but human error in a weak system can injure or even kill: a weak safety culture, weak operational practices, inadequate protocols and training, weaknesses in communication and serious weaknesses in the packaging and design of drugs. In short, comprehensive systems weaknesses greatly increase the risk of harm coming to a patient.
A safe system
In healthcare, until about ten years ago, relatively little attention was given to safety as a concept. It was believed that if you produced well-trained and educated practitioners, in a modern environment, then safety, and to a certain extent quality, would take care of themselves. The maxim 'first do no harm' was something that people adhered to, but it was not explicit. And it certainly was not conceptualised within a framework of patient safety, in the way that safety had been an issue in other industries for many years.
There have been a few commentators over the years. Dr Lucian Leape, an American physician who has written and researched extensively on patient safety, said: 'Human beings make mistakes because the systems, tasks and processes they work in are poorly designed'. For most health service staff this is a new concept. They see themselves as individuals exercising judgement and taking decisions. They do not think about the system that is built around them, which also plays its part in determining the quality and safety of healthcare.
There was also relatively little research on patient safety. But over the past five years numerous papers have appeared in peer-reviewed journals on patient safety and medical error. There have been three major studies, all using broadly similar methodology - reviewing medical records. They have produced estimates of the proportion of in-patient episodes of care which result in inadvertent harm to patients, ranging from about four per cent to 17 per cent.
The near-miss
When considering safety, the concept of the near-miss, originating from a study done in the 1940s by a factory accident investigator called Heinrich, must not be overlooked. Heinrich looked at factory accidents and found the following ratio applied: for every serious injury, there were about 300 incidents where something nearly went wrong but did not. Either somebody spotted the hazard and averted it or, by luck, the accident did not happen.
We know a little bit about the major injuries. We do collect some data on them and, traditionally, we try to learn from them. But we never learn from the near-miss, even though it is the free lesson - the opportunity to learn something that, if applied to the system as a whole, could considerably improve accident rates.
An example of this was the diabetic schoolboy who was about to be injected with a dose of insulin by a nurse. His mother noticed that the syringe contained ten times the correct dose and questioned the nurse, who replied that she was giving the boy seven units of insulin. The mother correctly told the nurse that there were actually 70 units in the syringe and was disturbed that the nurse could not tell the difference between one unit and ten units.
Unusually, this incident was reported in the newspaper, though the media treated it as a full-blown accident rather than a near-miss. This tells us two things. First, that the near-miss is alive and well as a source of learning, if we can only capture it. Second, patients have a very important part to play in helping us to recognise error and improve safety.
After the Hatfield train disaster, in which four people died, public attention and action were firmly focused on trying to improve rail safety. In that year the same number of people died from infusion pump errors (infusion pumps deliver drugs automatically to a predetermined dosage regime). But nobody batted an eyelid, because safety in healthcare is not perceived in the same way as it is in other fields.
Problems occur when there is a series of apparently isolated events between which no connections are made. For example, an anaesthetist might reach for a particular drug but pick up the wrong one, which was sitting on the shelf next to the correct one. This is highly unlikely, as anaesthetists are a very safe group of practitioners, but the systems in place do not make it easy for them. Every day they are put in a high-risk situation and expected to behave safely.
It is important to create a culture that allows the neutral reporting of error and its analysis. Traditionally, the healthcare environment is one where the perspective of the public, the media and sometimes management is one of blame and retribution. Moving away from that environment does not mean that nobody can ever be held to account. People who are careless or neglectful, or who are guilty of wilful misconduct, do need to be held to account. But many more of the things that go wrong in healthcare are due to honest failure.
With honest failure, strategies have to be developed which enable people to feel free to report their mistakes without fear of blame and retribution, because ultimately a culture of blame and retribution drives good quality underground.
There was a case in which a three-year-old girl died because of a mix-up of the anaesthetic pipes attached to her machine, so that instead of being given oxygen to resuscitate her she was given nitrous oxide by mistake. This was a terrible tragedy for the child's family, and also for the hospital and the staff, particularly because it could have been avoided had the machine been fitted with the alarm or protector that most anaesthetic machines have. At the inquest into the child's death, the anaesthetist concerned was in tears and gave an emotional speech in which he apologised directly to the girl's family. At the end of the inquest the girl's father and the anaesthetist hugged each other and the father said he forgave him. The father said he believed there was no intention of deliberate harm and that the doctor had made a mistake.
So somewhere along the line we need to get to grips with some of these fundamental points about blame, retribution, reconciliation and learning. Those are the major challenges for us in safety, along with managing to gather the data. When something goes seriously wrong or where poor quality prevails, there is usually a dysfunctional clinical team at the bottom of it. Cliques and factions, people who cannot work together, autocratic leadership - these are the sorts of unhealthy things about clinical practice that we do not tend to think about when we are concentrating on the technical aspects of care.
The orange wire test
Comprehensive research is needed to find out where the weaknesses in the systems lie, but the analysis needs to be different: there needs to be root cause analysis. I have an imaginary picture in my mind that in every Boeing 757 there is an orange-coloured wire. If that orange-coloured wire broke or became frayed and it was spotted by an aircraft engineer on the tarmac in Dar es Salaam, there is a reasonably good chance that within four or five days every Boeing 757 in the world would have been inspected, and if there was a problem with the orange-coloured wire it would be corrected. But if in Cornwall or Rotherham tomorrow there was a faulty procedure with the capacity to kill somebody, would news of that metaphorical broken orange-coloured wire be transmitted through the NHS within 48 hours? I think you know the answer to that question, and that is a fundamental challenge for everyone.
The report 'An organisation with a memory' sets out many of these concepts in detail, together with the proposals for change which were spelled out in more detail in the report, 'Building a safer NHS for patients'. A National Patient Safety Agency has been established, with a role to collect and analyse information and help to ensure that organisational learning takes place. Questions need to be asked of all healthcare organisations: would they be able to demonstrate indisputably that their service is becoming safer for patients year in and year out? If something serious happened, would the culture of the organisation be to cover it up or to learn from it?
But the final word should go to Professor James Reason (one of the world's leading academics in the field of understanding error): 'We can't afford to duck the challenge of patient safety. We can't afford to allow it not to become a core component of our quality strategy because either we manage human error or human error will manage us'.
Biography
In 1998, Sir Liam Donaldson became the 15th chief medical officer for England - the government's senior medical adviser. He is responsible for policy development and implementation in improving the population's health, protecting the public's health and, with the NHS, leading the implementation of clinical governance, quality and patient safety. Sir Liam started his medical career in surgery and later trained in public health, holding senior positions in public health medicine and management.