Hard lessons and small victories in patient safety

This article is based on the Aventis Lecture given by Dr Henri R Manasse Jr at the United Kingdom Clinical Pharmacy Association meeting held in Manchester, UK, on 17 May 2002

Reported by:
Laurence A Goldberg
FRPharmS
Consultant Pharmacist

Two weeks before Christmas 1995, a healthy, athletic seven-year-old boy went into a Florida hospital for elective surgery to remove a benign growth in his ear. Twenty minutes after the general anaesthetic took effect, the surgeon injected the boy’s ear with what was supposed to be lidocaine 1% with 1:100,000 epinephrine. The drug had been drawn into a syringe from a plastic cup labelled “lidocaine with epinephrine”, which a nurse had filled 90 minutes earlier in accordance with hospital procedure. Near the plastic cup sat a metal cup, meant to contain 1:1,000 epinephrine for swabbing on the wound to control bleeding as needed. That morning, the vial containing 1:1,000 epinephrine had mistakenly been poured into the cup labelled “lidocaine with epinephrine”, and 3ml of the solution, containing approximately 3mg of the powerful stimulant, was drawn into the surgeon’s syringe and used to infiltrate the boy’s ear.

Two minutes after the surgeon had injected the concentrated topical epinephrine, the boy’s blood pressure and heart rate began to climb rapidly. The head anaesthetist was summoned to assess the situation, but after a few minutes the boy’s vital signs stabilised. The surgical team decided to continue with the procedure. Ten minutes later the boy’s vital signs plummeted and he “flatlined”. The team administered CPR for an hour and 40 minutes, and finally his heartbeat returned. Comatose and on a ventilator, the boy was taken to the intensive care unit and later transferred to a larger hospital, where, the next day, his family decided to turn off the ventilator. The boy was declared brain dead.

Harrowing stories like this are all too familiar: the recent, tragic deaths of Wayne Jowett and Ritchie William in the UK, from the misadministration of vincristine during their chemotherapy treatments, have intensified public dialogue on this side of the Atlantic.

Imagine the shattered trust of those families in the doctors they had probably grown to know and like, in the healthcare systems that failed them, and in themselves for putting their children through such traumas. You may have read about, or even witnessed, the drawn-out legal battles as loved ones try to coax the truth from a system that is afraid to give it to them for fear of public damnation, financial liability and shame.

Impact of medical mistakes
What becomes of the health professionals, almost always good people, trying to do their best, now haunted by deadly mistakes? The head anaesthetist at the hospital where the young boy was operated on wept openly as he recounted the tragedy in an interview a year later. Earlier this year, a 32-year-old pharmacist in the San Francisco Bay area was found dead, an apparent suicide, with six 100µg/hour fentanyl patches on his neck, chest and stomach, presumably despondent over the deaths of three patients from contaminated drugs compounded at his pharmacy.

In a study of the emotional impact of mistakes on physicians, half of the doctors interviewed listed fear among the emotions they felt after making a mistake in practice. It is no wonder: contemporary medical malpractice law as it is practised in the USA has created a climate among patients, doctors and institutions that is adversarial and opaque. When hospitals are counselled to cover up errors rather than honestly admit them, and doctors are admonished to keep silent with their patients, what surprise is it when juries award stiff penalties and families lambaste the system in the press?

The state of New Jersey recently reported that its malpractice premiums have risen 250% in the last three years, averaging over $940,000 per hospital in 2002. Some states are attempting to remedy this trend by capping monetary awards for medical malpractice damages; others are seeking to limit the window of time within which malpractice lawsuits can be filed.

Damages are not limited to malpractice suits alone. Seven years after Boston Globe reporter Betsy Lehman died from a massive overdose of cyclophosphamide at Boston’s respected Dana-Farber Cancer Institute, repercussions are still being felt by those involved in the accident and the ensuing investigation. This March, a jury awarded $4.2 million for libel and defamation to a physician indirectly involved in the case, who had sued the Globe, one of its reporters, the Dana-Farber Cancer Institute and its former chief of staff after the Globe misreported her role in the accident. Since the accident, the hospital has adopted an abiding commitment to safety and system redesign. Yet, as this verdict reminds us, these tragedies carry cruel and far-reaching consequences.

Accident analysis
Since the early 1990s, much has been made of the causes of medical accidents and the comparisons between healthcare and other “high-hazard” industries. You are likely to be familiar with the concept of analysing accidents from a perspective of system flaws rather than personal failings – identifying and correcting the latent, or silent, errors within processes and organisations, rather than ferreting out “bad apples” and exhorting employees to be more careful.

Two prevalent approaches to accident analysis are Normal Accident Theory (NAT) and High Reliability Organisation Theory (HROT). These two theories share the core conviction that errors are inevitable in human thought and action, and therefore in the systems we design.

NAT takes a “pessimistic” view, holding that accidents are not only inevitable but are made more likely by the complexities built into human systems. The common “Swiss-cheese” schematic illustrates NAT by depicting numerous layers of system defences, each peppered with holes, or flaws. When the layers interact so that the holes line up, an error passes through unhindered and an accident results. Although some criticism has been directed at the concept, it remains a useful way of viewing system errors as “accidents waiting to happen”.

HROT, while agreeing that human error is unavoidable, proposes that accidents are preventable when safety is upheld as the key organisational objective. US Navy aircraft carriers, nuclear power plants and air traffic control centres are well-studied examples of high-reliability organisations. They face common challenges – managing complex, demanding technologies so as to avoid crippling failures, and maintaining the capacity to meet unforeseen periods of very high peak demand – and they share common characteristics: internal complexity, dynamism and capacity for intense interaction; the capacity to accomplish exacting tasks under great time pressure; and near-perfect avoidance of catastrophic failures over several years.

Advances in accident theory and human factors engineering (HFE) have been applied broadly over the last several decades in other industries that pose risks to human life and health, notably commercial and military aviation and nuclear energy production.

HFE integrates knowledge about humans’ intrinsic abilities and limitations into the design of job functions, working conditions and machines – from shift lengths to lighting conditions, from console buttons to software screen layouts. Healthcare institutions are only beginning to embrace this field; as they do, they should be accountable for drawing on the expertise of human factors specialists and for testing designs thoroughly with their own “sharp-end” personnel.

Putting theory into practice
Theories and “Swiss cheese” are all well and good, but can other industries like these, which rely on routine tasks and the fairly predictable laws of physics, really set meaningful precedents for healthcare, where every patient presents biological variation and every outcome is driven by microscopic processes and interactions that we can sometimes neither identify nor control?

The answer is clearly yes. Other high-hazard disciplines have demonstrated that appropriate applications of training, technology, protocols and continuous quality improvement can reduce accident rates to astonishingly low levels.

It is difficult to express something as broad as healthcare in “parts per million”, but consider the Harvard Medical Practice Study’s finding that 1% of hospital admissions result in patient injuries due to negligence. That is 10,000 defects per million opportunities. Similarly, it is estimated that 50% of people with clinical depression go undiagnosed or undertreated – 500,000 defects per million. There is still a long way to go.
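
By way of a minimal illustration – using the rates quoted above, with the conversion being simple proportion – the “defects per million opportunities” arithmetic works out as follows:

```python
# Illustrative sketch only: converting the error rates quoted in the text into
# "defects per million opportunities".

def per_million(rate: float) -> int:
    """Convert a fractional error rate into defects per million opportunities."""
    return round(rate * 1_000_000)

print(per_million(0.01))  # 1% of admissions (Harvard Medical Practice Study) -> 10,000 per million
print(per_million(0.50))  # 50% of depression cases undiagnosed or undertreated -> 500,000 per million
```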

Anaesthetists have paved the way in patient safety research and practice, recognising that such simple steps as standardising console dials and improving blood oxygen sensing mechanisms could prevent a great many patient injuries and fatalities. Three decades of concerted effort have seen deaths associated with anaesthesia fall by more than an order of magnitude, to about five per million. Anaesthetists’ adoption of the Crew Resource Management (CRM) team-building method and of the simulator training used in aviation offers a model for collaborative therapy management teams everywhere. Rather than focusing strictly on the technical aspects of work, CRM recognises human performance limiters such as stress, fatigue and the working environment, and uses simulations to prepare for emergencies.

However, we cannot achieve the sweeping changes that are needed by applying these principles piecemeal, one department at a time. Just as ideal patient care combines the efforts and expertise of diverse disciplines, so too does “systems thinking” require coordinated, consistent application throughout the whole of the practice site or health system.

Mandatory or voluntary reporting?
The topic of medical error reporting continues to be hotly debated. Fortunately, opposition to reporting is fading fast in American hospitals. You cannot argue with the fact that reporting provides data and drives discovery, understanding and improvement. Meanwhile, debate bristles over whether reporting should be voluntary or mandatory, anonymous or identifiable.

Each US state has the authority to require its hospitals to report errors to a central state-run databank, so that data can be analysed and public accountability maintained. Currently 16 states have mandatory programmes in place. Clearly, mandatory reporting has merit for study and prevention purposes, and it is vital for disclosing criminal activity or negligence among health workers. It is important to remember, however, that the states with mandatory reporting laws passed them with provisions protecting reported data from legal discovery – a protection now being considered for inclusion in federal law. In American jurisprudence this protection is considered essential for meaningful and comprehensive near-miss and error reporting programmes.

However, mandatory reporting at the institutional level often goes hand in hand with punishment. Would you want to speak candidly about how a patient was killed or injured under your care if all you could expect in return was to be sued, to lose your job, licence and credibility, and perhaps to jeopardise your organisation’s accreditation? Better to blame some mysterious drug interaction or phantom allergy.

Voluntary reporting is driven by conscience and a desire to prevent errors from recurring. Voluntary reports tend to be more thorough, accurate and detailed than those submitted under mandatory systems, which often obscure embarrassing facts and contain only the level of detail needed to satisfy policy. Even so, the volume of near-miss and error reports submitted under voluntary frameworks continues to reflect gross underreporting.

Because cultures and attitudes vary so much from site to site, every institution finds it a challenge to implement a uniform reporting system. Through enlightened leadership, many hospitals have implemented successful systems of voluntary, nonpunitive reporting and have built communication linkages through which staff can quickly and safely share information on errors and near-misses. Still, even when the spectre of blame is lifted, obstacles remain. Reporting systems lack standardisation – their form, function and user-friendliness still vary widely from place to place.

Terminology can be confusing too: you are busy, there are a thousand demands on your day, but this morning you witnessed a minor mishap that could have hurt a patient but did not, and you are trying to do the right thing by reporting it. Was it significant enough to merit reporting? If so, what was it – an error, accident, incident, adverse event, near-miss or near-hit?

Openness with patients
We have established that reporting an error to your employer or to some distant database can be fairly painless, but what about discussing errors with patients? Isn’t your greatest professional fear that someday you may not only be involved in an error, but you might have to admit it to the patient – or worse, to a grieving family?

The fact is that being open with patients is the right, and really the only, thing to do. With the volume of medical information now available online, the landscape of healthcare is changing to reveal patients, not physicians, as the new centre of gravity and source of decisions. A recent study of 2,500 adult patients found that 76% wanted to be told all information about medical errors and did not want physicians to exercise “discretion” or selectively withhold information. Patients can be surprisingly forgiving when they are approached with candour and respect.

The need for a national agency
The USA still lacks a federal agency to champion the cause through regulation and enforcement. Admirable efforts are under way and much progress has been made, but so far there has been more bark than bite.

The UK is a step ahead now that the National Patient Safety Agency is in place. The UK’s National Health Service is to be commended on recognising patient safety as a matter demanding the government’s dedicated attention and commitment.

Resource
UK National Patient Safety Agency
Marble Arch Tower
55 Bryanston Street
London W1H 7AJ
UK
T:+44 (0)20 7868 2203
W:www.npsa.org.uk





