Preventing Hospital Errors by Howard C. Berkowitz

I have asked Howard Berkowitz to guest-blog on hospital errors.

Howard is in a unique position to write on this topic because he consults on
medical information systems for hospitals and also has been a long-time
patient. Over the years, he has taken an unusually detailed decision-making role
in his own care for heart disease and diabetes, which, he says, “has kept me
going, with bad heart genetics, at least 17 years more than my father.”

Howard also reports that “when no one else would coordinate my mother’s complex
cancer care, I did so…and I know what it is to preserve the semblance of
life, when only pain remains. Complex pain management is also one of my
interests; too few doctors know that pain should always be controllable.”

As a result of his own health problems and his parents’ illnesses, he has spent
more time in hospitals than anyone would ever want to endure. But unlike
most of us, he understood what was going on. Originally trained in microbiology
and biochemistry, Howard was doing independent research in antibiotic
resistance and working in a clinical laboratory while in high school. He
confesses that, for his 10th birthday, he asked his mother for a copy of the
Merck Index of Chemicals and Drugs. Subsequently, he built the first clinical
computer system for Georgetown University Hospital, developed virological
systems for Electronucleonics’ “hot lab,” and developed cardiac care
simulators for the George Washington University School of Medicine, Office of
Computer-Assisted Instruction. He also developed the first automated blood
bank laboratory tools for the Red Cross.


Full disclosure—he has two patents pending for hospital communications and staff-management systems designed to keep hospital staff informed, in real time, of patient needs.
His post follows.

In recent years, U.S. hospitals have been trying to reduce preventable errors
that range from leaving a stray instrument in a patient’s stomach to letting
bedsores become oozing ulcers. But they haven’t been making as much progress
as one might hope. According to the Fourth Annual Patient Safety in American
Hospitals Study, published last April, Medicare patients alone fell victim to
1.16 million preventable incidents from 2003 to 2005, leading to 247,622
preventable deaths and $8.6 billion in preventable costs.

This has led some payers—including Medicare—to ask “Why should we be picking
up the tab for the hospitals’ mistakes?” In answer to that question, the
Centers for Medicare and Medicaid Services (CMS) is threatening to stop paying
hospitals for costs incurred as a result of some of the most common and
preventable medical errors—a list that CMS labels “Things That Should Never Happen.”

On the face of it, this sounds reasonable. Today, there are no financial
penalties for hospitals that report a large number of “adverse events.” The
hospital simply recodes the diagnosis and receives reimbursement for the
additional length of stay and treatment required to correct the mishap. But
let me suggest that, rather than penalizing hospitals for each error, CMS
might learn something from the U.S. aviation industry—as well as from Germany’s
healthcare system.

Begin with our aviation industry. Modern aviation has a culture that
formalizes the recognition and avoidance of errors. Pilots don’t try to cover
up their mistakes because the Federal Aviation Administration has a mandatory
program that requires reporting errors on a no-fault basis. In other words,
under Crew Resource Management (CRM), pilots who barely avoid mid-air
collisions are not penalized if they promptly disclose all the circumstances.
Safety specialists will then examine what failed and spread the knowledge
throughout the industry. (In most cases, aviation catastrophes, like many
medical catastrophes, come not from a single error but from a sequence of
them. If any one of those errors had been prevented, the end result might have
been avoided.) In the U.S. medical system, by contrast, physicians are reluctant to admit to a slip-up, in
part because they fear a malpractice suit.

Not long ago, when I was examining the design of some hospital information
systems that track “workflow” (or “who did what when”), I realized that in
cases where several patients were showing signs of a hospital-acquired
infection, our system logs could be used to backtrack and identify the places,
staff, and equipment contacted by all of the patients. By looking at these
common experiences, investigators could identify the source of the infection.
Once known, infection control specialists could stop further infections. But
when the marketing staff of the company I was working for suggested the idea to
several hospitals, they learned that the hospitals were worried that the report
would be discoverable by a malpractice attorney acting for any of the patients.
So the hospitals turned down the idea. In other words, they would rather not
know where their infections originate—because they don’t want other people to
know.
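
To make the backtracking idea concrete, here is a minimal sketch of how such a
search might work. Everything in it is hypothetical: the record layout and the
names ContactEvent and common_contacts are mine for illustration, not drawn
from any real hospital information system. The logic is simply to intersect
each infected patient’s set of contacts, leaving only the places, staff, and
equipment that all of them share.

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical workflow-log record: who or what a patient came into
# contact with, and when. The fields are illustrative only.
@dataclass(frozen=True)
class ContactEvent:
    patient_id: str
    contact_type: str   # "room", "staff", or "equipment"
    contact_id: str     # e.g. "OR-3", "nurse-114", "IV-pump-7"
    timestamp: datetime

def common_contacts(log, infected):
    """Return the (contact_type, contact_id) pairs shared by every
    infected patient: the candidate sources an infection-control
    specialist would investigate first."""
    shared = None
    for pid in infected:
        contacts = {(e.contact_type, e.contact_id)
                    for e in log if e.patient_id == pid}
        shared = contacts if shared is None else shared & contacts
    return shared or set()

# Toy example: two infected patients whose only common exposure is
# a single piece of equipment.
log = [
    ContactEvent("p1", "equipment", "IV-pump-7", datetime(2007, 5, 1)),
    ContactEvent("p1", "room", "OR-3", datetime(2007, 5, 2)),
    ContactEvent("p2", "equipment", "IV-pump-7", datetime(2007, 5, 3)),
    ContactEvent("p2", "room", "ICU-1", datetime(2007, 5, 4)),
]
print(common_contacts(log, {"p1", "p2"}))  # {("equipment", "IV-pump-7")}
```

A real investigation would, of course, have to weigh timing and incubation
periods rather than take a bare intersection, but even a crude report like
this is exactly the kind of document the hospitals feared would be
discoverable.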

But fear of litigation is not the only obstacle to tracking and discussing
errors. There is also the unfortunate fact that the culture of medicine still
insists that doctors are supposed to be infallible. Put it this way: pilots use
checklists every time they fly, but I encounter a lot of physicians who don’t
want to check a reference book in my presence—even though I tell them that it
reassures me to see them look something up.

The notion that the chief surgeon must be all-knowing intimidates other
doctors and nurses, which explains why they don’t speak up, even as they
witness a catastrophe unfold. Here again, the aviation industry’s Crew Resource
Management (CRM) offers a useful counter-example. Aviation safety specialists
encourage every crew member to speak out when they see a potential safety
hazard. That kind of communication failed when Air Florida Flight 90 hit a
highway bridge and then crashed into the Potomac. The cockpit voice recorder
shows that the copilot suspected they didn’t have adequate takeoff speed. Had
he been more assertive, the captain might have aborted the takeoff, and a lot
of people might not have died.

By contrast, United Flight 232 made a controlled crash landing in Sioux City,
Iowa, after a total hydraulic failure. Test pilots who later recreated the
situation in a simulator, knowing exactly what the real crew had faced, almost
all crashed fatally within a very short time. The real crew essentially kept
an unflyable DC-10 in the air for around 45 minutes. Of those aboard, 111
people died and 185 lived, but it was amazing that anyone lived at all. The
crew distributed the workload: while one pilot was physically flying the plane
and focused on that, the others were planning the next step. Anyone with a
good idea voiced it immediately.

Hospitals need to adopt CRM’s code. Just one example: in an operating room,
any member of the team who has the slightest suspicion that the doctor is
operating on the wrong limb should speak out loudly. But that doesn’t always
happen.

Finally, hospitals hide their mistakes because, in our market-based system,
they worry about losing market share if they begin admitting to botched
operations. A five-country survey of hospitals in the U.S., the UK, Australia,
Canada, and New Zealand, published in Health Affairs in 2004, revealed that
hospitals in the U.S. and
Australia were most opposed to public disclosure of medical error, infection,
and mortality rates. It is probably not a coincidence that these were the two
countries where loss of patients to competition was a major concern—along with
malpractice costs.

In my next post, I’ll talk more about what is wrong with Medicare’s proposed
solution—and what we might learn from Germany’s healthcare system.