Introduction: Below is a post by Paul Levy, the former President and CEO of Beth Israel Deaconess Medical Center in Boston. For the past five years he kept an online journal, Running a Hospital. He now writes as an advocate for patient-centered care, eliminating preventable harm, transparency of clinical outcomes, and front-line driven process improvement at one of my favorite blogs: Not Running a Hospital.
Levy’s post originally appeared on The Health Care Blog (THCB).
I should add that, as a journalist, I have watched lists like this one being compiled at various magazines: "The Best Colleges in the U.S.," "New York's Best Doctors," "The Best Motels in America" . . . Who puts them together? Young journalists who know no more than the rest of us about which universities, hospitals, or motels offer a better education, safer surgery, or a nicer swimming pool. (I recall reporting a "best motels" piece for Money Magazine many years ago. I didn't visit the motels. I talked to their owners on the phone.)
These are not investigative pieces. The goal is not to warn consumers; the goal is to advertise. As for the physicians surveyed, Levy is not blaming them for offering opinions rather than in-depth information about outcomes and patient safety. Most hospitals don't make hard data about medical errors or infection rates available. In fact, many hospitals don't keep detailed data about medical mistakes. They may well count the number of "adverse events," but they don't discuss and analyze them, even internally. Hospital CEOs have other priorities. This, I think, takes us to the crux of the problem. (See my companion post below.)
It has been almost four years since I commented on the annual hospital ranking prepared by US News and World Report. I have to confess now that I was relatively gentle on the magazine back then. After all, when you run a hospital, there is little to be gained by critiquing someone who publishes a ranking that is read by millions. But now it is time to take off the gloves. All I can say is, are you guys serious? Let's look at the methodology used for the 2011-12 rankings:
In 12 of the 16 [specialty] areas, whether and how high a hospital is ranked depended largely on hard data, much of which comes from the federal government. Many categories of data went into the rankings. Some are self-evident, such as death rates. Others, such as the number of patients and the balance of nurses and patients, are less obvious. A survey of physicians, who are asked to name hospitals they consider tops in their specialty, produces a reputation score that is also factored in.
Here are the details:
Survival score (32.5 percent). A hospital’s success at keeping patients alive was judged by comparing the number of Medicare inpatients with certain conditions who died within 30 days of admission in 2007, 2008, and 2009 with the number expected to die given the severity of illness. Hospitals were scored from 1 to 10, with 10 indicating the highest survival rate relative to other hospitals and 1 the lowest rate. Medicare Severity Grouper, a software program from 3M Health Information Systems used by many researchers in the field, made adjustments to take each patient’s condition into account.
Patient safety score (5 percent). Harmful blunders occur at every hospital; this score reflects how hard a hospital works to prevent six of the most egregious types. A 3 puts a hospital among the 25 percent of those that were best in this regard, a 2 in the middle 50 percent, and a 1 in the lowest 25 percent. Examples of the six kinds of medical episodes factored in are deaths of patients whose conditions should not have put them at significant risk and surgical incisions that reopen.
Reputation (32.5 percent). Each year, 200 physicians per specialty are randomly selected and asked to list hospitals they consider to be the best in their specialty for complex or difficult cases. A hospital’s reputational score is based on the total percentage of specialists in 2009, 2010, and 2011 who named the hospital. This year some physicians were asked to list up to five hospitals, the rest to list up to 10.
Other care-related indicators (30 percent). These include nurse staffing, technology, and other measures related to quality of care. The American Hospital Association's 2009 survey of all hospitals in the nation was the main source.
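The weighting scheme above can be sketched in a few lines of code. This is purely an illustration of the published weights, not US News's actual methodology or software; the hospital component scores below are hypothetical numbers I made up to show how heavily reputation counts relative to patient safety:

```python
# Published component weights from the 2011-12 US News methodology.
# They sum to 1.0: survival 32.5%, reputation 32.5%, other indicators 30%,
# patient safety 5%.
WEIGHTS = {
    "survival": 0.325,
    "reputation": 0.325,
    "other_indicators": 0.30,
    "patient_safety": 0.05,
}

def composite_score(components: dict) -> float:
    """Weighted sum of component scores (here normalized to a 0-100 scale)."""
    return sum(WEIGHTS[name] * score for name, score in components.items())

# Hypothetical hospitals: one with a stellar reputation but the worst
# possible safety score, one safe but little-known. Reputation carries
# 6.5x the weight of patient safety, so the famous hospital still wins.
famous_unsafe = composite_score({
    "survival": 70, "reputation": 95,
    "other_indicators": 70, "patient_safety": 0,
})
safe_unknown = composite_score({
    "survival": 70, "reputation": 40,
    "other_indicators": 70, "patient_safety": 100,
})
print(famous_unsafe, safe_unknown)  # the famous hospital scores higher
```

This is exactly the pattern we see in the pulmonology rankings below: a rock-bottom safety score costs at most 5 points out of 100, while a strong reputation can contribute more than 30.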
Let’s see how this pans out for one specialty, pulmonology. We see that the number 1 and 2 ranked hospitals have great reputations but the lowest score for patient safety. The first hospital with “superior” safety rankings doesn’t appear until number 21.
The reputational data is opaque, as it has to be. With great respect for the 200 pulmonologists who were surveyed, how much current data have they seen about the outcomes achieved by hundreds of hospitals and thousands of doctors around the country? Answer: None. Why? Because there is no current data published on such outcomes. Likewise, there is no current data published about hospital-related infections, falls, medication errors, and other matters that could affect the treatment of a pulmonary patient, even if the pulmonologists are top-notch.
So, the reputational survey is likely to be based on the following type of "information":
Oh, I like Dr. Smith at ABC hospital. We were in residency together 25 years ago. He was a great guy. I still remember that amazing Christmas party in 1986.
That Dr. Jones at XYZ hospital is terrific. I heard him give a paper at the last meeting of the ATS (or ACCP, or AABIP). His PowerPoint presentation about his clinical successes (or research with mouse models) was gripping.
Dr. Pebble was trained by Dr. Stone, one of the best in the business in his day (40 years ago). That's good enough for me.
I sent a really sick patient to Dr. Good at RST Hospital. He saved her life. It was a very tough case, and he deserves a lot of credit.
US News needs to stop relying on unsupported and unsupportable reputation, often influenced by anecdote, personal relationships and self-serving public appearances, and work on real — and more recent — data. Maybe that will also cause hospitals to be more willing to report their data so they can be named to the “Honor Roll.” As it is, you are better off keeping things opaque to protect your reputation.
I think it is time to acknowledge that this ranking offers very little in the way of valuable information. It is mainly a vehicle for advertisements from the pharmaceutical industry, which knows that this issue of the magazine gets a lot of attention and high circulation. As you flip through to each specialty, you are blasted with ads for drugs related to syndromes within that specialty. Here's the top part of the pulmonology page.
Then, if you click through to “find resources about” a particular disease, you do get some nice content information, but you get sprayed with even more ads.
There would be no market for this magazine survey if the government or insurance companies did their job and displayed real-time clinical outcome data. But those with the reputational advantage do not want that to happen. And those who profit from the lack of data also have nothing to gain by a more open presentation of the actual record and qualifications of hospitals and doctors in each specialty.