Introduction: Below is a post by Paul Levy, the former President and CEO of Beth Israel Deaconess Medical Center in Boston. For five years he kept an online journal, Running a Hospital. He now writes as an advocate for patient-centered care, eliminating preventable harm, transparency of clinical outcomes, and front-line-driven process improvement at one of my favorite blogs, Not Running a Hospital.
Levy’s post originally appeared on The Health Care Blog (THCB).
I should add that, as a journalist, I have watched lists like this one being compiled at various magazines: “The Best Colleges in the U.S.,” “New York’s Best Doctors,” “The Best Motels in America” . . . Who puts them together? Young journalists who know no more than the rest of us about which universities, hospitals, or motels offer a better education, safer surgery, or a nicer swimming pool. (I recall reporting a “best motels” piece for Money Magazine many years ago. I didn’t visit the motels; I talked to their owners on the phone.)
These are not investigative pieces. The goal is not to warn consumers; the goal is to advertise. As for the physicians surveyed, Levy is not blaming them for offering opinions rather than in-depth information about outcomes and patient safety. Most hospitals don’t make hard data about medical errors or infection rates available. In fact, many hospitals don’t keep detailed data about medical mistakes. They may well count the number of “adverse events,” but they don’t discuss and analyze them, even internally. Hospital CEOs have other priorities. This, I think, takes us to the crux of the problem. (See my companion post below.)
It has been almost four years since I commented on the annual hospital ranking prepared by US News and World Report. I have to confess now that I was relatively gentle on the magazine back then. After all, when you run a hospital, there is little to be gained by critiquing someone who publishes a ranking that is read by millions. But now it is time to take off the gloves. All I can say is, are you guys serious? Let’s look at the methodology used for the 2011-12 rankings:
In 12 of the 16 [specialty] areas, whether and how high a hospital is ranked depended largely on hard data, much of which comes from the federal government. Many categories of data went into the rankings. Some are self-evident, such as death rates. Others, such as the number of patients and the balance of nurses and patients, are less obvious. A survey of physicians, who are asked to name hospitals they consider tops in their specialty, produces a reputation score that is also factored in.