The problem with hospital rankings is that they attempt to be comprehensive, i.e. multi-dimensional. The weights apportioned to any given criterion are necessarily arbitrary. The most informative aspects of the US News annual rankings of hospitals are the individual categories of data, e.g. specific outcome measures and structural features of hospitals. These are theoretically important to patients trying to decide which hospital to go to.
What I am most interested in as a future medical professional, rather than a patient, is less the patient safety features and more how well specialists regard each hospital when deciding where to send their sickest patients. This is the US News reputation survey, wherein a geographically distributed sample of specialists from around the country (about 100 per specialty) name the five hospitals to which they would send their sickest patients.
To make available a ranking of hospitals by the reputation survey results alone, I extracted the relevant numbers for all of the US News honor hospitals from 2013–2014. I had to arbitrarily choose some weighting system in order to arrive at a single metric to rank hospitals by. US News, in compiling its Honor Roll, weights each specialty equally. Intuitively, I don’t think it makes sense to give rheumatology the same weight as GI and general surgery. I chose to weight each specialty by the number of certificates the American Board of Medical Specialties conferred in the last ten years. Basing the weight on board certifications is appropriate because US News surveys only board-certified physicians. Certain sub-specialties are mapped to their US News specialty category, shown below. This kind of weighting approximates the result you would get if you randomly surveyed all physicians eligible to take the survey, rather than stratifying it and making the sample size in each group equal. To put it in more approximate terms: if you asked a random doctor at your local teaching hospital what the best hospitals were, and you did this at several other hospitals all around the country, which would be the most common answers? Of course, this gives an advantage to hospitals that are better at the specialties with many physicians, e.g. cardiology, over the small specialties, e.g. urology.
I also had to arbitrarily choose a lower limit for where to stop this ranking. US News sets the lower cutoff of its Honor Roll at a national ranking in six or more specialties. I chose to include all hospitals that outperformed (in this reputation ranking) the best-performing specialty-focused hospital, because I feel this ranking should reflect the best comprehensive adult hospitals. In this case, that hospital was Memorial Sloan-Kettering, a specialized oncology hospital cited by 62% of specialists surveyed, earning it first place in the overall cancer ranking.
I then took the simple average (arithmetic mean) of the size-weighted specialty ratings. This is also an arbitrary choice. US News log-transforms its reputation data in order to minimize the advantage that a small number of elite hospitals have, in essence so that the reputation rating does not overshadow the outcome and structural measures in the overall ranking. I didn’t do this, so that the reputation ranking better reflects who the clear reputational leaders are.
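The weighting scheme above amounts to a certificate-weighted mean of each hospital’s specialty reputation scores. A minimal sketch, where all specialty names, certificate counts, and reputation percentages are hypothetical placeholders rather than the actual ABMS or US News figures:

```python
def weighted_reputation(scores, weights):
    """Certificate-weighted mean of a hospital's specialty reputation scores.

    scores  -- {specialty: % of surveyed specialists naming the hospital}
    weights -- {specialty: ten-year ABMS certificate count}
    """
    total_weight = sum(weights[s] for s in scores)
    return sum(scores[s] * weights[s] for s in scores) / total_weight

# Hypothetical ten-year certificate counts per specialty.
certificates = {"cardiology": 9000, "gi": 5000, "rheumatology": 2000}

# Hypothetical reputation scores for one hospital.
hospital_scores = {"cardiology": 40.0, "gi": 30.0, "rheumatology": 10.0}

print(weighted_reputation(hospital_scores, certificates))  # 33.125
```

Note how the large hypothetical cardiology weight pulls the result well above the unweighted mean of 26.7, which is exactly the advantage-to-big-specialties effect described above.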
Here are the results:
If you apply some univariate clustering, you can see that the ranking falls out in this order:
2) Cleveland Clinic
4) Massachusetts General
5) Brigham and Women’s, NewYork-Presbyterian, and UCLA
8) Duke, UCSF, Penn, UPMC, Barnes-Jewish
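One simple way to form clusters like those above from one-dimensional scores is to cut the sorted list at its largest gaps. A minimal sketch, using made-up scores rather than the actual survey results:

```python
def gap_clusters(scores, n_clusters):
    """Partition 1-D scores into n_clusters by cutting at the largest gaps."""
    ordered = sorted(scores, reverse=True)
    gaps = [ordered[i] - ordered[i + 1] for i in range(len(ordered) - 1)]
    # Take the (n_clusters - 1) largest gaps as cut points, in list order.
    cuts = sorted(
        sorted(range(len(gaps)), key=lambda i: gaps[i], reverse=True)[: n_clusters - 1]
    )
    clusters, start = [], 0
    for cut in cuts:
        clusters.append(ordered[start : cut + 1])
        start = cut + 1
    clusters.append(ordered[start:])
    return clusters

# Hypothetical reputation scores: two leaders, a middle pack, a straggler.
print(gap_clusters([60, 58, 40, 39, 38, 20], 3))
# [[60, 58], [40, 39, 38], [20]]
```

This is only one of several reasonable univariate clustering methods (k-means or Jenks natural breaks would be alternatives); the grouping in the list above reflects whichever breaks the data itself suggests.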
How does this differ from the US News Honor Roll? Notably, Northwestern (#6) is not in the top 12 by reputation, and Barnes-Jewish (#15) is.