My Idea to Improve Doximity Rankings


gaspasser127
Full Member · 7+ Year Member
Joined: Apr 19, 2016
Messages: 348
Reaction score: 303
Sooo I know the topic of Doximity rankings (especially the reputation-based ones) has been beaten to death, but I was thinking about what could make the rankings more useful. What do you guys think? (I'm copy-pasting part of my e-mail to them.)

____

After reading about the methodology for computing the reputation rankings, I was interested to find that reputation is based on the responses of all survey-eligible physicians within that field. Essentially, the opinion of a physician who finished residency 25 years ago and has been in private practice since then carries the same weight as that of a program director or department chair, who may have more up-to-date knowledge about the relative strength of residency programs.

Because the people who use Doximity Residency Navigator are medical students who are currently applying for residency (or will be applying very soon), I believe the most useful "reputation rankings" would come from in-the-know program directors and department chairs. As such, I have a few suggestions that I hope will be given consideration.

1. While all survey-eligible physicians may fill out reviews of their own residency program, "reputation rankings" should be submitted only by program directors and chairpersons, on the assumption that they have the most up-to-date knowledge about residency programs.

2. Instead of ranking all programs in strict order (when ranking by reputation), the rankings should be broken into quartiles, with the programs within each quartile listed in alphabetical order.

Because there is no truly objective way to rank a residency program by reputation, the utility of Doximity's "reputation rankings" is that they give a relative idea of how strong a program is. There may not be much difference in reputation between the programs ranked #2 and #10, but there may well be a significant difference between #2 and #60. Presenting the rankings by quartile (or a four-level tier system), with programs listed alphabetically within each tier, would let medical students evaluate all programs in a tier without stressing over #2 vs. #5. I believe residency programs would find this beneficial as well.
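To make the tiering idea concrete, here is a rough sketch in Python of how the grouping could work. The program names are made up for illustration, and the input is assumed to already be in reputation order:

```python
# Hypothetical sketch of the quartile/tier idea. Takes programs already
# ordered by reputation and reports quartile tiers, alphabetized within
# each tier so no within-tier ordering is implied.

def tier_by_quartile(ranked_programs, tiers=4):
    """Split an already rank-ordered list into `tiers` groups,
    then alphabetize within each group."""
    n = len(ranked_programs)
    size = -(-n // tiers)  # ceiling division so every program lands in a tier
    groups = []
    for i in range(tiers):
        chunk = ranked_programs[i * size:(i + 1) * size]
        groups.append(sorted(chunk))  # alphabetical within the tier
    return groups

# Placeholder example: eight programs in reputation-rank order.
ranked = ["Prog E", "Prog A", "Prog H", "Prog C",
          "Prog B", "Prog G", "Prog D", "Prog F"]
for tier_num, tier in enumerate(tier_by_quartile(ranked), start=1):
    print(f"Tier {tier_num}: {', '.join(tier)}")
```

Within each printed tier, a reader can no longer tell which program "beat" which, which is exactly the point.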

---------

Thoughts?


A few random thoughts. While more recent residency grads may be better able to speak to their own training, the docs who have been out longer are more likely to be the ones doing the hiring, and their opinion of a program might count for more when you're job searching. Also, separating programs into tiers, while potentially a good idea, could give the appearance of a large gap between programs when in fact there isn't one. The last program in Tier 1 and the first program in Tier 2 are basically identical, yet anyone looking at the list would assume a huge difference exists.

I do not belong to Doximity despite the hundreds of requests I've gotten to join, so I don't know what their rankings actually look like. For the reputation rank, it'd be far more appropriate, IMHO, to include the exact number (whether that's a mean or median or whatever) for each program along with its rank. Then you could see that perhaps the #8 program was as close to #30 as #30 was to #80, or whatnot. The actual data would be more useful than the rank order alone.
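As a toy illustration of what publishing the scores would reveal (all numbers invented, since the real scores aren't public):

```python
# Hypothetical example: rank order hides the size of the gaps between
# programs, but the underlying scores expose them. Numbers are invented.

scores = {8: 71.4, 30: 64.9, 80: 58.2}  # rank -> made-up reputation score

gap_8_to_30 = scores[8] - scores[30]
gap_30_to_80 = scores[30] - scores[80]
print(f"Gap from #8 to #30:  {gap_8_to_30:.1f} points")
print(f"Gap from #30 to #80: {gap_30_to_80:.1f} points")
# With these invented numbers, #8 is about as close to #30 as #30 is
# to #80, even though the rank gaps (22 places vs. 50) look very different.
```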
 
Those are fair points, but why try to improve something that is impossible to fix? These rankings benefit no one except Doximity, which probably created them to generate traffic and revenue, basically by preying on neurotic medical students. At best, they give residents at highly ranked programs a false sense of superiority; at worst, students make poor personal decisions trying to match at a place near the top of the list rather than at a program where they may be better suited.

Understand that there's an end goal: eventually you apply for a job and start a real life, and there are no rankings for jobs. You have to talk to people and make your own determinations. No job is perfect, and you eventually weigh compensation, lifestyle, and location. No residency is perfect either, and you have to weigh case mix, program size, and your future goals to figure out where you will be happy and receive the type of training and opportunities you want. Instead of trying to change the methodology, just ignore the rankings; they are simply not meaningful or useful.
 
Rankings are another stupid way of trying to separate physicians from each other. I'm not saying that there aren't differences in programs. Some are definitely better than others. But can you really quantify the differences? I doubt it.
 