
  • Thread starter: deleted737701
Bruh, it's pointless to rank DO schools... where you match has very little to do with which DO school you attend. It has more to do with how hard you work, the connections you make/your likability, research, Step score (now P/F), and how you do on rotations.
 
This is worthless. MSU, OSU, TCOM (not even on there), and OUHCOM have resources that dwarf all of the schools in your top 10 outside of PCOM.
 
Agree with the above. Using specialty match rate and academic match rate is pointless since so many personal factors come into play in those decisions. I know plenty of classmates who ranked community programs above university ones. Overall match rate/placement rate is worth taking into account, as are resources, as Anatomy mentioned above.
 
Match rate generally isn't a good way to rank programs because so many different factors go into which residency a student pursues and where they pursue it. There are 260+ Step scorers going into family medicine, while others weigh residency location far more heavily than academic "prestige". Programs should instead be evaluated on the resources they offer their students. State DO schools are a league ahead of all the others because they have GME and research. Networking, pursuing research, and securing letters of recommendation in the field you are interested in are the biggest ways to set yourself apart from other applicants when scores are all equivalent. State DO schools can do this for their students while the others can't. This is why it's impossible to numerically rank DO schools: so many fall into the same category. You have state DO schools + PCOM >>>> KCU, DMU, CCOM > established DO schools >>>>>>>>>>>>>>>>>>>>>>>>>>>> New DO schools. I don't imagine you will ever get more stratified than this.
 
Nah homie. 50% of the weighting is going to something which has little variance between schools. Lots of issues with trying to quantify the match outcomes (academic/specialty) as well.

This data could be interesting to people, but the “rankings” part of it is just going to cause a flame war.

If you’re really tempted, go read up on the methodology used by US News and build from there.
 
Arbitrary: it relies on subjective choices in 1) how you weigh things and 2) which specialties are applied for, and it doesn't take a lot of other info into consideration. Also, using all available data (as opposed to only recent data) could skew things in either direction, since some places have faculty turnover, curriculum changes, the addition or removal of types of curricula, etc.
 