Med School Admissions Process = Load of Crap

PHECT said:
Ladies reading this thread- Question: Does Tuba playing/Tuba players flick your switch? If so, why?


In high school, I played percussion, and we happened to march right in front of the tubas in the Homecoming parade. The bastards didn't have their little music stands, so they taped their music to my back. So having had that experience, I'd say that no, tuba players do not flick my switch (unless they can play the Flight of the Bumblebee on tuba).

 
Yeah, I'm definitely speculating. That's why schools should do some research.

I think it is fine to want personable doctors. I'm just not sure that the interview helps select for that very much, except at the margins. Med school interviews evaluate interviewing ability, which has only a weak correlation with the things that are actually useful to measure.

As far as whether it's actuarial, you would probably know better than I would. Maybe this falls outside their domain. What I meant anyway was statistical analysis. I'm not even sure most schools take the basic step of seeing whether some interviewers are systematically more positive. And then you have the non-trads, who may have unusual backgrounds more susceptible to bias. I mean, be honest, don't you think some interviewers are more open to the law-to-med transition than others, regardless of how you present yourself?

Attractiveness, incidentally, is one of the more benign biases that can creep in. Interviews open the way to all sorts of really nasty stuff. For example, studies consistently show that lighter-skinned blacks and Hispanics are viewed more favorably, statistically speaking, than darker-skinned ones. This bias holds even for people who have the best of intentions and would never think of themselves as racists.
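
(To make that "basic step" concrete, here is a rough sketch in Python of the kind of check a school could run. The interviewer IDs, the 1-10 scale, and every rating below are invented for illustration; a real adcom dataset would obviously look different.)

```python
# Rough sketch: do some interviewers rate systematically higher than others?
# Interviewer IDs, the 1-10 scale, and all ratings below are made up.
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(0)
ratings = pd.DataFrame({
    "interviewer": ["A"] * 40 + ["B"] * 40 + ["C"] * 40,
    "score": np.concatenate([
        rng.normal(6.0, 1.5, 40),   # interviewer A
        rng.normal(6.2, 1.5, 40),   # interviewer B
        rng.normal(7.5, 1.5, 40),   # interviewer C: a systematically easier grader
    ]).round(1),
})

# The "basic step": per-interviewer means.
print(ratings.groupby("interviewer")["score"].agg(["mean", "std", "count"]))

# One-way ANOVA: could these mean scores plausibly all be the same?
groups = [g["score"].to_numpy() for _, g in ratings.groupby("interviewer")]
f_stat, p_value = stats.f_oneway(*groups)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")  # a small p suggests real differences
```

If one interviewer really does run a point and a half "hot," you would want to adjust for that before comparing applicants across interviewers.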

Law2Doc said:
You are speculating -- it's equally possible that those with high interviewing scores are really the precise folks the med schools want, and the schools are getting distracted by the numerical stats. There are enough Steve Jobs types in the world who are good because of their personal skills, not their technical skills, and there is certainly value to having folks with both in any industry. If the folks in charge of medicine wanted doctors who were just high numerical stats and science degrees, it would be easy enough to implement that. The fact that the profession has moved far away from that point with little resistance and no evidence (that I've seen) of a drop-off in quality suggests the new model has at least equal value.
As for giving credence to some actuarial analysis, I'm not so sure from the discussion that you know what an actuary does. I've worked with a few, and they tend to crunch numbers for insurance companies and the like, to estimate risk and turn it into a dollar figure -- not so much to turn subjective things like impressions from an interview into objective ones. An important but tedious job.
A big part of an interview is "hitting it off" with the person you are interviewing with, and that involves people skills which are not tied to a formulaic approach. It is not at all arbitrary, as a great interviewer will always come off great, and a poor one always poor. Not random -- it is based on a learnable and practicable skill. And not tied to pure verbal answers -- at the extreme, you could say the exact same words in two interviews and yet have one be good and one bad. The key is practice, and reading your audience. And it's a great skill to have when you have to deal with people as a professional, so there's validity in schools seeking out this skill.
 
good post j and good points raised. the attractiveness factor is an important subconscious consideration. many other things also factor in besides lightness of skin. and humans cant help it either even if they consciously try. the med school process does not seem in line with the scientific method at all, its much too whimsical. its a shame that a school wanting strong scientific minds engages in random selection of applicants. such behavior would never win grants.

as for steve jobs--people like jobs are one in a million. med schools cant base admissions on the premise that they are seeking jobs types. no matter what their weeding method, it will not produce a bevy of steves.
 
j8131 said:
Yeah, I'm definitely speculating. That's why schools should do some research.

The schools? There is actually a sizable and growing body of research relating to medical school admissions that stretches back for decades. Just keyword search PubMed and you'll see what I mean.

Shredder said:
its a shame that a school wanting strong scientific minds engages in random selection of applicants. such behavior would never win grants.

HAHAHAHAHHAHAAHA. What time of day your grant application is reviewed can dictate whether you get funded (review committees are more generous early in the day, and if your grant comes up right after lunch or at the very end you're much more likely to get screwed). There's far more whim, bias and political gaming in research grants than there is in the med school application process.
 
Been thinking a bit about this issue now. Here's my 2 cents:

The basic "problem" with med school admissions is that there are 2X as many applicants as there are seats (I'm limiting my thoughts to MD programs).

This means a few things:
-Schools have to narrow down their candidate pool dramatically
-There are probably a good many candidates who would be good doctors who still don't get in
-There are going to be a lot of pissed off, annoyed applicants, regardless of the system.


So, trying to understand things from the schools' perspective, what would you do:

Emphasize grades:
-Pros: Empirically based. Has potential to show long-term dedication (i.e. good grades over a 4 year span).
-Cons: Doesn't take into account cross-school differences in grading policies, difficulty, etc. Questions about validity of comparing candidates from different schools. Doesn't take into account other factors besides "book-smarts" that make a good doctor

Emphasize MCAT:
-Pros: Empirically based. Standardized
-Cons: Doesn't show long-term dedication. Doesn't account for naturally good test-takers' inherent advantage. Doesn't take into account other factors

Emphasize volunteering/clinical ECs:
-Pros: Has potential to show long-term dedication. Ensures candidates know what they're getting into (to an extent) via past exposure
-Cons: Doesn't measure intellectual capability. Anyone can put in requisite hours if they so desire

Emphasize Interviews:
-Pros: Gives a "human" side to the process. Allows evaluation of a number of factors at once.
-Cons: Doesn't account for nervousness. A minor interpersonal disagreement can potentially disqualify an otherwise superb candidate. Interviews are not standardized, so different interviewers' evaluations cannot reliably be compared.


Obviously all schools have tried to strike something of a balance between these areas of emphasis. But everyone does it a bit differently, and from the perspective of the applicant the whole thing seems very nebulous. I understand the frustration it causes, but I don't see any obvious method that's better.
 
Forget actuarial analysis. Med schools don't want the person "statistically most likely to score highest on X exam". That's why they have interviews: to have real people meet the applicant and see how they behave. If the interviewers don't like the applicant in an interview, then patients probably won't like the applicant when he or she is a doctor.
 
i think seeing students as investments and ranking them according to their financial prospects works. thats what actuaries do. human judgment is highly fallible. standardization and statistics are less fallible.
 
Havarti666 said:
I won't disagree with you there, but it's a tough business to parse out thousands of qualified applicants. Fortunately, the arbitrary nature of the admissions game is well known, so one just has to play along and take care of the checklist of items that will get you admitted:

- GPA
- MCAT
- Adequate volunteer work
- Adequate clinical exposure
- Adequate ECs
- Decent PS
- Positive LORs
- Positive interview
- Casting a wide net in terms of schools

Each of these is like a bodily organ. They're all important to some extent, and your application will likely die if you remove any of them. Sure it's crap, but if you just suck it up, do your best, and play along, you get in somewhere. Next thing you know the whole process is just a vague, bad memory.
:thumbup:

And let's face it folks, there is SO much crap in life that reduces to simple games.

Play the games so you can get at the good stuff in between them.
 
Shredder said:
i think seeing students as investments and ranking them according to their financial prospects works. thats what actuaries do.

No, I'm pretty sure from experience that that's not really what actuaries do. Actuaries compute premiums and reserves for insurance policies covering various risks, and otherwise quantify the probability of a loss event.
 
Shredder said:
its a shame that a school wanting strong scientific minds engages in random selection of applicants.

I'm not sure why you are locked in on the notion that medicine is a purely scientific discipline. It isn't, and that's why schools are trying hard to find candidates who aren't so one dimensional. That's why non-sci majors have been a huge part of every incoming med school class for the last decade or two. That's why med schools feel the need to limit themselves to very basic science prereqs. And that's why certain EC experiences tend to come across better than others.
In fact, once you get past the first two years of medicine, for the majority who do not do research, it is really an interpersonal, patient service type career. Once you get a routine down, you can go through weeks in many branches of medicine never utilizing more than basic science, but probably cannot get away from utilizing people skills. We all know many very astute science types who are socially inept. (The TV show Beauty and the Geek is totally premised on the prevalence of this trait). Is this really what the profession should seek? (Notwithstanding last year's Penn Med winner).
Thus schools want the whole package, and to the extent they can glean interpersonal prowess or potential from an interview, that's not a bad thing.
 
They should interview an applicant's friends. That way, you can better control for the nervousness factor. Also, a best friend may have better insight into a person's positive and negative attributes than the person themselves.

Of course, imagine the fight that would ensue after a candidate is rejected.
 
Law2Doc said:
I'm not sure why you are locked in on the notion that medicine is a purely scientific discipline. It isn't, and that's why schools are trying hard to find candidates who aren't so one dimensional. That's why non-sci majors have been a huge part of every incoming med school class for the last decade or two. That's why med schools feel the need to limit themselves to very basic science prereqs. And that's why certain EC experiences tend to come across better than others.
In fact, once you get past the first two years of medicine, for the majority who do not do research, it is really an interpersonal, patient service type career. Once you get a routine down, you can go through weeks in many branches of medicine never utilizing more than basic science, but probably cannot get away from utilizing people skills. We all know many very astute science types who are socially inept. (The TV show Beauty and the Geek is totally premised on the prevalence of this trait). Is this really what the profession should seek? (Notwithstanding last year's Penn Med winner).
Thus schools want the whole package, and to the extent they can glean interpersonal prowess or potential from an interview, that's not a bad thing.
i can never be unconvinced that med school admissions have gotten way too flaky. its generally agreed that theres a randomness factor/crapshoot and that doesnt seem like the way things should be. actuaries yes but its close enough, calculating this and that return. its more objective anyway, certainly not a crapshoot

as an economist i know that increasing the importance of any criteria must necessarily come at the cost of other criteria. theres no such thing as ppl who are "good at everything", ppl are always either better or worse than others. and i think the de-emphasis on academic prowess in favor of dubious subtleties is a disturbing trend in admissions. supposedly it all actually began around ww2, when jews were dominating wasps academically and thus non academic factors had to enter the admissions realm to keep things in check.

i can accept the system for what it is right now, but i cant accept arguments saying its a legitimate system. its bogus
 
Law2Doc said:
No, I'm pretty sure from experience that that's not really what actuaries do. Actuaries compute premiums and reserves for insurance policies covering various risks, and otherwise quantify the probability of a loss event.

I'm sure that they could compute the probability that a student is X% likely to fail out, pass the boards, have more medmal claims, etc., based upon both hard (GPA, MCAT) and soft (ECs, volunteering) factors.
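
(For illustration, a toy sketch of the kind of model that would be involved -- all of the data and the "true" coefficients below are invented, and "passed boards on the first attempt" is just a stand-in for whatever outcome you actually cared about.)

```python
# Toy sketch: predict a binary outcome (e.g. "passed boards on first attempt")
# from hard stats (GPA, MCAT) plus a crude "soft" score for ECs/volunteering.
# All data is synthetic; the "true" coefficients are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 500
gpa = rng.normal(3.6, 0.25, n).clip(2.5, 4.0)
mcat = rng.normal(30, 3, n).clip(20, 45)
soft = rng.normal(0, 1, n)                      # standardized EC/volunteering score

# Invented relationship used only to generate the fake outcomes.
logit = -20 + 2.5 * gpa + 0.35 * mcat + 0.3 * soft
passed = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([gpa, mcat, soft])
model = LogisticRegression().fit(X, passed)

# Predicted probability for a hypothetical applicant: 3.7 GPA, 32 MCAT, average ECs.
applicant = np.array([[3.7, 32, 0.0]])
print(f"Estimated P(pass) = {model.predict_proba(applicant)[0, 1]:.2f}")
```

Whether the soft factors add any real predictive power is exactly the sort of thing a school could check by comparing models fit with and without them.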
 
BrettBatchelor said:
I'm sure that they could compute the probability that a student is X% likely to fail out, pass the boards, have more medmal claims, etc., based upon both hard (GPA, MCAT) and soft (ECs, volunteering) factors.
But it takes eight years from the moment they are admitted into school to get to the point where they are going to have a board score, let alone malpractice claims. I don't know anyone who'd want to do or fund that sort of longitudinal study.
 
RxnMan said:
But it takes eight years from the moment they are admitted into school to get to the point where they are going to have a board score, let alone malpractice claims. I don't know anyone who'd want to do or fund that sort of longitudinal study.
I shall make it my life's work :laugh:
 
Shredder said:
i can accept the system for what it is right now, but i cant accept arguments saying its a legitimate system. its bogus
Okay, so what would you change, and how?
 
It is funny that some of you are unable to comprehend what Law2Doc was trying to say. While some aspects of medicine are undeniably scientific in nature, the actual delivery of medical care is very much mixed in with things that aren't so quantifiable.

Why would the NIH spend millions of dollars researching the behavioral and professional traits of medical students if those traits didn't matter? Note that by professional, they aren't talking about IQ, grades, or board scores.

Being an economist shouldn't exclude you from rational thinking, shredder.
 
Shredder said:
i think seeing students as investments and ranking them according to their financial prospects works. thats what actuaries do. human judgment is highly fallible. standardization and statistics are less fallible.
No, I know that at least at my school, they are much more interested in accepting students whom they feel will be leaders in the field some day. Leadership may extend to the private sector, to academia, or even to being the only doctor on the prairie, 1,000 miles from the nearest tertiary hospital. This is something not easily quantifiable.
 
RxnMan said:
But it takes eight years from the moment they are admitted into school to get to the point where they are going to have a board score, let alone malpractice claims. I don't know anyone who'd want to do or fund that sort of longitudinal study.

The nice thing is it wouldn't have to be longitudinal like a drug study (i.e., one that takes 20 years to get info on patients 20 years out), because the data is already available, even if not easily found. For instance, current practicing physicians have MCAT scores, board scores, and a successful malpractice claim rate. It's just a matter of getting all the info together on a sample. Now where it gets complicated is when the MCAT has been changed significantly, etc. That makes comparing the old data with the new kind of tricky.
 
RxnMan said:
But it takes eight years from the moment they are admitted into school to get to the point where they are going to have a board score, let alone malpractice claims. I don't know anyone who'd want to do or fund that sort of longitudinal study.
Now that I think about it, I have been a part of a longitudinal study for over 23 years called the CAP (Colorado Adoption Project). I retract my comment!

As stated elsewhere, the MCAT, board scores, etc., are out there, but finding them and following them up with subject interviews would still be very difficult. Getting consent would be challenging.
 
In response to a few of the posts above, I feel like several of you aren't understanding what I'm saying. I'm not arguing that medicine should be regarded as a pure scientific discipline in which only tests of academic ability matter for admission. What I'm saying is that I don't believe interviews succeed in selecting for non-academic traits (leadership, integrity, social skills, etc.) any better than using information available from applications. In fact, I think they may do a worse job, and I believe in-depth studies, if they don't already exist, would show that interviews are very unreliable and inaccurate measures of what they're supposed to measure.

No evidence shows that the interview is so vital to the process that it would be bad to do a long longitudinal study (nor would you need to wait 8 years before getting results). But at the least, schools should be doing their own analysis along several fronts. Perhaps this goes on, but I doubt it.

I suspect schools don't even do a rigorous analysis to see if some interviewers are consistently more positive about applicants than others. Or even if some interviewers recommend a significantly higher percentage than others do. But even if that goes on, it still doesn't show that the interviews are reliable or accurate.

By the way, many countries select docs entirely on the basis of test scores and that seems to work just fine. But I am not suggesting that. I am suggesting that it would probably be better to use the applications, possibly screening only for real outliers.
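
(As a rough illustration of that second check -- whether some interviewers recommend a significantly higher percentage of applicants than others -- here is a sketch using a standard chi-square test. The counts are invented; they are not from any real school.)

```python
# Rough sketch: do some interviewers "recommend accept" at a significantly
# different rate than others? The counts below are invented for illustration.
from scipy.stats import chi2_contingency

# rows = interviewers, columns = [recommended, not recommended]
counts = [
    [28, 12],   # interviewer A: 70% recommend
    [25, 15],   # interviewer B: 62% recommend
    [36,  4],   # interviewer C: 90% recommend
]

chi2, p_value, dof, expected = chi2_contingency(counts)
print(f"chi2 = {chi2:.2f}, p = {p_value:.4f}")  # a small p means the rates genuinely differ
```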
 
j8131 said:
In response to a few of the posts above, I feel like several of you aren't understanding what I'm saying. ...... I am suggesting that it would probably be better to use the applications, possibly screening only for real outliers.

I think we all understand what you are saying, just that some of us disagree. (A subsequent poster, Shredder, was the one who suggested that purely academic stats should be used). Assuming you accept the premise that purely academic credentials are not exclusively what is being sought, bear in mind that a ton of premeds are going to have vastly similar ECs. Shadowing, volunteering at hospitals, research, etc. Most can put together a decent PS with some advice and editing help. From what I've heard from adcom types, 90% of all PS are variations on about a dozen common themes, with little real originality. So there is little in the app that is going to convey the real person behind the canned premed credentials. So that really just leaves the interview as a basis for distinction, especially in terms of interpersonal skills. I see nothing random about it, and have seen a good correlation between folks with good personal skills and folks who do well at the interview stage (and the converse).
 
I don't know what you guys are talking about; schools are putting more emphasis on GPAs and MCAT scores than ever before. Just look at the matriculant data on the AAMC website from 1994 to 2005. Here is a summary:


1994 Average GPA Matriculant 3.48 +/- 0.34
2005 Average GPA Matriculant 3.63 +/- 0.28

1994 Average MCAT Matriculant Section Scores

VR 9.4 +/- 1.8
PS 9.4 +/- 2.0
BS 9.6 +/- 1.9
composite ~ 28.4

2005 Average MCAT Matriculant

VR 9.7 +/- 1.8
PS 10.1 +/- 1.9
BS 10.4 +/- 1.6
composite ~ 30.2

Can you believe it, stats ******? In the early nineties, people who would now qualify for DO school were going to MD school. Oh no, I guess all the MDs from those days are stupid doctors. As you can clearly see from the data presented, the schools are putting more emphasis on numbers than ever before, but there is one thing that numbers can't compensate for, and that is personality, and I've got a hell of a lot of it. So look at my numbers and weep; I am one of those "outliers" that actuaries should discredit and decide I am not a solid financial investment. Well, I hope this ends this nonsense talk. Numbers are more important than ever, and the averages are higher now than at any other time in the past. You high-numbers folks are just whining because you can't cut the mustard once it comes around to interview day, the day when your numbers don't matter and you have to sell yourself to the adcoms. Work a crappy sales job for a few years and you will learn this valuable skill if you're lacking it; I can promise you'll never learn it in Organic I or II.
 
Holistic said:
Can you believe it, stats ******? In the early nineties, people who would now qualify for DO school were going to MD school. Oh no, I guess all the MDs from those days are stupid doctors. As you can clearly see from the data presented, the schools are putting more emphasis on numbers than ever before.
But what were the averages back in 1994? Also, since the MCAT has been around so long, don't you think Kaplan/TPR/EK/etc. have all honed the art of teaching the MCAT? So the people who have time/money/both are able to increase their scores.

What would you do? It's a buyer's market. They have tons of high scores to choose from and people who will say "How high?" when they say jump.
 
I put the averages in my previous post prowler
 
Holistic said:
I put the averages in my previous post prowler

The question, I think, was what were the averages for all applicants.

Just as kids have grown taller and heavier than their parents, the current crop of applicants has better numbers than its predecessors.

A good interview can't overcome bad numbers (very few with bad numbers get interviewed, although it can happen by accident or as a courtesy to someone with connections to the school), but a bad interview can negate the value of good numbers.

What it comes down to, then, is: are the interviewers biased to favor good looks, biased against particular groups (Shredder's example of Jews years ago), etc., or are interviewers only ruling out the oddballs and those who don't really want to do medicine (one applicant went so far as to say he wanted to do X (a non-medical line of work) but everyone in his family is in medicine and so he had to apply!)?

When there are 40+ applications for every available spot, and half of all applicants in each round are unsuccessful at gaining admission anywhere, it will seem that there is something arbitrary about the system -- no matter what system is used.
 
Law2Doc said:
I think we all understand what you are saying, just that some of us disagree. (A subsequent poster, Shredder, was the one who suggested that purely academic stats should be used). Assuming you accept the premise that purely academic credentials are not exclusively what is being sought, bear in mind that a ton of premeds are going to have vastly similar ECs. Shadowing, volunteering at hospitals, research, etc. Most can put together a decent PS with some advice and editing help. From what I've heard from adcom types, 90% of all PS are variations on about a dozen common themes, with little real originality. So there is little in the app that is going to convey the real person behind the canned premed credentials. So that really just leaves the interview as a basis for distinction, especially in terms of interpersonal skills. I see nothing random about it, and have seen a good correlation between folks with good personal skills and folks who do well at the interview stage (and the converse).

But my whole argument is that this system doesn't work to do what it's supposed to do. So saying that the interview is the only thing left as the basis of distinction doesn't seem to address my point. Schools could rank applicants by the number of letters in the names of their hometowns, but I think we can all agree that that would not be a good way to assess the "real person".

Also, I don't agree about the apps being so similar. There are differences in extracurriculars, recommendations, essays, etc. There is, I think, far more meaningful differentiation there than in the interviews (as conducted).

As to your last point, I agree that people with really good personal skills do well and people with bad personal skills don't. But I notice that only in the extreme cases, and I think that's a reflection of the fact that the system works at the tails.
 
Adv Health Sci Educ Theory Pract. 2004;9(2):147-59.


Investigating the reliability of the medical school admissions interview.

Kreiter CD, Yin P, Solow C, Brennan RL.

University of Iowa, College of Medicine, Iowa City, IA 52242, USA. [email protected]

PURPOSE: Determining the valid and fair use of the interview for medical school admissions is contingent upon a demonstration of the reproducibility of interview scores. This study seeks to establish the generalizability of interview scores, first assessing the existing research evidence, and then analyzing data from a non-experimental independent replications research design. METHODS: Multivariate and univariate generalizability analyses are conducted using data from a structured interview obtained from a population of medical school applicants over two years. RESULTS: The existing literature does not provide sufficient evidence regarding interview reliability. In this study, interview scores derived from a standardized interview were found to display low to moderate levels of reliability. Interview scores do not appear to possess the level of precision found with other measures commonly used to facilitate admissions decisions. DISCUSSION/CONCLUSION: Given the results obtained, the fairness of using the interview as a highly influential component of the admission process is called into question. Methods for using interview data in a psychometrically defensible fashion are discussed. Specifically, attention to decision reliability provides guidance on how interview scores can best be integrated into the admissions process.
 
j8131 said:
Adv Health Sci Educ Theory Pract. 2004;9(2):147-59.

Investigating the reliability of the medical school admissions interview.

Kreiter CD, Yin P, Solow C, Brennan RL. ......

They can't be too opposed to interviews though, because two years later the same group published a proposal for how to make the interview process better. See:
Examining the influence of using same versus different questions on the reliability of the medical school preadmission interview.

Kreiter CD, Solow C, Brennan RL, Yin P, Ferguson K, Huebner K.

University of Iowa Carver College of Medicine, Iowa City, Iowa 52242-1000, USA. [email protected]

BACKGROUND: Researchers generally recommend a structured format for the medical school preadmission interview (MSPI). However, the relative benefits of various elements of structure remain unexamined. PURPOSE: In this study, we compared the performance of a highly structured interview format with a semistructured format. Specifically, we examined how the reliability of interview ratings is likely to change when using the same versus different questions for each applicant being interviewed. METHOD: Variance components from a generalizability (G) study of a structured interview are used in decision studies to compare the relative efficiency of using the same versus different questions for each applicant. RESULTS: Using different questions for each interviewee is practically as reliable as using the same questions for all applicants (G = .55 vs. .57, respectively). CONCLUSIONS: Because there are a number of drawbacks to using the same questions for all applicants (i.e., security and validity) and little advantage in terms of increased reliability, the semistructured question format should be considered when conducting the MSPI. A suggested method of implementing a semistructured interview is by presenting each applicant a set of questions randomly drawn from a pool of interview questions.

PMID: 16354132 [PubMed - in process]
 
j8131 said:
Investigating the reliability of the medical school admissions interview.

Kreiter CD, Yin P, Solow C, Brennan RL.

Interesting abstract. As of this year, however, it seems that Kreiter, Yin, Solow, Brennan and the gang aren't ready to give interviewing the old heave-ho. They're just attempting to refine it and make it more useful.

Teach Learn Med. 2006 Winter;18(1):4-8.

Examining the influence of using same versus different questions on the reliability of the medical school preadmission interview.

University of Iowa Carver College of Medicine, Iowa City, Iowa 52242-1000, USA. [email protected]

BACKGROUND: Researchers generally recommend a structured format for the medical school preadmission interview (MSPI). However, the relative benefits of various elements of structure remain unexamined. PURPOSE: In this study, we compared the performance of a highly structured interview format with a semistructured format. Specifically, we examined how the reliability of interview ratings is likely to change when using the same versus different questions for each applicant being interviewed. METHOD: Variance components from a generalizability (G) study of a structured interview are used in decision studies to compare the relative efficiency of using the same versus different questions for each applicant. RESULTS: Using different questions for each interviewee is practically as reliable as using the same questions for all applicants (G = .55 vs. .57, respectively). CONCLUSIONS: Because there are a number of drawbacks to using the same questions for all applicants (i.e., security and validity) and little advantage in terms of increased reliability, the semistructured question format should be considered when conducting the MSPI. A suggested method of implementing a semistructured interview is by presenting each applicant a set of questions randomly drawn from a pool of interview questions.
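
(For anyone wondering what a G coefficient of around .55 buys you, here is a back-of-the-envelope decision-study sketch. The variance components are invented so that a handful of questions lands near the papers' figure; they are not taken from the Kreiter et al. data.)

```python
# Hedged decision-study sketch: how a generalizability coefficient scales with
# the number of independently scored interview questions. The variance components
# below are invented to land near G ~= 0.55 for a handful of questions; they are
# NOT the values reported by Kreiter et al.
var_applicant = 0.55   # true applicant-to-applicant variance
var_residual = 2.25    # question/rater noise, per single scored question

def g_coefficient(n_questions: int) -> float:
    """Relative G coefficient for the mean of n independently scored questions."""
    return var_applicant / (var_applicant + var_residual / n_questions)

for n in (1, 3, 5, 10, 20):
    print(f"{n:>2} questions -> G = {g_coefficient(n):.2f}")
```

The point of a decision study is the curve itself: it tells you roughly how many independently scored questions or raters you would need to push the reliability of the interview score up to a given level.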
 
Havarti666 said:
Interesting abstract. As of this year, however, it seems that Kreiter, Yin, Solow, Brennan and the gang aren't ready to give interviewing the old heave-ho. They're just attempting to refine it and make it more useful.

Too slow. Got the cite up first. :thumbup:
 
j8131 said:
But my whole argument is that this system doesn't work to do what it's supposed to do.

How is it failing? People with exceptional numerical stats can get in if they submit a well-rounded application, even if they blow an interview or three. Hell, Shredder got in and he triumphantly possesses zero volunteering experience, something that would flambe an applicant with even marginally worse numbers.
 
Yeah, and to top those rising numbers off, you have to be ahead of those average 3.7 GPAs because supposedly medical schools are talking about "grade inflation" being a cause of the rise. So, yeah, if you believe strongly in the numbers you better have a 3.9+ GPA when applying. :p
 
It would be interesting to know how schools use interview information. First, is there more than one interviewer? This gives the adcom a chance to look at inter-rater reliability. Believe me, if the interviewers don't agree (particularly if one interviewer gives the kiss of death), then there are big questions about how to interpret the results.

Second: How fine is the grading that interviewers are asked to perform? Let's say that the interviewer(s) rate students as decline (only about 2% of those who interview get this designation), reservations about the applicant (~15%), acceptable (70-75%), or outstanding (8-13%). Applicants go into the interview rated average, above average, or outstanding based on their paper application (the below-average are not interviewed). Being rated outstanding on the interview is going to bump you up a peg; being rated "reservations" is going to bump you down a peg, and "decline" is going to push you right out of the pool.

I would think that there is more inter-rater reliability among interviewers when the designations are few and the real differences (bump up & drag down) are at the margins (for more than 70%, the interview neither hurts nor helps).
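
(If anyone wanted to put a number on that inter-rater agreement, one standard statistic is Cohen's kappa on paired ratings. A quick sketch with invented ratings on the coarse scale described above:)

```python
# Sketch: quantify agreement between two interviewers who each rate the same
# applicants on the coarse four-level scale described above. Ratings are invented.
from sklearn.metrics import cohen_kappa_score

levels = ["decline", "reservations", "acceptable", "outstanding"]

interviewer_1 = ["acceptable", "acceptable", "outstanding", "reservations",
                 "acceptable", "acceptable", "decline", "acceptable",
                 "outstanding", "acceptable"]
interviewer_2 = ["acceptable", "outstanding", "outstanding", "acceptable",
                 "acceptable", "reservations", "decline", "acceptable",
                 "acceptable", "acceptable"]

kappa = cohen_kappa_score(interviewer_1, interviewer_2, labels=levels)
print(f"Cohen's kappa = {kappa:.2f}")  # 1.0 = perfect agreement, ~0 = chance level
```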
 
I'd be interested to know how it's failing, as well.

Med school admissions is not a "crap shoot" as a lot of people might want to think. The interview counts for a lot, too, and I think a lot of people go into the interview believing that just because they have good numbers, the school is going to drop to their knees for them. I've interviewed people who were downright arrogant, some who did not have a clue about current issues in medicine, some who were dishonest about extra-curricular activities, and some who just acted like they really didn't want to be a doctor. How hard is it to pick up a newspaper and read up on hot topics in medicine that you might be asked about? How hard is it to have some manners during the interview and not act like you are doing the admissions committee a favor by applying? It's amazing to me that people work so hard to get to the interview, only to blow it off like it doesn't matter.

The process is not failing. In a lot of circumstances, it's the applicants who are failing themselves. I'm not saying that every person who didn't get in had any of the problems I described above, I'm just describing my personal experience with the process.
 
Law2Doc said:
I'm not sure why you are locked in on the notion that medicine is a purely scientific discipline.

I think Shredder is one of those folks whose entire ego rests on his/her scholastic prowess. When these people encounter situations they can't control with their GPA's, all manner of rationalization ensues. Med school rejection? Why, it's the product of a flaky admissions process conspiracy, of course. Nope, couldn't be anything else. I think this is why he once described being "furious" when he gets rejection letters.

It's also strange that he now refers to himself as an economist. To my (limited) knowledge, he possesses neither a degree in economics nor a job in the field of economics.
 
Havarti666 said:
I think Shredder is one of those folks whose entire ego rests on his/her scholastic prowess. When these people encounter situations they can't control with their GPA's, all manner of rationalization ensues. Med school rejection? Why, it's the product of a flaky admissions process conspiracy, of course. Nope, couldn't be anything else. I think this is why he once described being "furious" when he gets rejection letters.

It's also strange that he now refers to himself as an economist. To my (limited) knowledge, he possesses neither a degree in economics nor a job in the field of economics.
He is in his last semester of an econ degree.
 
BrettBatchelor said:
He is in his last semester of an econ degree.

My bad, then. I thought he was engineering. A double major, perhaps?
 
Shredder said:
if american business made decisions based on a random factor this country would be in shambles

Shredder said:
i think seeing students as investments and ranking them according to their financial prospects works. thats what actuaries do. human judgment is highly fallible. standardization and statistics are less fallible.

I don't know what Havarti is talking about, Shredder; you're clearly an economist. And I do have to take issue with your judgement on the relative value of statistics and people.

Human judgement and math are both useful in predicting the future (which is, after all, what adcoms are trying to do). Math is especially helpful when you are making a large number of predictions about well-defined outcomes (i.e., death, USMLE scores, income). Math is moderately helpful when you are making a small number of predictions about well-defined outcomes. However, if you can't define the outcome, math doesn't do you any good.

Take a poorly defined outcome like happiness, for starters. Measuring happiness can be done with tools that have fairly good reliability and reproducibility, but it is still questionable WHAT they are measuring. If you can't comfortably define happiness, you can't measure it. If you can't measure it, you can't do the math.

There are plenty of qualities that are even harder to pin down, which society would nonetheless like its doctors to have. I prefer to have those left to human judgement. And I'm talking about touchy-feely qualities, but not JUST touchy-feely qualities. Say you want to assemble a medical school class that has a chance of including a student who will revolutionize medicine. I think Duke's phrase is 'capable of brilliance'. I'd bet on human judgement over the math here as well.


And, for a different reason, statistics don't win in business either. The most successful businesspeople didn't get where they are by just doing what the actuaries told them to do. You can call it vision and leadership, or capriciousness and luck, but there is a lot more to it than the numbers.
 
Just for comparison's sake - how med admissions work abroad where my cousin is going to school:

1. You take a national test. 2. Each school has its own test which is part oral/part written/part practical. 3. They put both scores together and make a big list of applicants and pick from the top down until the class is full.

No ECs, No GPA, nothing.
Also 10x as many applicants to process.

Just for some perspective.
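
(For the curious, the selection step in a system like that is almost trivially simple -- something like the toy sketch below. Equal weighting of the two exams and the made-up scores are assumptions for illustration, not necessarily how any particular country does it.)

```python
# Toy sketch of a pure score-based admission: combine national + school exam
# scores, rank everyone, and fill the class from the top down.
applicants = [
    # (name, national_exam, school_exam)
    ("A", 88, 91),
    ("B", 95, 72),
    ("C", 90, 90),
    ("D", 70, 96),
    ("E", 85, 84),
]
class_size = 3

ranked = sorted(applicants, key=lambda a: a[1] + a[2], reverse=True)
admitted = ranked[:class_size]
print([name for name, *_ in admitted])   # -> ['C', 'A', 'E']
```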
 
dbhvt said:
I don't know what Havarti is talking about, Shredder, you're clearly an economist.

I guess it's just me, but back when Shred had his mdapplicants profile linked I could have sworn he was an engineering major. Indeed he does talk a lot of econ lingo, but I find his conclusions so naive that I assumed he had maxed out at some 200-level courses. Silly me.
 
Jaykms said:
Just for comparison's sake - how med admissions work abroad where my cousin is going to school:

1. You take a national test. 2. Each school has its own test which is part oral/part written/part practical. 3. They put both scores together and make a big list of applicants and pick from the top down until the class is full.

No ECs, No GPA, nothing.
Also 10x as many applicants to process.

Just for some perspective.
perspective would be valuable in this discussion--elaborate? that system would drastically reduce bureaucracy too. and bureaucracy leads to inefficiency and wasted resources, like docs and med students sitting around tables and yammering instead of making discoveries and eradicating diseases. yes my expressions are extreme--somebody on sdn has to do it. dont dissect everything i say down to the word, just take it as colloquialism

i switched from engr to econ this semester--good memory havarti. too good, use it for other things. im in more advanced classes now. the gist was that i think like an economist. i shouldve phrased it more clearly but i wasnt aware ppl would raise qualms about it.

dbhvt youre right that the big winners in business or anywhere dont win based on actuarial (or whatever) predictions. anybody who was that good at prediction would be on mount olympus. but predictions do work for predicting the vast majority of ppl, and thats usually what is being dealt with. no interviewers can spot a future bill gates, so its futile trying. its more likely to produce erroneous results on a large number of other people. so on and so on, maybe you get the idea. in short, most ppl even at harvard will not go on to be superstars, and no adcoms can change that no matter how much they want to.

i think everyone is just very effectively brainwashed into believing all of this hoopla. 100 yrs ago none of this rubbish existed but america was arguably on a much faster ascent than it is now. i think in terms of trends/derivatives--obviously, american healthcare is better now than it was in the distant past, just like the economy. but what about the rate of improvement? isnt that whats important?
 
Shredder said:
perspective would be valuable in this discussion--elaborate? that system would drastically reduce bureaucracy too.

So would a single-payer healthcare system, but it's just so, so... Un-American.

Shredder said:
and bureaucracy leads to inefficiency and wasted resources, like docs and med students sitting around tables and yammering instead of making discoveries and eradicating diseases.

Uh, admissions consumes a pretty minuscule portion of a given institution's resources. When you look at the lifetime investment that these institutions are making in their med students, I don't see how selecting them sight-unseen is desirable. If the number of applicants went from 2x seats to 10x seats, then there would be a serious need to further streamline the system. Mind you, all 125 domestic allopathic med schools already screen their applicants pre-interview, and this process is already heavily based on quantitative measures of academic achievement.

Shredder said:
dont dissect everything i say down to the word,

Unless you stop making sweeping statements that have no demonstrable basis in reality, that is going to be hard to do.

Shredder said:
just take it as colloquialism

So we shouldn't dissect your words because they are derived from a local or regional dialect?

Shredder said:
i switched from engr to econ this semester--good memory havarti. too good, use it for other things. im in more advanced classes now. the gist was that i think like an economist. i shouldve phrased it more clearly but i wasnt aware ppl would raise qualms about it.

No qualms, just a small discrepancy. Packing away details for later recall has been my life for the past 4.5 years. It's a tough habit to break.

Shredder said:
i think everyone is just very effectively brainwashed into believing all of this hoopla. 100 yrs ago none of this rubbish existed but america was arguably on a much faster ascent than it is now. i think in terms of trends/derivatives--obviously, american healthcare is better now than it was in the distant past, just like the economy.

There's that flaky conspiracy I mentioned earlier. Yes, 100 years ago none of this admissions hoopla existed, but that's because the number of candidates for college was so small that there was no need for an admissions process. People who wanted to go, and had the means, generally filled out some admittance paperwork and showed up for class. While we're on the topic of 100 years ago, every nascent industrialized economy grows like gangbusters. In the US, however, the life expectancy at birth was about 48 (60 if you made it to age 10!), and your average person could look forward to 12+ hours a day, six days a week of backbreaking agricultural or industrial labor from a young age. Thanks, but I'll pass.

Until the late 1910's, Harvard, Yale and Princeton had foreign language requirements that precluded almost everyone except for the products of elite Northeastern private schools. When they removed the language requirements the proportion of Jewish students increased dramatically, and so the ruling Protestant hierarchy concocted the precursor to the modern American application process: assessment of character in addition to scholastic aptitude. This way they could sustain their heavily legacy-based admissions without raising too much of a fuss. Ironically, the same tools they used have been employed ever since for very different purposes.

Shredder said:
but what about the rate of improvement? isnt that whats important?

Uh, healthcare advancement today is moving orders of magnitude faster than it ever has. The largest economic expansion in the nation's history in terms of percentage increase of per-capita GDP/GNP was after World War II, and it had nothing to do with the med school admissions process.
 
Jaykms said:
Just for comparison's sake - how med admissions work abroad where my cousin is going to school:

1. You take a national test. 2. Each school has its own test which is part oral/part written/part practical. 3. They put both scores together and make a big list of applicants and pick from the top down until the class is full.

No ECs, No GPA, nothing.
Also 10x as many applicants to process.

Just for some perspective.

It's a different debate, but it's pretty clear that the US doesn't want the same systems used elsewhere, so that doesn't add any real perspective. There are very few programs here where folks are funnelled straight through from high school to med school. And there are absolutely no med schools that determine their classes based strictly on numerical scores. Since that system would be simple to implement, at great savings to the school (having adcoms spend countless hours reviewing apps and interviewing isn't cheap), it's pretty evident that the US schools must feel they get better caliber med students doing things the way they do them. The fact that they spend a lot of additional money to do it this way is testament to the fact that they feel that the current system adds value over a numerical test based plan, be it in terms of diversity, maturity, personal skills, or whatever. Obviously the current system is striving to achieve something "better" than what occurs in other countries.
It would be interesting to know how foreign physicians rate in some of the intangible, interpersonal, teamwork and leadership aspects of medicine, having not been selected for these qualities. Based solely on admissions practices in this country, one must conclude that those selected by foreign methods are felt to be somehow lacking as compared to their US counterparts.
 
Holistic said:
I put the averages in my previous post prowler
I was asking for the averages of all MCAT takers. Right now, the average matriculant has a 30, but ten years ago, it was a 28. The average MCAT taker now has a 24, so what did the average MCAT taker get ten years ago? If it was a 22, then there's nothing surprising.
 
Once again, I agree with Law2Doc and Havarti666. Keep in mind, shredder, a hundred years ago you would not even have been looked at as an applicant, even if you had the greatest stats in the world. As Law2Doc eloquently stated, the current process is expensive and would have been switched to your proposed system (much cheaper, by the way) if schools thought it worked as well. The only problem is that adcoms know the system you proposed is not a very good one and are willing to spend the big bucks to keep doing what they are doing (at least until they find something better).
 