On research and its utility for medical school applicants


NimbleNavigator

Why is research a de facto requirement for medical school and residency? Most MDs and DOs are just going to be clinicians anyway. In that capacity, all that really matters is the ability to read and comprehend research, not the ability to actually carry it out. And even those skills don't have to be terribly well-developed because, unless you're an academic physician, you don't have the time to read primary literature anyway. Don't most MDs and DOs just go to yearly conferences or take courses for CME credit to brush up on the most recent findings relevant to their field?

Furthermore, in my opinion, research is often used as a vehicle for nepotism in the medical field. Show me a physician's kid and I'll show you a student who has at least co-authored a paper by the end of his second year. That rings even truer if the physician is in some way affiliated with academia.

Thanks for letting me vent. Sometimes I'm afraid that my brain will explode if I think too hard about these things without venting a little.

 
Hardly de facto, except at academic centers or research-heavy schools like UTSW, where the reasons for research should be obvious.

That being said, like degree creep, once one person does it, everyone else feels like they need to do it too, and so the cycle is perpetuated and festers.
 
That makes more sense...

Speaking of degree creep, I feel like we're witnessing a gap year creep.
 
It's not. Research is only a requirement for research powerhouses (i.e. Top 20) and major academic medical centers (usually those in competitive specialties).
I see. I guess I was under the wrong impression this entire time! Oh well.
 
"Requirement" question aside, being able to comprehend and think critically about research - in other words, not only understand the words and figures on the page but have the ability to judge the quality, strength, generalizability and limitations of published research - is not an easy skill to develop. It requires some level of knowledge into the research enterprise best gained through experience, mentorship that values the development of this skill set (namely, from senior scientists who actually care about being good mentors), and independent thought (ideally through coursework grounded in discussion of primary literature or through participation in journal clubs in addition to the work associated with planning your own project).

Developing this kind of critical thinking is a lifelong endeavor. I've benefited greatly from mentors who care about this sort of thing and have even observed people far and above my senior in research be criticized by colleagues (constructively) for shortcomings in this department.

For those reasons, I would argue some level of meaningful research experience (not just in the "hard" or basic sciences) is beneficial to anyone hoping to go into anything that even hopes to aspire to being called "evidence based". Of course, I don't see any reason why this should only happen during undergrad. It should be developed throughout one's training and indeed long after it; hence, "lifelong learning".
 

Agreed. You should get way more out of research than just "doing research."
 
It never hurts to get involved in research if you have the time, especially in med school.
 

I think the best time to develop research skills is in medical school itself and beyond, because that is when evidence-based medicine is explored and studied. This is why medical students at top-20 schools who didn't have prior research experience still pursue research opportunities to expand their knowledge and skill set.

I don't see the point of requiring research before applying to medical school (outside the top 20). Research should be pursued only if applicants are truly interested in it.
 
"Requirement" question aside, being able to comprehend and think critically about research - in other words, not only understand the words and figures on the page but have the ability to judge the quality, strength, generalizability and limitations of published research - is not an easy skill to develop.

But why should clinicians be the ones to do this? Why not train a separate class of people (MD/PhDs, maybe) to do this and just let the clinicians be clinicians?
 

This deserves a much more detailed response, but I'll give my short answer: you can't be a good clinician, not in the sense that your patients deserve, without playing some part, however small, in scholarship, even if all that means is reading the journals and professional society bulletins discussing treatment options and outcomes for your niche.

If you are suggesting a dedicated scholar class of physicians who judge all the work and then pass down recommendations from on high, well, that is only a couple of degrees more extreme than what we have now, let alone some decades ago. It's not a positive thing: clinical practice should be evidence based, and that evidence should be open to criticism and review by all members of the community, especially those who need the information most (clinicians) in order to serve their only purpose (giving the best care possible to their patients). Otherwise, physicians would be nothing more than glorified technicians, and it's a short distance from that point to the legitimate question: "why are these technicians so expensive, anyway?" And so on.

The skills I am describing don't require a PhD, although they are a part of PhD training. They just require an effort to understand the scientific process. I don't think that is too much to ask.
 
I think the best time to develop research skills is in medical school itself and beyond, because that is when evidence-based medicine is explored and studied. This is why medical students at top-20 schools who didn't have prior research experience still pursue research opportunities to expand their knowledge and skill set.

I don't see the point of requiring research before applying to medical school (outside the top 20). Research should be pursued only if applicants are truly interested in it.

I disagree. I think precisely the skills I describe ought to be developed in undergrad, because they are useful not just for scientists but for life. Particularly civic life. The development of a public better equipped to think critically about the world, decisions, policies, media, interpersonal relationships, and so on is the only reason for higher education to exist; indeed, it is necessary for a functioning participatory democracy, if we mean to have one. The scientific method, broadly applied, can help develop this. All of that other stuff about the economy and jobs and marketable skills is secondary to that primary aim, which has, admittedly, been grievously and savagely dismantled in this country to a great degree.
 
So is research normally just working on whatever project a PI wants done? Or do they expect some sort of more independent research? In that case, am I supposed to do posters? I've been having a lot of fun doing research. Is it a waste of time for non-top-20 schools?
 
I disagree. I think precisely the skills I describe ought to be developed in undergrad, because they are useful not just for scientists but for life. Particularly civic life. The development of a public better equipped to think critically about the world, decisions, policies, media, interpersonal relationships, and so on is the only reason for higher education to exist; indeed, it is necessary for a functioning participatory democracy, if we mean to have one. The scientific method, broadly applied, can help develop this. All of that other stuff about the economy and jobs and marketable skills is secondary to that primary aim, which has, admittedly, been grievously and savagely dismantled in this country to a great degree.

Those are good reasons why applicants/students should pursue research, although not necessarily at the undergrad level (these skills can be developed after graduation and in the workforce). I don't see the point of medical schools requiring research for admission, though.
 
Exactly. Undergraduate research is impossible. We know nothing. I can be a lab tech's assistant, but that's about it. I've heard that there are scholarship routes for me to actually do an independent project. I don't know how real that is. I can't imagine being able to do research anywhere near the level of real researchers.
 
Amen, brother. Amen.

I really don't get how some people publish first-author papers in undergrad.
 
In undergrad? It only REALLY matters if you're going for an MD/PhD. And, apparently, for top-20 schools (I never really knew). But I didn't do it to "beef up" my application. I did it because I loved it and wanted to publish once I got into it. If I took a gap year between college and med school, I'd work in a research lab at a medical school.

In medical school, however, it's a pretty big thing if you want to go into a (very) competitive field, along with high Step scores, good grades, excellent LORs, etc.
 
Those are good reasons why applicants/students should pursue research, although not necessarily at the undergrad level (these skills can be developed after graduation and in the workforce). I don't see the point of medical schools requiring research for admission, though.

That's just it: it's not a requirement. The questions you ask and how good you are at research don't matter. What matters is learning about the scientific process. It is almost generically true that whenever someone says, with regard to education, that something is useless, it is actually a tremendously important opportunity for genuine freedom and creativity. This is nothing like research in medical school, which is ideally aligned with your clinical interests in order to help you match into an academic or competitive residency.

So: get involved in research you find interesting, learn, and do your best to develop enough to maybe ask your own question. Write a thesis if your department allows it. None of it is for medical school (although I'm sure adcoms find some of the traits it develops desirable, they are by no means required of everyone); it is for *you*.
 
Exactly. Undergraduate research is impossible. We know nothing. I can be a lab tech's assistant, but that's about it. I've heard that there are scholarship routes for me to actually do an independent project. I don't know how real that is. I can't imagine being able to do research anywhere near the level of real researchers.

"Impossible" is a big exaggeration. Sure, undergrads aren't going to be at the level of graduate students or even techs who are out of college (and they aren't expected to be), but they can still conduct valid research. Most of the time they'll be working under the guidance of the PI, not formulating their own projects anyway.
 
Alright, so I'm doing it right then? I've been doing sample prep and data collection tasks for a project. That's all that's expected of me for research?
 

If you let it be, then it will be. First you want to gain the best technical competence that you can (which is what you're doing). Eventually, however, your scientific curiosity should drive you to want to take on additional responsibilities and really push the project forward. Usually this means reading up on the current literature in your area of research and seeing if you can apply any of the experiments being conducted to answer an interesting question pertaining to the project you're helping out on.
 
Why is research a de facto requirement for medical school and residency? Most MDs and DOs are just going to be clinicians anyway. In that capacity, all that really matters is the ability to read and comprehend research, not the ability to actually carry it out. And even those skills don't have to be terribly well-developed because, unless you're an academic physician, you don't have the time to read primary literature anyway. Don't most MDs and DOs just go to yearly conferences or take courses for CME credit to brush up on the most recent findings relevant to their field?

Furthermore, in my opinion, research is often used as a vehicle for nepotism in the medical field. Show me a physician's kid and I'll show you a student who has at least co-authored a paper by the end of his second year. That rings even truer if the physician is in some way affiliated with academia.

Thanks for letting me vent. Sometimes I'm afraid that my brain will explode if I think too hard about these things without venting a little.

@Officer Farva, is that you?
 
Exactly. Undergraduate research is impossible. We know nothing. I can be a lab tech's assistant, but that's about it. I've heard that there are scholarship routes for me to actually do an independent project. I don't know how real that is. I can't imagine being able to do research anywhere near the level of real researchers.

Undergraduate research is not impossible. You can be a research assistant and if you have the right mindset, can even head your own project. Here's where your reasoning is wrong. A first-year graduate student is expected to do research while taking classes too. A first-year graduate student knows very little more than a senior undergraduate. If that senior undergraduate has taken a few graduate courses, he/she likely has more than enough knowledge to pursue research. How do we "professional" researchers appear to have so much knowledge? Experience and reading the literature. This knowledge doesn't appear overnight - you must cultivate it. And if we tell you not to do something because it won't work, it's because we have had experience doing that thing before and are trying to help you avoid the pitfalls, not because we come from some sort of research promised land.

"Impossible" is a big exaggeration. Sure, undergrads aren't going to be at the level of graduate students or even techs who are out of college (and they aren't expected to be), but they can still conduct valid research. Most of the time they'll be working under the guidance of the PI, not formulating their own projects anyway.

Undergraduate students can very well be at the level of first- and second-year graduate students. It depends on how the undergraduate approaches the research - is it another box to check off or is it an interesting question to pursue? I have had undergraduates who go on to be just as productive and independent as first- and second-year graduate students. Are they expected to be? No. But could they? Yes.

Also, many graduate students don't even formulate their own projects. It's a shortcoming of academia - PIs want the research done and so they pawn it off on their graduate students. If the graduate student doesn't speak up, he/she gets stuck doing the work the PI wants done and not the work he/she comes up with. Many graduate students haven't had the opportunity to come up with a project by themselves.

Alright, so I'm doing it right then? I've been doing sample prep and data collection tasks for a project. That's all that's expected of me for research?

What's expected of you from your mentor is dedication to the research and caring about the results. This is because research is a career for us, not a checkpoint before medical school. If you care about the project, the rest will follow.
 
Why is research a de facto requirement for medical school and residency? Most MDs and DOs are just going to be clinicians anyway. In that capacity, all that really matters is the ability to read and comprehend research, not the ability to actually carry it out. And even those skills don't have to be terribly well-developed because, unless you're an academic physician, you don't have the time to read primary literature anyway. Don't most MDs and DOs just go to yearly conferences or take courses for CME credit to brush up on the most recent findings relevant to their field?

Furthermore, in my opinion, research is often used as a vehicle for nepotism in the medical field. Show me a physician's kid and I'll show you a student who has at least co-authored a paper by the end of his second year. That rings even truer if the physician is in some way affiliated with academia.

Thanks for letting me vent. Sometimes I'm afraid that my brain will explode if I think too hard about these things without venting a little.

Physician's kid here. Not a single paper or poster to my name. :)
 
Undergraduate students can very well be at the level of first- and second-year graduate students. It depends on how the undergraduate approaches the research - is it another box to check off or is it an interesting question to pursue? I have had undergraduates who go on to be just as productive and independent as first- and second-year graduate students. Are they expected to be? No. But could they? Yes.

Yes, you are correct. I worded my post poorly by phrasing it as an absolute.


tskbdmnd said:
Alright, so I'm doing it right then? I've been doing sample prep and data collection tasks for a project. That's all that's expected of me for research?

Don't box yourself into a mentality of doing just what is expected of you. There's always more to do in research.
 
It's not. Research is only a requirement for research powerhouses (i.e. Top 20) and major academic medical centers (usually those in competitive specialties).

If you look at the MSAR, the vast majority of accepted applicants have research (even at schools that aren't traditionally research oriented). I picked two random schools with relatively modest NIH funding, but most others have similar statistics:

Drexel
Community Service (Medical): 87%
Community Service (Non-Medical): 73%
Shadowing: 77%
Research: 85%

Albany
Community Service (Medical): 86%
Community Service (Non-Medical): 73%
Shadowing: 72%
Research: 86%

Of course it isn't mandatory, but when the large majority of successful applicants have it, you aren't doing yourself any favors by completely forgoing it.
 

Research statistics are inflated. The MSAR uses data from the AAMC (including AMCAS application data), which means applicants are grossly exaggerating their research experiences to include menial tasks such as lab maintenance. Independent and productive research experience is difficult to acquire at the undergraduate level, so the research statistics are misleading.

Likewise, top-20 schools require research because they want students who will be future leaders of their fields, including top researchers doing cutting-edge work. Top-20 schools can back up that mission easily because they have very sizable funding to support research. The same cannot be said for many other schools.

Personally, I think it would be much better if all medical schools adopted a standard for assessing research quality similar to that used by MSTPs. Lab maintenance, a semester or two running basic experiments, and/or a few lab assignments really aren't valid research experiences. Understanding how science works by actually carrying out the scientific method - literature review, experimental design, data collection, data analysis and statistics, writing up results in presentable form, and so on - is the real key to a good research experience. And that's what it should be.
 
If you look at the MSAR, the vast majority of accepted applicants have research (even at schools that aren't traditionally research oriented). I picked two random schools with relatively modest NIH funding, but most others have similar statistics:

The problem with this statistic is that there is a difference between everyone having something and that thing being necessary. Everybody who gets into medical school has a nose, but that doesn't mean that if you chopped off your nose, you'd automatically be excluded from medical school. Similarly, while many people have "research" and get into medical school, that does not necessarily imply that research is necessary for medical school. To prove that, you would have to show that removing the research experience means you do not get into medical school.

Now, this of course doesn't preclude the possibility that research is required - there simply isn't enough data to conclude either way.
 
I'll quote the wise DrMidlife on research: “you've preferably had some exposure to research so you can be convinced that Andrew Wakefield used malicious dirtbag methods and is not the savior of the world's children.”

Other than that, your point?????

Hardly de facto, except at academic centers or research-heavy schools like UTSW, where the reasons for research should be obvious.

That being said, like degree creep, once one person does it, everyone else feels like they need to do it too, and so the cycle is perpetuated and festers.
 
The problem with this statistic is that there is a difference between everyone having something and that thing being necessary. Everybody who gets into medical school has a nose, but that doesn't mean that if you chopped off your nose, you'd automatically be excluded from medical school. Similarly, while many people have "research" and get into medical school, that does not necessarily imply that research is necessary for medical school. To prove that, you would have to show that removing the research experience means you do not get into medical school.

Now, this of course doesn't preclude the possibility that research is required - there simply isn't enough data to conclude either way.
Makes sense to me. I still feel like the correlation is a little too high for research not to be a factor, though. But I guess we'll never know.
 