New article on publication productivity and the APPIC match


futureapppsy2

Assistant professor
Whilst searching for articles on publication productivity, I came across this new OnlineFirst article from Training and Education in Professional Psychology:

Publication Productivity of Professional Psychology Internship Applicants: An In-Depth Analysis of APPIC Survey Data.

http://psycnet.apa.org/psycinfo/2015-57073-001/

Points of potential interest:

-About half (44.2%-54.2%) of applicants in any given cycle have at least one publication, about a quarter to two-fifths have multiple publications, about 10% have more than five, about 2% have 10 or more, and less than 1% have 15+ publications.
-The percentage reporting any given number of publications dropped when the question started specifying "peer-reviewed" or "refereed" publications. This was also true in the transitional year when both questions were asked.
-About one-fifth to one-quarter of applicants had at least one book chapter.
-PhD students were considerably more likely to have published than PsyD students (68% vs. 17%).
-Students who had published at least one article had match rates about 10% higher than those without any peer-reviewed articles.
-Internship directors rated the importance of number of publications in interview invites and ranking at about a 2-2.5 on a five-point scale. About 13%-23% of internship directors rated it a 4 or 5 in importance.
-22.9% rated the importance of quality of publications as a four or a five.

This seemed SDN-relevant--any thoughts? It's always such a shame to me that APPIC won't release their full dataset for analysis.

 
It's really hard to make sense of it without the full dataset. It doesn't seem like it was an issue that came up at all during interviews for me, as no one asked about my research. That's an oddity if they are indeed considering it important. There are way too many confounds (PsyD students have lower research rates and lower match rates, for instance), so I would want to dig into it more.
 
It's really hard to make sense of it without the full dataset. It doesn't seem like it was an issue that came up at all during interviews for me, as no one asked about my research. That's an oddity if they are indeed considering it important. There are way too many confounds (PsyD students have lower research rates and lower match rates, for instance), so I would want to dig into it more.
It actually makes perfect sense to me that only a subset of about 15%-20% of training directors would care a lot about publication number, because when half or more of applicants don't have any publications and only 10% have 5 or more, I think those applicants are going to self-select to sites that really value publication productivity (AMCs, more research-y VAs, etc). Most clinically-oriented sites aren't going to value that as much and may attract fewer applicants with academic goals. Plus, you have the fact that only about 40% of your applicants have multiple publications, so for a large portion of the applicant pool, culling by number of publications won't be that useful in narrowing down applicants, simply because most will have none or just one or two.

Other studies have found that publication number significantly predicts match rate even when other variables are controlled for, but those have their own limitations (all clinical PhD students, self-selected sample with higher than average publication numbers, etc).
 
It's really hard to make sense of it without the full dataset. It doesn't seem like it was an issue that came up at all during interviews for me, as no one asked about my research. That's an oddity if they are indeed considering it important. There are way too many confounds (PsyD students have lower research rates and lower match rates, for instance), so I would want to dig into it more.

Not sure why it is hard to make sense of this information.

Granted it has its limitations, but it provides data.

My takeaway is that I will tell all of my students to publish as much as they can feasibly accomplish, and let the chips fall where they may. I would tell them that for some sites it will matter more, and for some sites it will matter less. Regardless, it will be more likely for a TD to say "this applicant does not have sufficient publication numbers to warrant an interview" versus "this applicant has too many publications to warrant an interview."
 
Not sure why it is hard to make sense of this information.

Granted it has its limitations, but it provides data.

My takeaway is that I will tell all of my students to publish as much as they can feasibly accomplish, and let the chips fall where they may. I would tell them that for some sites it will matter more, and for some sites it will matter less. Regardless, it will be more likely for a TD to say "this applicant does not have sufficient publication numbers to warrant an interview" versus "this applicant has too many publications to warrant an interview."
It might not even be the TD, it could be the folks in charge of a track. If I see an app with zero posters or pubs, it goes in my "don't interview" pile.
 
It's really hard to make sense of it without the full dataset. It doesn't seem like it was an issue that came up at all during interviews for me, as no one asked about my research. That's an oddity if they are indeed considering it important. There are way too many confounds (PsyD students have lower research rates and lower match rates, for instance), so I would want to dig into it more.

As someone on the other side, though--there's not enough time in interviews to ask about everything. The things that are already clear from the interviewee's application and CV (e.g., number of supervision hours and publication numbers) rarely get asked about, even when they are very important to us. They still play a role in our decisions.
 
My takeaway is that I will tell all of my students to publish as much as they can feasibly accomplish, and let the chips fall where they may. I would tell them that for some sites it will matter more, and for some sites it will matter less. Regardless, it will be more likely for a TD to say "this applicant does not have sufficient publication numbers to warrant an interview" versus "this applicant has too many publications to warrant an interview."

That's not the take-away to me.

You're thinking publications -> better internship match.

I'm seeing this as a third-variable issue. People who go to programs in which they publish a lot are going to funded PhD programs. By and large, data (EPPP, match rates) demonstrate that those programs also provide superior clinical training. So, the data are suggesting that by and large people who go to strong training programs have strong match results.
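The third-variable argument above can be sketched with a toy simulation. All the rates below are made-up illustrative numbers, not estimates from the article: program quality drives both publishing and matching, publications have no direct effect on the match draw, and yet published applicants still show a higher marginal match rate.

```python
import random

random.seed(0)

# Toy model of confounding: "strong program" raises both the chance of
# publishing and the chance of matching; publications themselves never
# enter the match draw. All probabilities are illustrative assumptions.
N = 100_000
matched_pub = total_pub = 0
matched_nopub = total_nopub = 0

for _ in range(N):
    strong_program = random.random() < 0.5          # half attend strong programs
    # Strong programs make publishing far more likely...
    has_pub = random.random() < (0.7 if strong_program else 0.2)
    # ...and independently raise the match rate.
    matched = random.random() < (0.9 if strong_program else 0.6)
    if has_pub:
        total_pub += 1
        matched_pub += matched
    else:
        total_nopub += 1
        matched_nopub += matched

rate_pub = matched_pub / total_pub
rate_nopub = matched_nopub / total_nopub
print(f"match rate with pubs:    {rate_pub:.3f}")
print(f"match rate without pubs: {rate_nopub:.3f}")
# A sizable gap appears even though publications had no causal effect.
```

With these hypothetical numbers the marginal gap comes out on the order of 10-15 percentage points, comparable to the raw difference reported in the article, without any direct effect of publishing at all.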
 
As someone on the other side, though--there's not enough time in interviews to ask about everything. The things that are already clear from the interviewee's application and CV (e.g., number of supervision hours and publication numbers) rarely get asked about, even when they are very important to us. They still play a role in our decisions.

That's a really good point. I'm at a site with a pretty research-focused internship and I never asked how many publications someone had. That was an important factor when deciding who to interview, but by the interview stage itself it's not important anymore--if they didn't have enough pubs, they likely wouldn't be interviewing.
 
That's a really good point. I'm at a site with a pretty research-focused internship and I never asked how many publications someone had. That was an important factor when deciding who to interview, but by the interview stage itself it's not important anymore--if they didn't have enough pubs, they likely wouldn't be interviewing.
I think this is most likely at research heavy sites--besides, there's no point in asking an interviewee how many publications they have when you have that data beforehand on their CV. It would make more sense to ask them about specific publications, their research in general, how their research fits into your site, etc.

I agree with @madeincanada that more publications (assuming that they aren't in pay-to-publish outlets or whatever) are either going to be neutral or help, and I would encourage students who are interested in AMC internships especially to try to be productive. Actually, I think having had the experience of publishing is an important part of learning about research, so I think it's something that all students would benefit from being exposed to.

I'm surprised that no one else has commented on the decrease in publication numbers when the item asked about "peer-reviewed publications" specifically. I would have thought that was a given but apparently not.
 
I'm surprised that no one else has commented on the decrease in publication numbers when the item asked about "peer-reviewed publications" specifically. I would have thought that was a given but apparently not.
Chapters, and increasingly alternative media outlets, are low-hanging fruit that faculty often delegate to students. I don't really give much thought to non-peer-reviewed stuff.
 
Chapters, and increasingly alternative media outlets, are low-hanging fruit that faculty often delegate to students. I don't really give much thought to non-peer-reviewed stuff.
Yeah, that's kind of why I'm surprised that so many people did. I have a few professional newsletter articles, book reviews/commentaries, encyclopedia entries, etc., but I always assume that when people talk about "publications," they mean peer-reviewed stuff only.
 
Yeah, that's kind of why I'm surprised that so many people did. I have a few professional newsletter articles, book reviews/commentaries, encyclopedia entries, etc., but I always assume that when people talk about "publications," they mean peer-reviewed stuff only.
It's broad, just like "scholarship" is broad :)
 
That's not the take-away to me.

You're thinking publications -> better internship match.

I'm seeing this as a third-variable issue. People who go to programs in which they publish a lot are going to funded PhD programs. By and large, data (EPPP, match rates) demonstrate that those programs also provide superior clinical training. So, the data are suggesting that by and large people who go to strong training programs have strong match results.

No. I'm thinking publications -> better chance of getting internship interviews (at some level >0.01%).
 
It might not even be the TD, it could be the folks in charge of a track. If I see an app with zero posters or pubs, it goes in my "don't interview" pile.

Same. It is even more of a red flag if that person says they want to then pursue a fellowship (neuro, rehab, health, forensic, primary care), because it shows they don't have a good grasp on what is needed to be competitive.
 
I'm saying

better program ---> better chance of interviews
               ---> more publications

i.e., the solution for someone who goes to a program with a 10% match rate is not, "get 15 pubs."
On the other hand, wouldn't it behoove someone from a program with a lower match rate to be the best candidate possible (publish, get a good number of hours and integrated reports, etc.)? Even FSPSs produce competitive applicants.
 
Tough to make too much of the data, but pretty much fits with my perspectives. Wouldn't expect publication history to be "very important" for any but AMCs and larger VAs, which are usually seen as desirable but ultimately a small subset of existing internships. As others have stated, distribution of the variables also affects things. For most places it is probably a nice bonus, but not necessary. Very few internships will have expectations of 5+ pubs.

I think the truth is likely some combination of what MCParent and MadeInCanada are discussing above. Program clearly matters. I do think it's even more important for folks at programs with historically low match rates to go above and beyond in proving themselves across the board--so a better publication record may be one way to partially counteract a "program effect." The onus shouldn't be on the student to mitigate their program's poor quality, but those who choose to attend such programs should try to make the best of it. Being a stand-out student at a lousy school still probably places you in a worse position than an average candidate at an average school, based on what we know of the match data. However, it is assuredly better than being a poor-to-average student at a lousy school. That said, it would also be extremely difficult to publish at a poor program, since the faculty usually aren't productive and such programs lack the research infrastructure to produce at that rate. Fifteen publications is more than many faculty members at FSPSs will have.
 
As someone on the other side, though--there's not enough time in interviews to ask about everything. The things that are already clear from the interviewee's application and CV (e.g., number of supervision hours and publication numbers) rarely get asked about, even when they are very important to us. They still play a role in our decisions.
That's true for sure. Interviews are far too short.

It actually makes perfect sense to me that only a subset of about 15%-20% of training directors would care a lot about publication number, because when half or more of applicants don't have any publications and only 10% have 5 or more, I think those applicants are going to self-select to sites that really value publication productivity (AMCs, more research-y VAs, etc). Most clinically-oriented sites aren't going to value that as much and may attract fewer applicants with academic goals. Plus, you have the fact that only about 40% of your applicants have multiple publications, so for a large portion of the applicant pool, culling by number of publications won't be that useful in narrowing down applicants, simply because most will have none or just one or two.

Other studies have found that publication number significantly predicts match rate even when other variables are controlled for, but those have their own limitations (all clinical PhD students, self-selected sample with higher than average publication numbers, etc).
It makes sense for the number of sites that stress it, although I was surprised at the number of sites (both research-intensive VAs and AMCs, as well as others who align with a scholar-practitioner model) that did not inquire at all about research. I don't think a single site I interviewed at asked any questions about research products, process, etc. Talking with other applicants, this seemed pretty commonplace, as most folks who did the interviewing seemed to have very little idea of who they were talking to (seemingly not having read the CV, letters, or anything else). For those sites, if research is a factor worth considering in the selection process, you would think they would want to ask about it (at least once or in some capacity). That seems not to be the norm from what I could tell.

Not sure why it is hard to make sense of this information.

Granted it has its limitations, but it provides data.

My takeaway is that I will tell all of my students to publish as much as they can feasibly accomplish, and let the chips fall where they may. I would tell them that for some sites it will matter more, and for some sites it will matter less. Regardless, it will be more likely for a TD to say "this applicant does not have sufficient publication numbers to warrant an interview" versus "this applicant has too many publications to warrant an interview."
I'm not entirely certain that the latter is rare, though. I suspect it's quite the opposite. I've found research is viewed as a zero-sum issue with regard to clinical skills versus research skills. I'm skeptical, if not fully doubtful, of this, but it seems as though that is a pervasive undercurrent of training philosophy in many areas. This seems to build on the assumption that hours result in an equal and proportional amount of training.
 
I don't think a single site I interviewed at asked any questions about research products, process, etc.

I suspect that because internship is structured as a clinical year, it is less of a focus. I think of it like GREs or GPAs during grad school apps--many treat it as a way to cut down on apps, not necessarily select an applicant.

Talking with other applicants, this seemed pretty common place as most folks that did the interviewing seemed to have very little idea of who they were talking to (having not seemingly have read the C.V., letters, or anything else).
I think it is somewhat site-dependent, as I was at a site where the faculty selection committee knew the apps well because they went through a multiple-reading and selection process, but other faculty joined in later, so it was a crapshoot. We didn't interview a zillion applicants because we tended to match very well, so that helped.
 
I definitely know of sites (AMCs, VAs) that ask about research during the interview, so they do exist. Those may be in the small subsample that really value publication productivity and quality, though.

One thing to keep in mind is that there's a huge range between zero publications and 15-20+ publications, and people with 15-20+ publications are very rare overall (according to the above article, <1% of internship applicants; .1-.2% for 20+). So, while an FSPS student (or a vast majority of non-FSPS students, tbh) may never be able to get 15 or 25 publications (or even 5) by internship application time, they may be able to get 1 or 2 and that may help distinguish them somewhat. Like Ollie said, they will still probably be at a notable disadvantage coming from an FSPS, but making an effort to mitigate by having a strong(er) CV is only going to help.
 
I (well, my library) still don't have access to the full article, so I haven't seen it all. Anyone want to post it? ;)

I'm seeing this as a third-variable issue. People who go to programs in which they publish a lot are going to funded PhD programs. By and large, data (EPPP, match rates) demonstrate that those programs also provide superior clinical training. So, the data are suggesting that by and large people who go to strong training programs have strong match results.
Or perhaps applicants who publish are more likely to be all-around higher-quality applicants (clinical work, letters of rec, written materials, etc.), independent of program.

I would think it's a combination of program and individual characteristics. Of course, this could be answered empirically.
 
Or perhaps applicants who publish are more likely to be all-around higher-quality applicants (clinical work, letters of rec, written materials, etc.), independent of program.

Ding ding ding! This is more what I see. The applicants with pubs and presentations who are at the top of our list also, by and large, have a higher-than-average number of clinical hours, both assessment and intervention. I think for many it's more a mark of a fairly high-functioning, all-around strong student.
 
Or perhaps applicants who publish are more likely to be all-around higher-quality applicants (clinical work, letters of rec, written materials, etc.), independent of program.
Clearly, though I wouldn't call it "independent of program." Someone who applies to grad school laser-focused on research is going to be applying to a specific type of program.
Basically--a system and individual thing with multiple levels.
 
I (well, my library) still don't have access to the full article, so I haven't seen it all. Anyone want to post it? ;)


Or perhaps applicants who publish are more likely to be all-around higher-quality applicants (clinical work, letters of rec, written materials, etc.), independent of program.

I would think it's a combination of program and individual characteristics. Of course, this could be answered empirically.
If you google it, you can go through researchgate and request it from the authors.
 
Clearly, though I wouldn't call it "independent of program." Someone who applies to grad school laser-focused on research is going to be applying to a specific type of program. Basically--a system and individual thing with multiple levels.

The authors of the study seem to share this interpretation:
"Alternatively, applicants may self-select to some degree, with more productive applicants choosing to apply to sites that put a greater emphasis on publications. Thus, these sites may be more able to differentiate between applicants based on number and quality of publications and not just the mere presence or absence of publications when selecting and ranking interviewees."

Another tidbit that I found interesting:

"Thus, it appears that PsyD applicants, who make up approximately two-fifths to half of internship applicants (APPIC 2010a, 2011a, 2014a), may considerably deflate the percentage of total applicants who have published, especially those who have published multiple journal articles. This suggests that the commonly made statement that “zero is the modal number of publications for internship applicants,” although true, may not hold as much weight for internship applicants from PhD programs."

I guess I'll stop quoting the article now, lest I run into some copyright issue. ;)
 
"Thus, it appears that PsyD applicants, who make up approximately two-fifths to half of internship applicants (APPIC 2010a, 2011a, 2014a), may considerably deflate the percentage of total applicants who have published, especially those who have published multiple journal articles. This suggests that the commonly made statement that “zero is the modal number of publications for internship applicants,” although true, may not hold as much weight for internship applicants from PhD programs."
I think this is an important point. Granted, even among PhD internship applicants the mode is 1, but it reinforces that publishing is actually somewhat the norm in PhD programs (as it should be--I think it provides important insight into how research is critiqued and disseminated).
 
Some labs churn out the papers. In my opinion, a lot of graduate school productivity is heavily dependent on the publication environment of the lab. In many labs, regardless of student quality, they aren't going to be publishing ten papers.

I know a few graduate students who have actively sought out collaborators from other universities (both other students and faculty) and gotten quite a few publications from that. Lab and program can definitely be an important factor, but it's possible to go beyond even that.
 
I know a few graduate students who have actively sought out collaborators from other universities (both other students and faculty) and gotten quite a few publications from that. Lab and program can definitely be an important factor, but it's possible to go beyond even that.

This is why attendance and participation at conferences, and involvement with student organizations within specialty areas, are both necessary aspects of training. Sometimes collaboration doesn't occur until after grad school, but making connections can help down the road.
 
I know a few graduate students who have actively sought out collaborators from other universities (both other students and faculty) and gotten quite a few publications from that. Lab and program can definitely be an important factor, but it's possible to go beyond even that.
Agreed 100%. My lab isn't productive and any pub that comes out of it requires me to organize and write it. That's fine; I enjoy it - for the most part. However, it is incredibly time-consuming, and it is so nice to collaborate, so making connections at other institutions has been a godsend for me. I wouldn't have most of my pubs if it weren't for seeking out collaborators outside of my lab.
 