Pros vs. Cons of PCSAS - Is it the VHS or Betamax of accreditation systems?


psych.meout

I'm a little verklempt. Talk amongst yourselves. I'll give you a topic: PCSAS, is it good or bad? Discuss.

(Let's have this be the PCSAS discussion thread, as discussed in the grad school debt thread.)

 
I'm thinking it is more like an eight-track tape, or could it be a vinyl LP, which still has a place amongst the true aficionados?
 
I'm a little verklempt. Talk amongst yourselves. I'll give you a topic: PCSAS, is it good or bad? Discuss.

(Let's have this be the PCSAS discussion thread, as discussed in the grad school debt thread.)

Well, the Delaware Conference was like 7-8 years ago or something. So, I am going to guess they are on the lazy river schedule at this point.
 
Are there any unfunded PCSAS grad programs?
 
IMO: it's a stupid idea promulgated by people who think the field's problems can be fixed by manipulating the supply of providers. Say your training is better than mine. Does that change how insurance pays? How many patients want to see you? Or the dirty secret in psychotherapy: does this have anything to do with patient outcomes?

Last time our field tried similar crap it nearly took the entire profession out of the healthcare field.
 
IMO: it's a stupid idea promulgated by people who think the field's problems can be fixed by manipulating the supply of providers. Say your training is better than mine. Does that change how insurance pays? How many patients want to see you? Or the dirty secret in psychotherapy: does this have anything to do with patient outcomes?

Last time our field tried similar crap it nearly took the entire profession out of the healthcare field.
It's poor supply-side thinking at best, for sure. There is a lot to like about PCSAS, but it is limited to programs housed in psychology departments (thereby largely ignoring Division 17, which accounts for a major share of the psychologists being trained), and its primary goal is accepting programs that make research the primary outcome for their students, with clinical practice second (again, the majority of trained psychologists do not go into research). So even if this did all the things you mention (which it doesn't), PCSAS excludes itself by design from being useful nationally.

It also doesn't use outcomes (EPPP, match rates, etc.) as a requirement, as I recall. It artificially attempts to control which programs can apply based on for-profit status or the department housing the program.
 
The general theme of the problems I have with PCSAS (I'm looking at you here, McFall) is the inconsistencies involved in their arguments.

They have great disdain for PsyD programs and their students, because PsyD programs are not grounded enough in science. Not entirely a baseless argument, depending on the specific program (e.g., Rutgers), and I'm certainly not one to shy away from criticizing poor-quality and expensive PsyD and PhD programs. The problem with this is that one of their criticisms of the APA is that it doesn't accredit master's-level counseling programs. There are lots of reasons why the APA doesn't do this (I'm not taking a position on this issue either way here), but is it really logically consistent to argue against PsyD programs for their dearth of science training while also advocating in favor of master's-level counselors? Again, I'm not arguing against having master's-level providers, but I would bet good money that the average PsyD program offers more science education and training than the average counseling program. This line of argument just doesn't make sense.

Relatedly, one of the accusations they hurl at the APA, deserved or not, is that it is practicing "gatekeeping" by not accrediting master's-level counseling programs. This isn't necessarily an invalid argument on its own, but it's fairly clear that PCSAS looks down on any psychologist who would want to exclusively do clinical work of any kind. It's also obvious that their formulation of PCSAS is to turn doctoral-level psychologists into primarily researchers, with clinical work as a tertiary role at best, well behind consulting and other roles that don't involve patient care. To them, it's beneath the profession to not be an active researcher. It kinda sounds like they want to do a bit of gatekeeping of their own, vis-à-vis making PCSAS programs the only legitimate doctoral programs in clinical psych and training almost exclusively researchers.

This condescension towards the quality of training received at other kinds of doctoral programs kind of falls flat when you see the training offered at some of their own programs. One in particular has an interesting perspective on didactic classwork. To meet the APA course requirements for certain classes beyond the mere four or so courses required by the program, their solution is to have students go through a selected reading list that supposedly gives them a comprehensive look at the topic. How do they assess competency after this reading assignment? Literally a single, one-on-one conversation with a faculty member. There's no rubric or other objective measure of their knowledge or mastery. I'm not sure if this is unique to this particular PCSAS program or if it's more pervasive across the system, but it's still weird and slightly alarming. At the very least, I don't think a program that follows these practices has much ground upon which to criticize anyone else's didactic training.

I'm not saying that PCSAS has no arguments, but rather that they seem to lack the self-reflection required to examine their own arguments with a critical lens.
 
They're idiots. To paraphrase a US president, it's about the money.

1) the apa is the publisher for the majority of psychology journals?

2) most psychologists are clinicians.

3) for a bunch of scientists they sure aren’t showing a lot of data for their proposed method of teaching. Member the studies about educational level of clinician and psychotherapy outcome? Cause I do. And they might want to do some splaining.

4) they’re internally inconsistent. Show me their training which conforms to their proposed method.

5) the field struggles because there's not enough money. Money for salaries. Money for grants. Etc. If every psychologist pulled in only $250k, we could raise membership dues to $2k easily, which would drastically change our political clout.


Overall this seems to be another instance where psychology has some money problems and turns around to attack its own while threatening to take its ball home and not play. We did this stupid crap with Carl Rogers and almost got completely kicked out of insurance. We threatened the APA (psychiatry) that we would create our own DSM-5 because we didn't like it, and they called our bluff, which is just as well because it would have made insurance a no-go. How stupid do we have to be?
 
As an aside, I've been amazed at how little research is conducted on training consistency in general for the field. I assume it's because no one wants to see the answer to the question of how effective training is.
 
3) for a bunch of scientists they sure aren’t showing a lot of data for their proposed method of teaching. Member the studies about educational level of clinician and psychotherapy outcome? Cause I do. And they might want to do some splaining.
Sources, please?

We've discussed quite a few of these in my first year courses in grad school, but I'm always looking for more.
 
3) for a bunch of scientists they sure aren’t showing a lot of data for their proposed method of teaching. Member the studies about educational level of clinician and psychotherapy outcome? Cause I do. And they might want to do some splaining.

4) they’re internally inconsistent. Show me their training which conforms to their proposed method.
Some recent data on PCSAS

73% of PCSAS Graduates Practice
53% of PCSAS Graduates are Federally Funded Investigators
 
Some recent data on PCSAS

73% of PCSAS Graduates Practice
53% of PCSAS Graduates are Federally Funded Investigators


This data doesn't do much to support them. The stated purpose of the PCSAS is to improve the QUALITY of training for psychologists functioning as clinical scientists, relative to APA.

If the issue is improving quality, then shouldn't someone maaaybe use a between groups design? For example, why not take that 53% of their graduates who received federal funding for research and compare it with the 76% of graduates from APA-approved Clin PhD programs who are Co-I or better on grants and contracts?

Then the essential questions become: What is the curriculum? What is the empirical basis for defining this curriculum as of higher quality? What is the empirical evidence that shows training in this curriculum improves outcomes? And accrediting using the academic equivalent of "Yeah, I agree with that guy" is not sufficient.

Showing their graduates have improved treatment outcomes would have been a home run for that purpose. Comparing the mean impact factor of their graduates' publication history to APA graduates would mean something.
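For what it's worth, here is a rough sketch of the kind of between-groups comparison being suggested: a pooled two-proportion z-test on a binary outcome such as "graduate is a federally funded investigator." The cohort sizes below are invented purely for illustration, since the raw counts aren't given here, so treat it as a sketch of the analysis rather than a result.

```python
# A rough sketch (not a real analysis) of the between-groups comparison
# suggested above: a pooled two-proportion z-test on a binary outcome such as
# "graduate is a federally funded investigator."
# The cohort sizes below are invented purely for illustration.
from math import sqrt
from scipy.stats import norm

n_pcsas, n_apa = 400, 3000          # hypothetical graduate cohort sizes
x_pcsas = round(0.53 * n_pcsas)     # 53% of PCSAS grads meet the outcome
x_apa = round(0.76 * n_apa)         # 76% of APA clinical PhD grads meet it

p1, p2 = x_pcsas / n_pcsas, x_apa / n_apa
p_pool = (x_pcsas + x_apa) / (n_pcsas + n_apa)   # pooled rate under the null
se = sqrt(p_pool * (1 - p_pool) * (1 / n_pcsas + 1 / n_apa))
z = (p1 - p2) / se
p_value = 2 * norm.sf(abs(z))       # two-sided p-value

print(f"z = {z:.2f}, two-sided p = {p_value:.3g}")
```

Of course, as discussed later in the thread, the two percentages are not measuring the same outcome, so a comparison like this only means something once both groups are scored on the same definition.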
 
This data doesn't do much to support them. The stated purpose of the PCSAS is to improve the QUALITY of training for psychologists functioning as clinical scientists, relative to APA.

If the issue is improving quality, then shouldn't someone maaaybe use a between groups design? For example, why not take that 53% of their graduates who received federal funding for research and compare it with the 76% of graduates from APA-approved Clin PhD programs who are Co-I or better on grants and contracts?

Then the essential questions become: What is the curriculum? What is the empirical basis for defining this curriculum as of higher quality? What is the empirical evidence that shows training in this curriculum improves outcomes? And accrediting using the academic equivalent of "Yeah, I agree with that guy" is not sufficient.

Showing their graduates have improved treatment outcomes would have been a home run for that purpose. Comparing the mean impact factor of their graduates' publication history to APA graduates would mean something.
For all their talk about empiricism and research, I don't think there is much of a basis for their curriculum being better.

As I wrote two years ago:
This condescension towards the quality of training received at other kinds of doctoral programs kind of falls flat when you see the training offered at some of their own programs. One in particular has an interesting perspective on didactic classwork. To meet the APA course requirements for certain classes beyond the mere four or so courses required by the program, their solution is to have students go through a selected reading list that supposedly gives them a comprehensive look at the topic. How do they assess competency after this reading assignment? Literally a single, one-on-one conversation with a faculty member. There's no rubric or other objective measure of their knowledge or mastery. I'm not sure if this is unique to this particular PCSAS program or if it's more pervasive across the system, but it's still weird and slightly alarming. At the very least, I don't think a program that follows these practices has much ground upon which to criticize anyone else's didactic training.
 
For all their talk about empiricism and research, I don't think there is much of a basis for their curriculum being better.

What I personally can't wait for are the stories of APA-accredited counseling psychology students taking classes from PCSAS-accredited clinical programs and being told that they are inferior by design.
 
For a profession, at least for most programs, that professes to train their students in empirical methods, psychology seems to have gone the way of every other healthcare profession in continuing to be terrible at outcome research when implementing new procedures.
 
For a profession, at least for most programs, that professes to train their students in empirical methods, psychology seems to have gone the way of every other healthcare profession in continuing to be terrible at outcome research when implementing new procedures.
For PCSAS, it seems that they're being intentionally selective about using empirical methods and outcomes. In the aforementioned presentation given by PCSAS advocates, they were quick to provide stats for PsyD programs being inferior (e.g., debt, match rates, EPPP and licensure rates), comparative psychotherapy outcomes of master's-level providers vs. psychologists, and so on. When it came to the audience questioning the quality of their coursework and didactics, they were light on the details and stats.
 
This data doesn't do much to support them. The stated purpose of the PCSAS is to improve the QUALITY of training for psychologists functioning as clinical scientists, relative to APA.
Wow, so much hate :) (in case it isn't clear, I am joking around).

The thread was about lack of evidence; I posted some evidence. This is not conclusive evidence, but at least it is a start. As the link explains, these data simply support the external validity of PCSAS (and, to some extent, its internal validity). If anything, this is the logical first step (and also the lowest-hanging fruit).

If the issue is improving quality, then shouldn't someone maaaybe use a between groups design?
Great idea. A study to do in the future.

For example, why not take that 53% of their graduates who received federal funding for research and compare it with the 76% of graduates from APA-approved Clin PhD programs who are Co-I or better on grants and contracts?
I am actually not sure what this is saying. Compare the grads with federal funding with what/who? I am not sure what the 76% means (are 76% of APA grads Co-I or better?).

Then the essential questions become: What is the curriculum? What is the empirical basis for defining this curriculum as of higher quality? What is the empirical evidence that shows training in this curriculum improves outcomes?
Very true. Of course, these are interesting questions to address.

And accrediting using the academic equivalent of "Yeah, I agree with that guy" is not sufficient.
Isn't that how all accreditation systems work? Perhaps you mean that creating an alternate accreditation system should require more than that. Well, then I still support PCSAS, since it represents a different philosophical position compared to APA (a narrower one).

Showing their graduates have improved treatment outcomes would have been a home run for that purpose.
Of course, it would be great to do some between-groups designs. Of course, it would be great to see clinical outcomes (though this would all be correlational and very difficult to do).

Comparing the mean impact factor of their graduates' publication history to APA graduates would mean something.
Another great idea.

To really compare PCSAS grads to APA grads, we first need to separate the clinical science programs out of APA. Then compare the training outcomes (e.g., internship match, EPPP pass rates, research productivity, funding, job positions). All this will take time.
 
For a profession, at least for most programs, that professes to train their students in empirical methods, psychology seems to have gone the way of every other healthcare profession in continuing to be terrible at outcome research when implementing new procedures.
FWIW, there are different grades of terrible. Psychology is perhaps on the better end of terrible compared to many other fields.
 
For all their talk about empiricism and research, I don't think there is much of a basis for their curriculum being better.
"Better" is a tricky term. How about more supportive of science? We have all discussed the lax dissertation standards that many APA accredited programs allow. I wouldn't be surprised if there is an APA accredited program that is allowing their students to train in TFT or some other pseudoscience. For me, this is a travesty.
 
What I personally can't wait for are the stories of APA-accredited counseling psychology students taking classes from PCSAS-accredited clinical programs and being told that they are inferior by design.
I don't think accreditation has anything to do with people wanting to feel superior to others.
 
To be clear, I have no stake in PCSAS. I believe (and I understand that beliefs aren't worth much without evidence) that the field of clinical/counseling psychology needs to be science-oriented. APA accreditation is far too lax and waters down the entire field. I don't care what your clinical perspective may be, but the best manner we have to elucidate what works (and what doesn't) is through scientific means, despite their limitations.
 
FWIW, there are different grades of terrible. Psychology is perhaps on the better end of terrible compared to many other fields.

Wholeheartedly agree, but I think we are presented with a lot of easy opportunities to improve on this and pass them by without a second thought. I have no real beef with PCSAS; if anything, I think APA accreditation is too low of a bar, with too many Byzantine regulations that have nothing to do with quality training anymore. But we should all be advocating for good outcomes tracking.
 
@DynamicDidactic If PCSAS isn't based upon empirical data, this is just episode 2987 of the narcissistic exercise of "my dad can beat up your dad". I find it particularly egregious that the people claiming to be better scientists use non-standardized assessment methods, curricula, and outcome data. Anyone who believes that peer assessment inter-rater reliability of "quality" is accurate needs to watch "America's Got Talent".

It took me about 3 minutes to separate out APA data, and I didn't graduate from one of these superior schools. The 76% figure I quoted was directly sourced from the APA accreditation data of clinical psych PhD graduates. I thought this was an analog for the clinical scientist model. APA uses a category with something like "initial employment," and a brief glance at the definition of that category shows something like "research = Co-I or better for research funded through grants or contracts."

I'm wholeheartedly for improvement in the field. Simple statements that X does it better is no improvement.
 
@DynamicDidactic If PCSAS isn't based upon empirical data, this is just episode 2987 of the narcissistic exercise of "my dad can beat up your dad". I find it particularly egregious that the people claiming to be better scientists use non-standardized assessment methods, curricula, and outcome data. Anyone who believes that peer assessment inter-rater reliability of "quality" is accurate needs to watch "America's Got Talent".

It took me about 3 minutes to separate out APA data, and I didn't graduate from one of these superior schools. The 76% figure I quoted was directly sourced from the APA accreditation data of clinical psych PhD graduates. I thought this was an analog for the clinical scientist model. APA uses a category with something like "initial employment," and a brief glance at the definition of that category shows something like "research = Co-I or better for research funded through grants or contracts."

I'm wholeheartedly for improvement in the field. Simple statements that X does it better is no improvement.


Agreed, PCSAS makes a fair argument for the reasons reform might be needed. It goes off the rails with what it is suggesting. It also chooses to try and put the 800 lb bull of professional psychology back in the pen after it has been free for decades. I don't see how that is going to happen; they have all the money and more graduates.
 
Agreed, PCSAS makes a fair argument for the reasons reform might be needed. It goes off the rails with what it is suggesting. It also chooses to try and put the 800 lb bull of professional psychology back in the pen after it has been free for decades. I don't see how that is going to happen; they have all the money and more graduates.

PCSAS states they are of higher scientific quality. Their curriculum isn’t empirically based. Their assessment methods are not scientifically sound. Their outcome data doesn’t show superiority over standard education. I fail to see any area in which they are superior, outside of their own self assessment.
 
PCSAS states they are of higher scientific quality. Their curriculum isn’t empirically based. Their assessment methods are not scientifically sound. Their outcome data doesn’t show superiority over standard education. I fail to see any area in which they are superior, outside of their own self assessment.

I don't think they are of any higher scientific quality. The only part of the argument I agree with is that APA has not done the best job of ensuring uniform quality among programs. Beyond that, they seem to just have an agenda where they want everyone outside of clinical research programs to be master's-level clinicians, with no basis for that reasoning and no real means of accomplishing their goals.
 
If the issue is improving quality, then shouldn't someone maaaybe use a between groups design? For example, why not take that 53% of their graduates who received federal funding for research and compare it with the 76% of graduates from APA-approved Clin PhD programs who are Co-I or better on grants and contracts?

Whoa -- 76% of grads are Co-I or better? Where's the source on this? Unless I'm misunderstanding, that seems to contradict everything I've ever heard about graduate outcome data. I've had training directors cite numbers like ~30% of grads pursuing a research career and ~70% pursuing clinical work, but I've never looked at the data myself. Can anyone provide a reference for this stuff?

From visiting the PCSAS "By the Numbers" page I came across this: PCSAS By the Numbers

97% Pass
So PCSAS graduates can be licensed in a growing number of states, but are they qualified? How do they do on state licensing exams? The latest numbers from the Association of State and Provincial Psychology Boards (ASPPB), the group that created the national licensing exam, is that 97% of PCSAS graduates pass the national licensing exam wherever they take it. The comparable figure for the entire population of students accredited by the APA or the Canadian Psychological Association or designated by ASPPB is 81%. (That 81% includes PCSAS graduates. Take them out and the number is even lower.) Similarly, PCSAS graduates do better on every subtest of the national exam.

98% Match
What about PCSAS students getting internships? Internships also are a critical part of clinical science training. According to the most recent 6-year data on internship placements by the Association of Psychology Postdoctoral and Internship Centers (APPIC), PCSAS students have an internship “match” rate of well over 90% - up to 98% depending on definitional terms – compared to under 80% for non-PCSAS students.

Those seem like meaningful statistics regarding the relative quality of clinical training at PCSAS programs. That seems like an important place to start, no?
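As an aside on the parenthetical in the pass-rate blurb ("take them out and the number is even lower"): that is just a weighted-average calculation, and you can back out the implied non-PCSAS rate once you assume cohort sizes. The counts below are invented for illustration only; the real numbers of test-takers aren't given here.

```python
# Back-of-the-envelope version of "take them out and the number is even lower":
# if the 81% overall EPPP pass rate includes PCSAS graduates passing at 97%,
# the non-PCSAS rate follows from a weighted average.
# Both cohort sizes are assumptions for illustration; real counts aren't given here.
n_total, overall_rate = 10_000, 0.81   # hypothetical total test-takers
n_pcsas, pcsas_rate   = 600,    0.97   # hypothetical PCSAS test-takers

passers_total = overall_rate * n_total
passers_pcsas = pcsas_rate * n_pcsas
non_pcsas_rate = (passers_total - passers_pcsas) / (n_total - n_pcsas)

print(f"Implied non-PCSAS pass rate: {non_pcsas_rate:.1%}")  # ~80.0% with these numbers
```

How much lower the non-PCSAS number actually is depends entirely on how large the PCSAS cohort is relative to everyone else.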
 
Whoa -- 76% of grads are Co-I or better? Where's the source on this? Unless I'm misunderstanding, that seems to contradict everything I've ever heard about graduate outcome data. I've had training directors cite numbers like ~30% of grads pursuing a research career and ~70% pursuing clinical work, but I've never looked at the data myself. Can anyone provide a reference for this stuff?

I believe he's quoting data from people who self-identify as having gone into a research career in their initial employment. The vast majority of PhD grads are doing primarily clinical work.
 
@Sanman

I agree with your assessment of their initial argument. However, that entire introduction is a distraction. Classic marketing technique. Create a problem, offer the only potential solution. It's why infomercials start with, "Have you ever...." or "Are you tired of..." There are multiple solutions to a defined problem. The quality of the solution is not predicated upon the definition of the problem.

@beginner2011 I am partially wrong in that statistic. In my rush, I conflated the definition of "research" for faculty with the absence of such a definition for new PhD grads. My bad. The APA ARO 2018 data shows that 76% of graduates of APA-approved clinical PhD programs reported being engaged in research. There is no mention of Co-I or better, or of how the research was funded, for initial employment of graduates. Sorry about that. The general point remains: there are data sources that would allow between-groups comparisons to support PCSAS's assertion that they are of higher quality.

edit: The stats are not necessarily supportive of their position.

1) PCSAS is specifically for clinical PhDs. Not counseling PhDs, not school psych PhDs, not EdDs, not PsyDs (I guess there are types of PsyDs now?), etc. So it is troubling to say their clinical PhDs perform better on a test than the pool of counseling, school, PsyD, etc. graduates. No kidding. So do most clinical PhD programs. Presenting it that way is about as valid as me saying PCSAS is a bunch of dumb dumbs because they scored lower than University of Wyoming PhD graduates. Funny how the better scientists don't know that.... Then again, my program's pass rate and mean subject scores were higher than Harvard graduates'.

2) The covariate of reputation in APPIC match rates is not addressed. Arguably, reputation is easier to actually define than "quality".
 
@beginner2011 I am partially wrong in that statistic. In my rush, I conflated the definition of "research" for faculty with the absence of such a definition for new PhD grads. My bad. The APA ARO 2018 data shows that 76% of graduates of APA-approved clinical PhD programs reported being engaged in research. There is no mention of Co-I or better, or of how the research was funded, for initial employment of graduates. Sorry about that. The general point remains: there are data sources that would allow between-groups comparisons to support PCSAS's assertion that they are of higher quality.

It doesn't tell you much to do a between groups comparison using different outcomes, though. I think there's a big difference between being "engaged in research" and being a Co-I or better on a federally funded grant. Also, I wonder what that statistic would be if all programs who would qualify for PCSAS accreditation were withheld from the APA sample.
 
I don't think accreditation has anything to do with people wanting to feel superior to others.

In a previous thread you said:

I think PCSAS may eventually become the preferred accreditation for top tier academic medicine positions.

I may be just a lowly predoctoral intern, but I would think that preferential treatment of any kind would naturally result in feelings of superiority to the group who isn't receiving access to the same career opportunities because of where they chose to study. If it will be true that certain positions will become "PCSAS-only" positions, then it would be natural to feel pride/superiority to people who could have, but don't have the same opportunities. To say otherwise is to suggest that PCSAS plans to fill their doctoral programs with something other than humans.

However, that wasn't actually my point. My point was to reinforce what @PsyDr said about the curriculum. My experience in graduate school was that clinical/counseling programs took a lot of the same classes in the psychology core. I can easily imagine that scenario continuing regardless of whether the clinical program was accredited by PCSAS or not. To me, it seems laughable that counseling programs would be denied these opportunities because of some arbitrary divisions created to edge out the competition in so-called "top tier positions."

Like others, I totally support accreditation reform. I just don't think PCSAS is the answer.
 
It doesn't tell you much to do a between groups comparison using different outcomes, though. I think there's a big difference between being "engaged in research" and being a Co-I or better on a federally funded grant. Also, I wonder what that statistic would be if all programs who would qualify for PCSAS accreditation were withheld from the APA sample.


You're absolutely right. As scientists, the onus is on PCSAS to show support for their hypothesis. This includes defining the matter at hand (i.e., quality), showing the empirical basis of their educational system, and showing that this method of instruction improves defined outcomes. PCSAS has the benefit of being able to see ASPPB data, APA data, etc. They can absolutely choose to collect data in a manner that allows comparison. They're not doing any of these things. Why?

PCSAS are either bound by their own assertions, or they're claiming they are above their own assertions. The latter might be evidence that their self assessment is based upon nothing. It's amazing to me that they have gotten this far.
 
@Sanman

I agree with your assessment of their initial argument. However, that entire introduction is a distraction. Classic marketing technique. Create a problem, offer the only potential solution. It's why infomercials start with, "Have you ever...." or "Are you tired of..." There are multiple solutions to a defined problem. The quality of the solution is not predicated upon the definition of the problem.


I agree with everything you said. Your argument is that their solution to the problem has no merit because their statements are not based on scientific evidence, but on the same random conjecture they criticize. I buy that.

I am saying they also seem to be picking a fight they are not going to win. The best they can hope for is an even smaller group of programs that have even less of a say about what happens than the APA. What stops clinically focused PhD and PsyD programs together with APA from excluding PCSAS from things like clinical licensing or internship? They only have equivalency approved in 7 states and that is with minimal opposition. All that will do is screw their grads.
 
I'm wholeheartedly for improvement in the field. Simple statements that X does it better is no improvement.
To keep it narrow in scope: at least it improves training (which would hopefully improve the field). What is the alternative? APA CoA has not changed and continues to water down the role of science in a doctoral psychology degree. Change from the inside hasn't happened (is what old people tell me). I dipped my toe into APA recently and learned it is not for me.

Philosophically, I agree with PCSAS. I hope to see empirical data as more programs join PCSAS. Apparently, there is a manuscript in the works but I know very little about the content.
 
they seem to just have an agenda where they want everyone outside of clinical research programs to be master's-level clinicians, with no basis for that reasoning and no real means of accomplishing their goals.
I don't know if that is accurate; at the least it seems hyperbolic. I think PCSAS wants all programs to be scientifically rigorous.
 
To keep it narrow in scope: at least it improves training (which would hopefully improve the field). What is the alternative? APA CoA has not changed and continues to water down the role of science in a doctoral psychology degree. Change from the inside hasn't happened (is what old people tell me). I dipped my toe into APA recently and learned it is not for me.

Philosophically, I agree with PCSAS. I hope to see empirical data as more programs join PCSAS. Apparently, there is a manuscript in the works but I know very little about the content.

I find their philosophy borne of hypocrisy. If I thought they actually abided by their own principles, I would consider what they have to say.
 
1) PCSAS is specifically for clinical PhDs. Not counseling PhDs, not school psych PhDs, not EdDs, not PsyDs (I guess there are types of PsyDs now?), etc. So it is troubling to say their clinical PhDs perform better on a test than the pool of counseling, school, PsyD, etc. graduates. No kidding. So do most clinical PhD programs. Presenting it that way is about as valid as me saying PCSAS is a bunch of dumb dumbs because they scored lower than University of Wyoming PhD graduates. Funny how the better scientists don't know that.... Then again, my program's pass rate and mean subject scores were higher than Harvard graduates'.

2) The covariate of reputation in APPIC match rates is not addressed. Arguably, reputation is easier to actually define than "quality".
I keep racking my brain to understand what you are trying to say. When you compare "their PhDs" to "most clinical PhDs," you are conflating the same group of students. I think the goal of PCSAS is to capture most of the traditional clinical PhD programs. There are currently 43 programs, and I imagine this will continue to grow (probably to include University of Wyoming; my program [Midwest] is in the process of applying).

I think we all understand that Harvard is a poor example for this point.

I've agreed with you before that the counseling programs exception is weird, and it's also not totally clear if that is true.
 
I find their philosophy borne of hypocrisy. If I thought they actually abided by their own principles, I would consider what they have to say.
I hope that future empirical work will back up their goals. However, it is difficult to compare until a decent-sized group of programs has joined PCSAS, left APA, and been members for a little while.
 
Agreed, PCSAS makes a fair argument for the reasons reform might be needed. It goes off the rails with what it is suggesting. It also chooses to try and put the 800 lb bull of professional psychology back in the pen after it has been free for decades. I don't see how that is going to happen; they have all the money and more graduates.
They being APA?

If one was going to attempt to capture that bull, it seems education is the best place to start. That or suing poor clinicians.
 
I may be just a lowly predoctoral intern, but I would think that preferential treatment of any kind would naturally result in feelings of superiority to the group who isn't receiving access to the same career opportunities because of where they chose to study. If it will be true that certain positions will become "PCSAS-only" positions, then it would be natural to feel pride/superiority to people who could have, but don't have the same opportunities. To say otherwise is to suggest that PCSAS plans to fill their doctoral programs with something other than humans.
My argument is in reverse. Existing feelings of superiority lead to creating exclusive clubs (e.g., accrediting bodies, different types of degrees, licensing boards, licensure exams).
However, that wasn't actually my point. My point was to reinforce what @PsyDr said about the curriculum. My experience in graduate school was that clinical/counseling programs took a lot of the same classes in the psychology core. I can easily imagine that scenario continuing regardless of whether the clinical program was accredited by PCSAS or not. To me, it seems laughable that counseling programs would be denied these opportunities because of some arbitrary divisions created to edge out the competition in so-called "top tier positions."
Denied is a strong word. And I purposefully used the term "preferred" in my previous statement. I wonder if there are already divisions among different types of programs when it comes to positions. We know that about 50% of newly minted licensed psychologists come from PsyD programs. Do we see a similar ratio of PsyDs in top tier AMCs? How about clinical psychology PhDs in UCCs or are counseling psychologists more prevalent?

I am basing this on anecdotal observations and obviously all the degrees are represented in most settings. However, it seems like there is already a bias based on the degree type.
 
I find their philosophy borne of hypocrisy. If I thought they actually abided by their own principles, I would consider what they have to say.

What is the philosophy that you're referring to? As far as I can tell PCSAS is attempting to differentiate types of clinical training (e.g., university-based PhD training vs. for-profit PsyD training) via accrediting bodies under the assumption that PCSAS training standards will produce clinicians who are at least as competent as trainees from non-PCSAS programs. I know there's an implication that PCSAS trainees may be more fit for other positions, but I'm not seeing a specific philosophy outlined anywhere.
 
@DynamicDidactic hopefully they'll actually use an empirical basis to create their curriculum. And hopefully they will move away from inter-rater reliability of qualitative ratings from members of their club as evidence of superior quality.

@beginner2011 Their philosophy is essentially that they are better scientists. But then they don't back up anything with scientific principles. Like a TV preacher caught hiring a prostitute: they don't abide by their own rules and admonitions. The chain of claims in what is essentially their white paper runs:

1) Psychologists are not national leaders in mental health because of insufficient science training (empirical evidence: none; pragmatic evidence: none, and it especially ignores the entire head of NIMH).

2) Better science training (empirical evidence: zero empirical basis for such a curriculum; pragmatic evidence: if the curriculum isn't derived from an empirical basis, they're just making it up based upon personal experience, which isn't how science works).

3) Better science training will result in better creation of empirical research and the dissemination thereof (empirical evidence: none; pragmatic evidence: none).

4) We can provide "better" training (empirical evidence: none; pragmatic evidence: maybe, but if they can't define terms like "better" they're not exactly abiding by the scientific principles they claim to be better at).

5) Creating a new accreditation system is the way to go (empirical evidence: none; pragmatic evidence: none, with no explanation for how further fragmentation of psychology will increase the field's ability to become leaders).
 
They being APA?

If one was going to attempt to capture that bull, it seems education is the best place to start. That or suing poor clinicians.

They being other clinical psychology programs (especially professional PsyD programs). This is the equivalent of MD/PhDs saying that physicians are not empirical enough, that they will start accrediting medical schools that provide MD/PhDs, and that MD programs are not good enough. What stops the rest of the billion-dollar medical system from just ignoring them?
 
I find it sad that PCSAS exists. Sad as in, it makes me feel sad for our profession. As many have said here, there is no evidence that clinicians from PCSAS training programs have better therapy outcomes. If the PCSAS programs hope to convince the public that their graduates are better therapists, I think they have a long battle ahead of them. If they want to convince the majority of licensed psychologists that they are inferior clinicians, I would refer them to the story of the star-bellied sneetches.
 
Denied is a strong word. And I purposefully used the term "preferred" in my previous statement. I wonder if there are already divisions among different types of programs when it comes to positions. We know that about 50% of newly minted licensed psychologists come from PsyD programs. Do we see a similar ratio of PsyDs in top tier AMCs? How about clinical psychology PhDs in UCCs or are counseling psychologists more prevalent?

I am basing this on anecdotal observations and obviously all the degrees are represented in most settings. However, it seems like there is already a bias based on the degree type.

When I applied for internship, I got the advice to steer clear of programs that "preferred" clinical to counseling psychology students. Friends of mine heard about their applications being tossed out of desirable VAs because they were not the "preferred" type of psychology student. Drawing that distinction works semantically, but fails practically. If an abundance of "preferred" candidates exists for "x" spot, it will not matter if another candidate is just as competitive on other markers (pubs, grants, etc.). The representativeness heuristic plus time pressure would suggest that it's cognitively less taxing to simply go with what you know.

You have a bit of an is/ought problem here also. Applying that logic to gender and career choices, you might as well say: "Well, women are usually teachers and men are usually engineers, so it makes sense to create systems to codify what's already natural." Jordan Peterson might like that. But I think the rest of us would have problems. In this case, it might be true that counseling psychologists tend to go one way, PsyDs another way, and PhDs in clinical psychology yet another way, but it's far shakier ground logically to suggest that it should or ought to be that way. The danger is that saying something should be one way or another further silos psychology into groups of haves and have-nots, which, in my opinion from what I've seen thus far, is one of our field's bigger problems.

I completely agree that accreditation reform needs to happen. My hope is that PCSAS is the Ross Perot of accreditation systems, in that it serves as a wake-up call to the APA that they will no longer be respected as a scientific organization if they continue to downplay the importance of scientific training in doctoral programs.
 
I attended a balanced, fully funded PhD program which offers solid research training but mostly turns out clinicians. I worry about what PCSAS would do to these types of programs--they're probably not research-oriented enough to get accredited by them, but I also don't want them lumped in with less research-y PsyD programs.
 
@beginner2011 Their philosophy is essentially that they are better scientists. But then they don't back up anything with scientific principles. Like a TV preacher caught hiring a prostitute: they don't abide by their own rules and admonitions. The chain of claims in what is essentially their white paper runs:

1) Psychologists are not national leaders in mental health because of insufficient science training (empirical evidence: none; pragmatic evidence: none, and it especially ignores the entire head of NIMH).

2) Better science training (empirical evidence: zero empirical basis for such a curriculum; pragmatic evidence: if the curriculum isn't derived from an empirical basis, they're just making it up based upon personal experience, which isn't how science works).

3) Better science training will result in better creation of empirical research and the dissemination thereof (empirical evidence: none; pragmatic evidence: none).

4) We can provide "better" training (empirical evidence: none; pragmatic evidence: maybe, but if they can't define terms like "better" they're not exactly abiding by the scientific principles they claim to be better at).

5) Creating a new accreditation system is the way to go (empirical evidence: none; pragmatic evidence: none, with no explanation for how further fragmentation of psychology will increase the field's ability to become leaders).

Creating better scientists than whom? My understanding is that they're hoping to encourage all university-based PhD programs to adhere to PCSAS guidelines, which basically leaves out for-profit institutions. I think there's a lot of data that would substantiate the claim that university-based PhD programs that prioritize strong research training in statistics and research methods do produce better scientists than for-profit institutions.

If the PCSAS programs hope to convince the public that their graduates are better therapists

The two major outcomes of training programs that we always talk about here on this board are internship match and licensure. That seems like an indicator of quality of training, no? According to their website: Overall, clinical psychology programs report less than 80% match rates, whereas PCSAS programs match at over 90%. Overall, about 81% of graduates pass the licensure exam, whereas PCSAS graduates pass at about 98%.

When I applied for internship, I got the advice to steer clear of programs that "preferred" clinical to counseling psychology students. Friends of mine heard about their applications being tossed out of desirable VAs because they were not the "preferred" type of psychology student. Drawing that distinction works semantically, but fails practically. If an abundance of "preferred" candidates exists for "x" spot, it will not matter if another candidate is just as competitive on other markers (pubs, grants, etc.). The representativeness heuristic plus time pressure would suggest that it's cognitively less taxing to simply go with what you know.

You have a bit of an is/ought problem here also. Applying that logic to gender and career choices, you might as well say: "Well, women are usually teachers and men are usually engineers, so it makes sense to create systems to codify what's already natural." Jordan Peterson might like that. But I think the rest of us would have problems. In this case, it might be true that counseling psychologists tend to go one way, PsyDs another way, and PhDs in clinical psychology yet another way, but it's far shakier ground logically to suggest that it should or ought to be that way. The danger is that saying something should be one way or another further silos psychology into groups of haves and have-nots, which, in my opinion from what I've seen thus far, is one of our field's bigger problems.

I completely agree that accreditation reform needs to happen. My hope is that PCSAS is the Ross Perot of accreditation systems, in that it serves as a wake-up call to the APA that they will no longer be respected as a scientific organization if they continue to downplay the importance of scientific training in doctoral programs.

Are you suggesting that a primary reason that PsyDs from for-profit institutions don't receive as much NIH funding as graduates from university-based PhD programs is because grant reviewers are making an is/ought fallacy?
 
Creating better scientists than whom? My understanding is that they're hoping to encourage all university-based PhD programs to adhere to PCSAS guidelines, which basically leaves out for-profit institutions. I think there's a lot of data that would substantiate the claim that university-based PhD programs that prioritize strong research training in statistics and research methods do produce better scientists than for-profit institutions.



The two major outcomes of training programs that we always talk about here on this board are internship match and licensure. That seems like an indicator of quality of training, no? According to their website: Overall, clinical psychology programs report less than 80% match rates, whereas PCSAS programs match at over 90%. Overall, about 81% of graduates pass the licensure exam, whereas PCSAS graduates pass at about 98%.



Are you suggesting that a primary reason that PsyDs from for-profit institutions don't receive as much NIH funding as graduates from university-based PhD programs is because grant reviewers are making an is/ought fallacy?


IMO, your groupings are too simplified and leave out a giant swath of training programs. PCSAS encourages the clinical science model of training. Most professional PsyD programs follow the practitioner-scholar model of training per their mission statements. That leaves out all the scientist-practitioner PhD/PsyD clinical programs and counseling psychology programs. The moderates, if you will, that make up the bulk of training programs.
 
Are you suggesting that a primary reason that PsyDs from for-profit institutions don't receive as much NIH funding as graduates from university-based PhD programs is because grant reviewers are making an is/ought fallacy?

That's a straw man. Reread my post and get back to me.
 