I've never seen such a quote/statement about a specific individual, much less a pervasive presence of such statements. I think most people who post on here are trying to be helpful. As far as moderation and civility, I think he does a pretty good job. But, I do find it funny that a few of you seem to use my name like it's a curse word when you get all bent out of shape about things I say that aren't directed at anyone in particular.
This is what biases are all about Jon ... didn't they teach you that at your fabled R1 program?
People point out how they have felt your comments are dismissive and insulting and your response is to insult them and dismiss the observation.
Ergo = bias.
Not a perfect sample at all; not even a good one really.
Yes, of course Jon, all observational data is invalid, irrelevant, and inappropriate. I'm sorry, but to what profession do you belong?
I agree that there are different levels of validity for data, but not every behavioral phenomenon has been studied with double-blind replicated studies. A core component of the behavioral sciences does include observational skills.
Yes, the referenced discussion groups are self-selected but do represent a pretty expansive cross section of post-docs from top-tier R1s to professional schools and everything in between. What I don't recall from you, on the other hand, are any references to ANY external data sources other than your own experiences (or your personal cohort)!
Once again Jon, you seem oblivious to your own statements. You rather ::ahem:: dismissively stated, and I quote: "The test is easy. It's psych 101 material."
I don't know if it's a good test.
If a practicing psychologist doesn't know if his profession's licensing exam is a 'good' test, that would seem to be enough of a reason to spend some time examining its utility. (Time for full disclosure, Jon -- are you a licensed psychologist?)
But, confusion from a subset of people taking the test doesn't suggest it's a bad test. It's not really relevant. I'm sure if you peruse bar exam boards, you'll see similar griping. . . or USMLE's or massage therapist exams. . . or beauty school exams. There's always someone failing or someone who's confused.
I'll skip you (again) spewing your actor-observer bias and straw man arguments. Psychology is supposed to be the profession which creates valid assessment instruments. (You might want to review the amicus briefs in Ricci v. DeStefano, the New Haven firefighter discrimination case recently decided by the US Supreme Court.)
And this goes beyond "griping." The vast majority of people who take the EPPP (who go on to either pass or fail) report that they left the exam totally twisted around and oblivious to how they did. Is this a sign of a good exam? Do the USMLE/bar/massage therapist/beauty school exam takers leave with a similar feeling?
(Oh, and a real subtle dig at those expressing concerns by comparing the EPPP to "massage therapy" and "beauty school" exams.)
I think the test itself is obviously passable by most people that go through graduate school for psychology. This doesn't speak to validity.
No, the ASPPB's own information admits that it is not. In fact, in the most current Information to Candidates, they admit to utilizing the very methodology you've criticized here -- surveying practicing psychologists, asking them if they felt the exam covered materials germane to the profession. In an earlier version, they acknowledged that true validation is impossible because failing to pass the exam means one cannot practice psychology, removing a core "treatment" group from comparative analysis.
You might want to consider:
Sharpless, B. A., & Barber, J. P. (2009). The Examination for Professional Practice in Psychology (EPPP) in the era of evidence-based practice. Professional Psychology: Research and Practice, 40(4), 333-340.
ABSTRACT: Professional psychology has increasingly moved toward evidence-based practice. However, instruments used to assess psychologists seeking licensure, such as the Examination for Professional Practice in Psychology (EPPP), have received relatively little empirical scrutiny. Therefore, the authors evaluated the available evidence in support of the EPPP's validity and current use as a core component of professional licensure. Although the EPPP has in many ways been extensively evaluated, there is a paucity of criterion, predictive, and incremental validity evidence available. Further, several aspects of the content validation studies were examined, and the authors question whether the EPPP, as currently constructed, can meet its stated goals. Given that the EPPP is a high-stakes examination and given the authors' best estimate (based on a sample of 16 states) that 35% of applicants fail the examination, it is recommended that the EPPP be more extensively evaluated. An outline of major decision points in this proposed evaluation process is provided, several suggestions for further research are proposed, and the field is encouraged to discuss these issues further.
Finally ... there are many ways of dealing with test anxiety, Jon (as any good psychologist would know) -- relaxation techniques, rehearsal, spaced studying, to name a few.
But -- again -- you miss the point.
People aren't saying they are freaking out during the exam.
What remains the "x factor" is how best to prepare for an exam which seems to have no rhyme or reason.
We go back to how many people come out of the exam with no clue as to their performance.
Of course, there will always be marginal (or anxious) test takers who will leave feeling a bit confused.
But when both those who pass and those who fail have little idea how or why they arrived at that outcome -- something seems amiss.