Physics & Radbio

Dear Radiation Oncology Department Chairs and Program Directors:

Following publication of the scores for the July 2018 radiation oncology qualifying (computer-based) exams, we have had a number of inquiries from candidates and program directors who suggest that there are some significant misconceptions regarding exam development, pass/fail score determination, and candidate performance. Although ABR trustees and staff will be available to discuss these issues at the upcoming ADROP meeting in San Antonio on October 20, we believe that the concerns expressed merit some clarification prior to that date.

Aside from a number of ongoing logistical improvements, the qualifying exam development process has remained essentially unchanged for many years. Individual exam items (questions) for the clinical exams are created by members of eight clinical category committees. All of these volunteers are clinically active and include individuals in academic and private practice, department chairs, and program directors, all of whom work directly with residents. The physics exam is developed by radiation physicists who are all active in the clinic. The radiation and cancer biology exam committee consists of scientists who are all members of clinical departments and physician/scientists who maintain active clinical practices in addition to their research activities. Before any item is added to the inventory for subsequent use, it is reviewed for clarity by staff editors. Items are also “tested” without scoring on exams before they are later used as actual scored items. This testing process is used to assure reliability of the items.

Each winter, all items anticipated for inclusion in the exams to be administered later that year are given a final review by a panel of clinicians, physicists, and biologists for accuracy, clarity of language, and relevance. Final determination is made entirely by active clinicians. Items not felt to meet those criteria are either discarded or set aside for revision and subsequent reconsideration. Once an item is accepted for use, it is scored using a criterion-based reference system in which the score represents a judgment as to the number of test takers who would get the item correct (not should get it correct). An example of this critical difference might be an item listing choices of spinal cord tolerance with conventional fractionation, where we all might believe that 100% of test takers should get the item correct, but also presume that some will give an incorrect response. Thus, the criterion-based score represents the lowest common denominator of test takers. Individual item scores are then totaled, and an average is calculated for the number of items in the exam. This average represents the line above which candidates will pass the exam, and below which they will fail.

We believe that in addition to 2018 pass rates for first-time exam takers, the 2018 passing standards, along with the passing standard ranges for the past five years, would provide useful context. The passing standard average percent correct represents the line at or above which candidates will pass the exam, and below which they will fail:

Exam | 2018 Pass Rate | 2018 Passing Standard | 5-Yr Passing Standard Range
Clinical radiation oncology | 97% | 61% | 61%-69%
Medical physics for radiation oncology | 70% | 71% | 63%-71%
Radiation and cancer biology | 74% | 68% | 64%-73%


Furthermore, the 2018 passing standard for the biology exam is essentially the same as the average passing standard over the past five years. We are aware that the physics and biology pass rates are below those attained on the clinical radiation oncology exam and suggest that the lower scores in physics and biology (in contrast to clinical) may be multi-factorial. However, the ABR exam delivery process has been unchanged and all scores are verified multiple times prior to publication.

One factor in the lower physics and radiobiology scores may be a lack of clarity regarding up-to-date reference study sources, and heterogeneity of teaching standards across programs. We are working with our committee chairs to develop a more useful study and reference guide for trainees and are encouraging stakeholders in the academic radiation oncology community to create greater consensus regarding curriculum content and teaching methodologies.

The ABR stands ready to work with stakeholders to assure that trainees are receiving the optimum preparation for their future careers and that we are continuing to appropriately assess their understanding of that material. We appreciate your support in this regard.



Sincerely,

Lisa A. Kachnic, MD, ABR President
Valerie P. Jackson, MD, ABR Executive Director


 
For current PGY-5s who failed biology or physics ... Your PD just received an email from the ABR. Please ask him or her to share its contents. The 70% pass rate for physics and 74% pass rate for biology are now confirmed by the ABR

I've been out of the loop for a while, but if I remember correctly the exams were back to back on one day. If you fail either section, do you fail the entire exam and have to redo both sections?

It's safe to say that not every one of the 26% who failed biology also failed physics . . . so the overall pass rate was at best 70% (if everybody who passed physics also passed biology), but most likely far lower, and possibly under 50% (if most who failed only failed one section)?
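To make that bound explicit, here is a quick back-of-the-envelope sketch; the only inputs are the two section pass rates from the ABR letter, and everything about how the failures overlap is unknown:

```python
# Rough bounds on the overall pass rate (passed BOTH sections), given only
# the per-section pass rates reported by the ABR.
physics_pass = 0.70
biology_pass = 0.74

# Best case: everyone who passed physics also passed biology, so passing
# both is capped by the lower section pass rate.
best_case = min(physics_pass, biology_pass)                           # 0.70

# Worst case: the two failure groups don't overlap at all, so the section
# failure rates simply add up.
worst_case = max(0.0, 1 - ((1 - physics_pass) + (1 - biology_pass)))  # 0.44

print(f"Overall pass rate somewhere between {worst_case:.0%} and {best_case:.0%}")
```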
 
I guess they really don’t care that the pass rate on each exam was 4 standard deviations below the norm over the past 15 years and that half of our class will have failed one of these two exams. Also looks like they won’t admit fault or do anything about it, ¯\_(ツ)_/¯ Ramsesthenice was 100% correct

Looks like it’s just back to the books


 
For current PGY-5s who failed biology or physics ... Your PD just received an email from the ABR. Please ask him or her to share its contents. The 70% pass rate for physics and 74% pass rate for biology are now confirmed by the ABR

Wow. This is a complete disaster.
 
It’s absolutely insane that the ABR is trying to blame programs and residents. We all know that is a complete lie. Now that we have the pass rates and hard data it’s time to demand a retake. The ABR needs to own this one.
 
It’s absolutely insane that the ABR is trying to blame programs and residents. We all know that is a complete lie. Now that we have the pass rates and hard data it’s time to demand a retake. The ABR needs to own this one.

Or ask for your individual raw score, all we have are quartiles.
 
Why would they not compare the pass rates (not the passing standards) from 2018 to the previous 5 years? Who cares what the passing standard is when the questions are different year-to-year? What prevented this from happening last year? Did all programs suddenly become 20% worse at radiation biology and physics?
 
Or ask for your individual raw score, all we have are quartiles.

It's particularly odd that the pass cutoff rate this year (% correct) was on the high side (relative to past years) but the pass rate was lower than ever. That means someone thinks the exam was slightly easier. So instead of re-evaluating the Angoff scores they just fail more people?!
 
At the risk of being accused of interweaving different issues into this thread, I will make a prediction. For medical students considering radiation oncology, this result is much more tangible than "a bad job market". They notice that the board pass rate is lower (much lower) than in other disciplines. This, combined with "the bad job market", will lead to noticeable reductions in applicants from US medical schools. Probably not this year (many are already too far into the mess) but in the next 2-3 years. Medical students, you have been warned.
 
Okay, well now we have the data. These numbers are absolutely ridiculous. It doesn't matter that test questions were crafted by experts and clinicians over years and committees. That high of a percentage of intelligent test takers in our field should not have failed the exam, period. If they are admitting that there was insufficient updated study material for us to prepare with, then that was the problem. That was an unfair disadvantage and we should not be penalized for it. That email does not in any way end the conversation. I am very grateful that they are working to improve things for future years, but that is not what we are interested in. The fact that the test scores were as they were demonstrates a fundamental problem this year. We need to push them on the question of what they are going to do to fix the problem this year.
 
It is 'multifactorial', but assuredly one of those factors could not be an issue with the exam they administered? Which is more likely:

1. Over the course of ONE year, every residency program in the country got significantly worse at teaching rad bio and physics (as they have implied as a factor)
2. Over the course of ONE year, all residents in the field got dumber (as they have implied as a factor), even though no such trend of current PGY-5s massively underperforming compared to prior classes was seen on any in-service exams
3. There was an issue with the test that the ABR administered

How can they continue to deny that they may have played a role when it is very likely they played the biggest role?
 
At the risk of being accused of interweaving different issues into this thread, I will make a prediction. For medical students considering radiation oncology, this result is much more tangible than "a bad job market". They notice that the board pass rate is lower (much lower) than in other disciplines. This, combined with "the bad job market", will lead to noticeable reductions in applicants from US medical schools. Probably not this year (many are already too far into the mess) but in the next 2-3 years. Medical students, you have been warned.

I think it has already affected this year. The "What are my chances?" thread is dead quiet. The number of medical students doing rotations in our department is down.
 
Again, can our PDs and Chairs please come together and help this year's cohort of residents? We are not any less intelligent. We have performed well on all standardized tests in prior years. Our in-service exams went fine. Our departments continue to teach the same material. We all studied the recommended resources. We know how to take tests. The ABR is at fault. We need the help of PDs and Chairs.
 
I just sent a long reply to Dr. Wallner and Dr. Kachnic. I began by thanking him for publishing the results, as of course is what should be expected. I then went on to say that we greatly appreciate all of the massive effort that clearly goes into developing the test questions over a long period of time involving many individuals. Clearly, however, the problem is not that the questions were poor questions or that they were poorly worded, which is largely what he tries to explain away in his letter, but rather that this was a test that did not test what we have been trained on for the last 3 years. The test changed; that is clear, and the ABR does not deny it. That change occurred without any chairman, program director, physics instructor, biology instructor, or resident being told. We studied for one test and were then given a different test. That is fundamentally unfair. We greatly appreciate all of his efforts to try and make things better for the coming year, but our year was wronged and hurt, and that needs to be fixed. Again, they worked very hard to produce a thoroughly vetted test, but that test was different from what we were told it was going to be and from what we were trained to know. That is a problem, and that needs to be fixed. I would recommend people send Dr. Wallner and Dr. Kachnic emails of somewhat similar content. I would again really encourage your program directors and chairs to do the same. It is simply unfathomable that that high of a percentage of intelligent people should fail a qualifying exam.
 
I just sent a long reply to Dr. Wallner and Dr. Kachnic. I began by thanking him for publishing the results, as of course is what should be expected. I then went on to say that we greatly appreciate all of the massive effort that clearly goes into developing the test questions over a long period of time involving many individuals. Clearly, however, the problem is not that the questions were poor questions or that they were poorly worded, which is largely what he tries to explain away in his letter, but rather that this was a test that did not test what we have been trained on for the last 3 years. The test changed; that is clear, and the ABR does not deny it. That change occurred without any chairman, program director, physics instructor, biology instructor, or resident being told. We studied for one test and were then given a different test. That is fundamentally unfair. We greatly appreciate all of his efforts to try and make things better for the coming year, but our year was wronged and hurt, and that needs to be fixed. Again, they worked very hard to produce a thoroughly vetted test, but that test was different from what we were told it was going to be and from what we were trained to know. That is a problem, and that needs to be fixed. I would recommend people send Dr. Wallner and Dr. Kachnic emails of somewhat similar content. I would again really encourage your program directors and chairs to do the same. It is simply unfathomable that that high of a percentage of intelligent people should fail a qualifying exam.

Did you get her email from the ASTRO directory?
 
Did you get her email from the ASTRO directory?
Dr. Wallner has been responding to all of my emails to the ABR information feedback portal. I just keep replying to his messages and update the heading of the email to include his name. So far this has worked and I have been getting responses from him after approximately two to three days
 
It is 'multifactorial', but assuredly one of those factors could not be an issue with the exam they administered? Which is more likely:

1. Over the course of ONE year, every residency program in the country got significantly worse at teaching rad bio and physics (as they have implied as a factor)
2. Over the course of ONE year, all residents in the field got dumber (as they have implied as a factor), even though no such trend of current PGY-5s massively underperforming compared to prior classes was seen on any in-service exams
3. There was an issue with the test that the ABR administered

How can they continue to deny that they may have played a role when it is very likely they played the biggest role?

Exactly. In a respectful and moderate way, please communicate that to the ABR, and have your program director and chairman communicate it to them as well. They need to hear our voices en masse.
 
This is all good. Let's develop the ideas, and then we must get those ideas out there to the ears that need to hear them
 
I strongly urge people to let this sink in before sending reactionary and possibly poorly constructed responses. Again, it's the PDs and Chairs that we need to come help us out.
 
Agreed, you have to always be polite and respectful. Firm, but respectful. And of course also agree on getting PDs and Chairs in action.
 
Dr. Wallner has been responding to all of my emails to the ABR information feedback portal. I just keep replying to his messages and update the heading of the email to include his name. So far this has worked and I have been getting responses from him after approximately two to three days

Thanks, that is what I’ve been doing as well so I will continue the dialogue. FWIW I did take physics and clinical back to back and yeah I can’t say that was fun. Also I passed clinical so clearly I did not get magically less dumb in the 24 hours between exams. My first go at physics was totally on me, I screwed that one up, but this is very different. I contacted my former PD who said they’d be at ADROP this year.
 
I agree -- persistence and the buy-in of PDs and Chairs is what we need. My last post was strongly worded but only meant to emphasize the point within this forum. All communication going to the ABR (Wallner, Kachnic, etc.) should be respectful and certainly will be from my end.
 
At the risk of being accused of interweaving different issues into this thread, I will make a prediction. For medical students considering radiation oncology, this result is much more tangible than "a bad job market". They notice that the board pass rate is lower (much lower) than in other disciplines. This, combined with "the bad job market", will lead to noticeable reductions in applicants from US medical schools. Probably not this year (many are already too far into the mess) but in the next 2-3 years. Medical students, you have been warned.

This is certainly dire for our field if residency positions start filling with FMGs and lower tier medical students. Saddest part is that all these wounds (residency expansion, boards) are self-inflicted.
 
This is certainly dire for our field if residency positions start filling with FMGs and lower tier medical students. Saddest part is that all these wounds (residency expansion, boards) are self-inflicted.
Things often come full circle. That's pretty much what rad onc was in the 70s and 80s, even into the mid to late 90s with the bad job market
 
Does anyone think that it might be helpful to write one unified letter with a single shared message, signed by something like X number of concerned PGY-5 residents, and sent to all of the members of the leadership of the ABR?
 
Your current ABR board is now made up of these people.

Also, "lack of clarity regarding up-to-date reference study sources, and heterogeneity of teaching standards across programs [...]" is grownup talk for "the folks in charge of physics and radbio gave the exam to their residents. We'll share with you next year if you keep quiet and give us more money. "

You're trying to negotiate with the vice president of 21st Century Oncology, a company that went bankrupt for Medicare fraud and a breach of patient records. He's now charged with helping the ABR protect everyone. You're asking a caporegime for a favor.


Yup, one of the most corrupt organizations in oncology placed their guy in charge of our Boards. Think about that, folks!!!
 
Does anyone think that it might be helpful to write one unified letter with a single shared message, signed by something like X number of concerned PGY-5 residents, and sent to all of the members of the leadership of the ABR?

I do, but only if it included a majority of PGY5s, let’s say 70-74% of them. Can’t just be those of us who failed. ARRO could help organize that. More importantly, it should also be signed by the majority of PDs (and maybe chairs?). I’ve heard of a few PDs w/o failures who are just as concerned. The solidarity would be nice.


 
From the current ACGME radiation oncology program requirements - V.C.2.c).(1) Sixty percent of the program’s graduates from the preceding five years taking the American Board of Radiology certifying examination for the first time must pass.

Effective 2019 - V.C.2.c).(1) Sixty percent of a program’s graduates from the preceding five years taking the American Board of Radiology qualifying (written) examination for the first time must pass.
 
We need to push them on the question of what they are going to do to fix the problem this year.

We are. We have drafted a response letter and I know for a fact that other programs have as well. But again, all evidence points to the fact that they are not going to do anything for this year's cohort. Look at the data. The cut point for physics has been as low as 63% correct in the past 5 years. Using their formula they decided that the pass line should be 71% which is the highest pass line used in the last 5 years. They saw the pass rate and were unmoved to reconsider. They could have moved the pass line to the 5-year median and chances are that would have taken care of half the failures. But they did nothing.

This is my personal bias, but I strongly suspect they are trying to flex their muscles to make a point. I know a few people who currently write questions (and this is not on them, this disaster is above their pay grade) and there have been grumblings for years that the pass rate is too high and residents should know more about physics and biology. If I am right, all of this uproar may actually be what they had in mind all along. I honestly don't think that they intend to keep the pass rate this low going forward.

I feel for you. I agree that what happened to you guys this year is not fair. I don't want to tell you that life is not fair, but the reality is there is no one to hold them accountable for their decision. You are extremely unlikely to get the justice you think you are entitled to.
 
If we want this year's test to be re-evaluated by the ABR, what are the varying ways we can do that? I would propose one solution below. The ABR has been adamant that because they haven't changed the process of developing these tests, the tests could not be at fault. I would argue that the only universal thing between all residents you can point to for sure is that we took the same test -- which to me would indicate that it could at least possibly play a role in the 3-4+ standard deviations from the average. That is more likely than 40+ different departments individually getting worse at teaching rad bio/physics all of a sudden in one year. We deserve to have this further evaluated, so one suggestion (which should be feasible and would allow the ABR to prove that this year's test was not the issue) is below:

If the breakdown is ~40% old/recycled questions and ~60% new questions, separate the questions from this year's test into exactly those two groups. Look at the 40% that were used in previous years. Take the 5-year average from when each question was used, and the percentage of residents that got it right. Then look at the % of residents that got it right this year. Are we 3-4 standard deviations below the average on these questions? If so, then this proves the ABR's point that it wasn't the testing material. However, if we are within the range of prior years (meaning that the issue is with the 60% that is new questions), this points to an issue with this year's test.

The cutoff scores merely represent the percentage of questions that the ABR thinks a competent resident should get right. What if they were off in their assessment this year of how many individuals should/would get those questions right? 60% of the test is new questions. The best thing we can do is come to the ABR with reasonable solutions that would help fix the problem for this year's residents. If the test was at fault, we should not be punished for it. It is perfectly reasonable for us to ask that they provide us with data to support their claim. All we are asking for is a fair and honest review that no one can argue with, and this would provide that for us.
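For what it's worth, the check proposed above is simple enough to sketch. The item-level numbers below are entirely hypothetical and would have to come from the ABR's own item statistics:

```python
from statistics import mean, stdev

# Hypothetical per-item data for the recycled (~40%) questions: % correct when
# each item was last used (prior-year average) vs. % correct for the 2018 cohort.
prior_pct_correct = [82, 76, 68, 90, 74, 61, 88, 79]
pct_correct_2018  = [80, 74, 70, 87, 73, 59, 86, 78]

diffs = [now - before for now, before in zip(pct_correct_2018, prior_pct_correct)]
print(f"Mean change on recycled items: {mean(diffs):+.1f} points "
      f"(item-to-item SD {stdev(diffs):.1f})")

# If the 2018 cohort performs about the same on questions that have been asked
# before, the drop in pass rate is hard to pin on the residents or their
# programs, and attention shifts to the ~60% of new material.
```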
 
From the current ACGME radiation oncology program requirements - V.C.2.c).(1) Sixty percent of the program’s graduates from the preceding five years taking the American Board of Radiology certifying examination for the first time must pass.

Effective 2019 - V.C.2.c).(1) Sixty percent of a program’s graduates from the preceding five years taking the American Board of Radiology qualifying (written) examination for the first time must pass.

Can you provide evidence that confirms they are making this change? With everything else that has gone on, basing program accreditation on the written first-time pass rate instead of (or in addition to) the oral pass rate would basically confirm the rumor that the changes in the exam this year were done to put the squeeze on smaller/community programs. That we were used as pawns to advance their agenda. They want to turn this into a majority MD/PhD field and increase the scope and reach of the large academic centers. If our careers and clinical training are compromised, so what? Do the ends justify the means?

I wish I could say that I was shocked that the ABR sent such a ridiculous letter out to the PDs. I am not. It's insulting. PDs are not your typical cable news-viewing Americans. Anybody with even borderline intelligence can see how lame their efforts are to try and explain away the pass rate this year as normal. They only reported the rates for first-time test takers. Why? Include the people repeating the test and the numbers are even lower (this was confirmed to me). They did not include the actual pass rate of the test (people who failed any section of the test). Why? It's a lot lower, probably in the 50% range, as most people didn't fail both sections. The other numbers are nonsense distractors. Why not include the 5-year pass rate and standard deviation? Why not provide a histogram of scores? Their dishonesty is obvious to the point of insult. It's out of our hands as residents now. Hopefully this letter has enraged PDs enough to make waves.

1 in 2 residents failed to pass an exam meant to test minimally competent knowledge of physics and biology for clinical practice. Residents who applied when rad onc was one of the most competitive specialties and were in the top 10% of their med school classes.

Tomorrow is September 15th. Med students about to submit an application to rad onc -- apply to something else instead. All other med students, run away from this field. Medgator may actually get his wish as the spots 5-10 years down the road go to IMGs, first time failure rates at programs skyrocket >50% due to actual poor resident quality, and programs get shut down. The remaining programs then dedicate an enormous amount of time and effort teaching biology and physics to the PhD level at the expense of clinical training lest they get shut down as well.
 
Look at the data. The cut point for physics has been as low as 63% correct in the past 5 years. Using their formula they decided that the pass line should be 71% which is the highest pass line used in the last 5 years.

This was exactly as I hypothesized. Due to the ridiculous new content in the bio exam, they simply raised the bar on the physics exam to make the failure rates consistent between both exams so they could point the finger at the residents and programs, which is exactly what they are doing.
 
Has anyone asked the ABR to release the actual test and the justifications for right vs. wrong answers? I agree with ramsesthenice and many other senior members that the possibility of the ABR agreeing to outright re-scoring or a re-take is close to none. But I believe everyone who was negatively affected by this year's exam (residents, PDs, chairs, instructors) deserves to see the actual content of the exam and determine its legitimacy. Many on this thread complained about how poorly the test was worded, and that matches how I felt with my own exam fairly recently. And if a significant portion of this year's test questions were not bulletproof legit, it would only support your case for asking for significant action on behalf of this year's test takers, not just "we will provide better study material next year." Plus, it will certainly give individual PDs an opportunity to cross-check the test with their lecture materials and show how "off" the test was compared to what was taught.

Hope all of you get something out of this painful process.
 
No, the change is that there are now 2 parts to section V.C.2.c)

Compare the new document:
https://acgme.org/Portals/0/PFAsset...ogyCore2019-TCC.pdf?ver=2018-06-18-083751-487

V.C.2.c).(1) Sixty percent of a program’s graduates from the preceding five years taking the American Board of Radiology qualifying (written) examination for the first time must pass. (Outcome)

V.C.2.c).(2) Sixty percent of a program’s graduates from the preceding five years taking the American Board of Radiology certifying (oral) examination for the first time must pass. (Outcome)


To the old:
https://www.acgme.org/Portals/0/PFAssets/ProgramRequirements/430_radiation_oncology_2017-07-01.pdf

V.C.2.c).(1) Sixty percent of the program’s graduates from the preceding five years taking the American Board of Radiology certifying examination for the first time must pass. (Outcome)

The change is not that the oral boards will no longer count. The change is that now the first-time pass rate on both the oral and written boards will count, when it used to be that only the oral boards counted.

How many program directors are aware of this change?

This is outrageous.
 
The change is not that the oral boards will no longer count. The change is that now the first-time pass rate on both the oral and written boards will count, when it used to be that only the oral boards counted.

The written and oral boards always both counted. The new wording seems to just be a clarification. I missed the second section that specified the oral boards when I read it just now.
 
Well, maybe this is just a very messed up way to shut down residency programs to help limit the number of residents. I admit I’m not great on details.
 
The written and oral boards always both counted. The new wording seems to just be a clarification. I missed the second section that specified the oral boards when I read it just now.

Perhaps they both counted at some point in the distant past, but unless I'm missing something obvious, only the oral exam counts now. The current manual in effect does not mention the qualifying (written) exam anywhere and only mentions the certifying (oral) exam. The next revision going into effect next year has a clear addition of the written exam. Someone feel free to prove me wrong, but now in summary we have all these recent events:

1. A new ABR policy that does not officially publish pass rates for each individual year.
2. A radiation biology task force with findings concluding that instruction and exam content should change.
3. Editorials in the PRO suggesting a perception of decrease in trainee quality and poor instruction at smaller programs based on literally zero evidence.
4. No published ASTRO study guide for the first time in over a decade.
5. Anecdotal reports of recall use at (some) larger programs
6. Anecdotal evidence of dramatically surprising content and difficulty on the biology exam that did not align with the most common study materials.
7. Confirmed evidence of failure rates multiple standard deviations away from historical averages.
8. Emails from the ABR claiming that the exam was fair and did not change and that failures were due to poor instruction at smaller programs.
9. And now an ACGME policy shift authored literally days before this year's written exam to base program accreditation on the written exam.

You can draw your own conclusions.

The next event (10), I suspect, will be a publication correlating failure rates on this year's exam with program size and translational research output.
 
The current manual in effect does not mention the qualifying (written) exam anywhere and only mentions the certifying (oral) exam. The next revision going into effect next year has a clear addition of the written exam.

OK. You're right. The written exams seem to not have counted against programs in the past. Only the oral exam. The written exams will count against programs going forward. It does seem to be a pathway to reducing residency spots, which this forum has advocated for.
 
We’ll never know but the breakdown of pass rates between institutions who had people involved with the exams vs those who did not would be verrry interesting


 
Regrettably, the ACGME program requirements for postgraduate training in radiation oncology are quite vague as to specific curriculum content; thus, the material assessed by the ABR represents a modest baseline for assessment of clinical competency. Where possible, elements of all core competencies are included in examination content.
 
OK. You're right. The written exams seem to not have counted against programs in the past. Only the oral exam. The written exams will count against programs going forward. It does seem to be a pathway to reducing residency spots, which this forum has advocated for.
This is not quite true. I served 6 years on the RRC for Radiation Oncology. Annually the ABR provided 10-year reports for programs under consideration for continued accreditation. The methodology used required that, for an applicant to appear in both the numerator and denominator of the pass rate, they had to pass the three written exams AND the oral exam ON TIME. This meant that a candidate who did not pass any one of the tests ON TIME would appear in the denominator BUT NOT the numerator, even if they subsequently passed the written test. If an applicant elected to defer the test for whatever reason (pregnancy, death in family, etc.) then it counted against the program pass rate. In effect, written tests were included in the 60% metric. At that time (about 10 years ago) the number of programs that did not meet this criterion was fewer than five of the 70 or so programs that were open at the time (as the first-time pass rate was consistently about 90%). A small number of programs were given an adverse finding on accreditation based on this criterion alone. If overnight the first-time pass rate falls to 70%, then this criterion will be invoked much more frequently.
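As a concrete illustration of that methodology (a sketch based on my reading of the description above; the names and numbers are made up):

```python
# Sketch of the RRC-style "on-time" program pass-rate metric described above.
# A graduate counts in the numerator only if they passed all three written
# exams AND the oral exam on time; late passes and deferrals still count in
# the denominator, so they drag the program's rate down.
graduates = [
    {"name": "A", "passed_all_on_time": True},
    {"name": "B", "passed_all_on_time": False},  # failed one written exam, passed it later
    {"name": "C", "passed_all_on_time": False},  # deferred the oral exam
    {"name": "D", "passed_all_on_time": True},
]

numerator = sum(g["passed_all_on_time"] for g in graduates)
denominator = len(graduates)
print(f"Program pass rate: {numerator}/{denominator} = {numerator / denominator:.0%}")
# 2/4 = 50%, which would fall below the 60% accreditation threshold.
```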
 
Medgator may actually get his wish as the spots 5-10 years down the road go to IMGs, first time failure rates at programs skyrocket >50% due to actual poor resident quality, and programs get shut down. The remaining programs then dedicate an enormous amount of time and effort teaching biology and physics to the PhD level at the expense of clinical training lest they get shut down as well.
Not my wish, but the job market has to be addressed somehow as the current leadership in the field is content sticking their collective heads in the sand.

I merely pointed out that many of the 50+ year olds in this field got in during a time where this field used to routinely fill with IMGs and DOs. And ironically, many of those individuals are in power now, whether it be in academics, specialty leadership or large private practices. It is what it is.

Again, please stop derailing this thread into the job market, as has been requested by the mods and others already
 
Honestly the job market conversations should be kept to the many other threads addressing this topic, as mentioned before...

The pass rates for this year are quite telling though. Unfortunately I don't think the pass rates are so outrageous as to demand any real recourse other than a possible early retake (doubtful). A 26% fail rate is really, really tough, but it means 74% of people did better than you if you failed. It may unfortunately have only been by a few questions, and that really sucks, but bottom line, the exam was just very, very difficult this year. It's interesting that people aren't complaining more about physics, which was even lower, but I guess people are more used to that being a tougher exam with a more variable pass rate.

I do take issue with the display of data from the passing standard and standard range. That data is almost entirely uninformative and potentially misleading, because it is totally subjective, based on the expected percentage that 'would' get a question right. You can have a bunch of ridiculously hard questions that are rated as 80% 'would' get it right, and a bunch of more reasonable, easy questions that are rated as 80% 'would' get it right. These different sets of questions would lead to the exact same passing standard, but totally different pass rates.
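To make that point concrete, here is a toy simulation (all numbers invented purely to illustrate the mechanism, not taken from any real exam): two hypothetical exams share the exact same Angoff-style passing standard, but because the judges' "would get it right" estimates miss the true item difficulty in opposite directions, one exam passes almost everyone and the other fails almost everyone.

```python
import random
random.seed(0)

N_ITEMS, N_EXAMINEES = 100, 200

# Judges estimate that 80% of examinees "would" get each item right; per the
# ABR letter, the passing standard is the average of these item-level estimates.
judge_estimates = [0.80] * N_ITEMS
cut_score = round(sum(judge_estimates))   # 80 items correct out of 100

def simulated_pass_rate(true_item_difficulty):
    """Fraction of simulated examinees reaching the cut score, given the
    actual probability of answering each item correctly."""
    passed = 0
    for _ in range(N_EXAMINEES):
        n_correct = sum(random.random() < p for p in true_item_difficulty)
        passed += n_correct >= cut_score
    return passed / N_EXAMINEES

easier_exam = [0.88] * N_ITEMS   # items turn out a bit easier than judged
harder_exam = [0.72] * N_ITEMS   # items turn out a bit harder than judged

print(simulated_pass_rate(easier_exam))  # ~0.99: nearly everyone passes
print(simulated_pass_rate(harder_exam))  # ~0.04: nearly everyone fails
```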

Nevertheless, the board put out a really difficult set of exams this year; I feel for all those who didn't pass. I know of stories of oral examiners being extremely tough and failing a bunch of attendings. Imagine that shock. At least if this is your first time failing the physics/radbio writtens, you will get another chance to pass next year and get back on track.
 
Honestly the job market conversations should be kept to the many other threads addressing this topic, as mentioned before...

The pass rates for this year are quite telling though. Unfortunately I don't think the pass rates are so outrageous as to demand any real recourse other than a possible early retake (doubtful). A 26% fail rate is really, really tough, but it means 74% of people did better than you if you failed. It may unfortunately have only been by a few questions, and that really sucks, but bottom line, the exam was just very, very difficult this year. It's interesting that people aren't complaining more about physics, which was even lower, but I guess people are more used to that being a tougher exam with a more variable pass rate.

I do take issue with the display of data from the passing standard and standard range. That data is almost entirely uninformative and potentially misleading, because it is totally subjective, based on the expected percentage that 'would' get a question right. You can have a bunch of ridiculously hard questions that are rated as 80% 'would' get it right, and a bunch of more reasonable, easy questions that are rated as 80% 'would' get it right. These different sets of questions would lead to the exact same passing standard, but totally different pass rates.

Nevertheless, the board put out a really difficult set of exams this year; I feel for all those who didn't pass. I know of stories of oral examiners being extremely tough and failing a bunch of attendings. Imagine that shock. At least if this is your first time failing the physics/radbio writtens, you will get another chance to pass next year and get back on track.

Surprised at the lack of discourse over physics as well. For me the physics exam was tough but fair. The questions themselves were difficult and required some precise calculating, but except for a handful of questions the content is at least covered in Khan/Caggiano/old RAPHEX exams. My guess is that most examinees felt this was the case and concluded there wasn’t as much to complain about. Interested to know if others feel differently. Also we can’t discount that the passing standard was 4 points higher than average, which I’m sure could make the difference in a lot of scores.

Conversely, ~40% of the radbio questions contained a protein, molecule, or pathway I had never even heard of, nor could I find them in the Hall/ASTRO guides after the test.

Agree that we can’t expect much of anything including early re-takes from the ABR, but disagree that the number isn’t outrageous. 70 and 74 means that like 40-50% of the class failed one or the other.

Also agree that publishing the passing standards and ranges is a silly way to justify everything. The passing standard is irrelevant when the questions change year to year, particularly if those questions come from resources most of us aren’t aware of, which they have basically admitted to.


 
This is wild. The purpose of board examination is to demonstrate a satisfactory level of clinical competence for patient care, not to weed people out.

If you think this is bad, just wait until you go face-to-face with an oral board examiner who got mad at you because you didn’t quote his retrospective review that is currently in press.
 
Dear Radiation Oncology Department Chairs and Program Directors...

...One factor in the lower physics and radiobiology scores may be a lack of clarity regarding up-to-date reference study sources, and heterogeneity of teaching standards across programs. We are working with our committee chairs to develop a more useful study and reference guide for trainees and are encouraging stakeholders in the academic radiation oncology community to create greater consensus regarding curriculum content and teaching methodologies.

I nearly had a stroke when I was forwarded a copy of this "Dear Chairs and Program Directors" letter. Not only is the ABR insulting the intelligence of chairs and PDs by thinking that publishing their deceptive data would placate everybody, but it is also continuing to state that there was nothing wrong with this year's exam, ergo that residents who took the exam must be stupider than in the past (puh-leeze!).

And now, it's the educators' faults too for their "heterogeneity of teaching standards" and "curriculum content and teaching methodologies", when I and others have been publishing about this very issue for nearly 20 years, and any efforts to better standardize curricula have been either roundly ignored by the ABR, if not actively discouraged? Seriously?

Unbelievable.
 