Physics & Radbio

In their seminal paper addressing this very issue, our radiation oncology ABR representatives Paul Wallner and Lisa Kachnic provided some insight into ABR rationale and mindset. Using and citing such sources as Wikipedia, they were able to prove that "regardless of a belief within the radiation oncology community, trends in the quality of residents accepted for training have been drifting slightly downward."

See: "Commentary on: Thoughts on the American Board of Radiology Examinations and the Resident Experience in Radiation Oncology"

They were challenged on this forum with data. PRO was embarrassed for publishing such mindless garbage, but it did provide some great insight into the disconnect between the old guard and reality. Fast forward four months and now the ABR has some real data that resident quality has declined: kids these days do not know their radiation biology. Now they can justify the existence of this legal extortion. The ABR and Paul Wallner get to decide career fate. His $266,053 part-time salary and benefits package from the ABR (2016 ABR Form 990) is safe. They have decided that mindless biology trivia is how residents should spend their time.

The elusive failure rate this year is arbitrary. What people should be most upset about is the opportunity cost. Even someone who passed still failed, because they spent time learning material that is not relevant to clinical radiation oncology. Perhaps our specialty could create a relevant board exam. Perhaps radiation oncologists (not radiologists) born in the last 40 years could help decide what is necessary for competence today. Other specialties have successfully challenged their out-of-touch boards. Perhaps we should consider doing the same.
 
Lol funny. The fact is most attendings who aren't recent grads couldn't even sniff an interview at most programs if they were applying these days.
 

No need to if they own their own practice and are paying new grads $200k a year to do all of their work.
 
@RadOncDoc21 - would love to know if even one case of that happening exists. just one.

some of you don't realize that there are people who read this forum that actually want to know what's going on in the rad onc world, this isn't your emo livejournal.
 

You don't think exploitation happens in RO? There were stories 5-10 years ago when I was job hunting about all of those notorious practices that would hire new grads for cheap labor and cut them loose 2-3 years later, before they could make partner. FL was notorious, and the smart grads knew which practices to avoid. I've heard the same about TX, GA, and CA.

I bet the current job market has made such practices even more rampant, if anything.
 
I took rad bio last year and it was a colossal waste of time. I'm genuinely surprised that some people on this forum think memorizing an alphabet soup of proteins in various cell signaling pathways is a good use of time. I personally think even a 5% failure rate is unacceptable given the high caliber of rad onc residents, let alone a 20-25% failure rate. When I matched, the average rad onc Step 1 score was almost a standard deviation above the national average. And why do we even have three board exams? One written and one oral exam should be more than sufficient for board certification.
 
Who is actually responsible for the radbio test? There must be a committee no?
 
Third, the Angoff standard setting procedure employed for determining the scoring standards of the ABR exams, as alluded to but not described, is carried out annually for each individual exam instrument. This process is conducted by physicians actively practicing in academic and private practice settings and working with trainees who will actually take the exams. The essence of the Angoff standard setting procedure is to opine the number of individuals who would answer a specific question correctly (not the number who should answer it correctly). Every item (question) is judged individually on this basis, with no pre-ordained intent for outcomes. Therefore, the scoring "target" moves, in any given year, based on this judgment in relation to the difficulty of the unique set of items on a particular exam. The Angoff scores for individual items are then totaled to form an average percentage of questions that would be answered correctly for the entire exam (2). The final pass-fail score cut is thus a criterion-based reference, which makes it possible for all examinees to pass the exam if they meet or exceed the standard for a given set of questions. To support this element of their hypothesis, the authors refer to a manuscript by Becker et al (4) describing the process, policies, and background of ABR exams, and interpret a statement regarding a responsibility to maintain the public trust as an ABR-established policy regarding maintaining a ≈10% failure rate. In fact, that publication makes no such linkage of the issues, and the only mention of a ≈10% failure rate is a response to a hypothetical question posed by an outsider. The ABR has never set an arbitrary or norm-referenced passing standard for written or oral examinations and has no intention of doing so.

Fourth, generally available National Residency Match Program (NRMP) data suggest that over the past decade, regardless of a belief within the radiation oncology community, trends in the quality of residents accepted for training have been drifting slightly downward (4).

This is from the PRO commentary referenced above and previously posted on this forum, authored by Paul Wallner, DO.
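For anyone who hasn't seen the mechanics spelled out, this is roughly what that procedure amounts to -- a toy sketch with made-up panel sizes, items, and estimates, not anything from the ABR:

Code:
# Hypothetical Angoff illustration -- every number below is invented.
# Each judge estimates the probability that a minimally qualified (borderline)
# candidate WOULD answer each item correctly.
judge_estimates = [
    # item1 item2 item3 item4 item5   (toy 5-item exam)
    [0.80, 0.60, 0.90, 0.50, 0.70],   # judge A
    [0.75, 0.55, 0.85, 0.60, 0.65],   # judge B
    [0.85, 0.65, 0.95, 0.45, 0.75],   # judge C
]

n_items = len(judge_estimates[0])
# Average across judges per item, then average across items = cut score.
item_means = [sum(j[i] for j in judge_estimates) / len(judge_estimates)
              for i in range(n_items)]
cut_score = sum(item_means) / n_items
print(f"cut score ~ {cut_score:.1%} of items correct")   # ~70% here

# An examinee passes by meeting or exceeding that fraction correct, which is
# why the "target" can move year to year with the judged difficulty of items.

The point being that the pass mark is entirely a function of what the panel believes a borderline examinee "would" get right.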

I find it absolutely galling, having seen the content of this year's rad bio test and its enormous disconnect from the ASTRO study guides, that senior leaders may use the rad bio results to support a completely unjustified and biased conclusion that recent grads are less academically qualified than in decades past (you know, when radiation oncology was one of the least competitive specialties in medicine, only 3 years in length, Step 1 was basically viewed as a pass/fail test and nobody remembered their score, and it was not uncommon to fill positions with foreign grads and DOs who would never be offered an interview now). The mean USMLE Step 1 for matched applicants was 247 in 2018 -- it was 234 in 2008. Clearly we are drifting down, albeit "slightly." Although, to their credit, they do mention that "the material tested on the USMLE examinations is unrelated to the material tested on the ABR exams," so obviously we don't care about USMLE scores as a field for selecting applicants (which I think we all know - no med student applying to radiation oncology cares about their Step 1 score because it doesn't predict success in radiation oncology!). So what is this other metric they are using to conclude that our quality is drifting down?

I would love to see evidence of the claim that academic and private practice physicians were involved in the Angoff standard setting procedure (which btw was literally referenced back to Wikipedia), because they don't seem to be on the rad bio committee unless I'm missing something. And I also learned something: I thought we were graded based on what we "should know." Apparently not. The cut score is determined by what they think we "would know." Thinking that we "would know" the trivialities on that test is even more absurd than thinking that we "should know" them, especially considering that a large number of people with second quartile scores apparently did not know them, they weren't taught to us, there was no warning we would be required to rote memorize these obscure pathways and three-letter-acronym trivia, and there was no published ASTRO study guide this year. Did they really think we "would know" these things just by nature of where we are in our training? Having such a large number of people fail what they think we "would know," compared to prior years, should be evidence enough in and of itself that the test and evaluation methods were flawed and should be invalidated.

What else do they think we "should know?" Who knows!

It is becoming crystal clear to me that there was an ulterior motive for clandestinely altering the test content so that it no longer aligned with the study guides and texts, and for the way scores were reported, and that the Angoff method is utter crap that can easily be used to increase the fail rate if there is either a conscious or subconscious desire to do so among those involved in the process (e.g., if there were perhaps this notion out there that current rad onc residents are more stupid and lazy).

The retest fee is $640. I'm assuming you have to pay it twice if you failed both. I wouldn't be surprised if they collect >$50,000 from retests alone next year. I would be surprised if Pearson VUE gets anything other than a small fraction of that.
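Rough math on that, with the failure counts being pure guesses on my part (we don't know the real numbers):

Code:
# Back-of-the-envelope retest revenue -- every count below is a guess, not ABR data.
retest_fee = 640        # $ per exam retake (posted above)
double_failures = 30    # guess: residents who failed both physics and bio
single_failures = 25    # guess: residents who failed only one exam

revenue = double_failures * 2 * retest_fee + single_failures * retest_fee
print(f"${revenue:,}")  # $54,400 with these guesses -- easily clears $50,000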
 

So I finally heard back from the ABR. They denied both my request for a hand graded exam (which per the website I thought they had to honor?) and for a mid year retake. The denial of a mid year retake is what I find the most frustrating....I don’t want to drag this process out another year! Do you think our collective efforts with ARRO will make any headway towards a mid year re-exam?
 
No.

 
I feel for all affected by this and am now terrified of rad bio and physics (which I will be taking at end of this academic year). Hopefully ARRO can step up and address this in some fashion for transparency from the ABR.
 
I think figuring out what the pass rate for each exam was is a better goal. I don't think the ABR will release this, as they got heat in 2014 when it was 80% for physics. I do think program directors can figure this out by informally surveying each other.

The question is what to do about a low pass rate. I mean a 25% failure rate for each exam, which it seems to be, is sad for the whole field for all the reasons RadOncG outlined above.


Two things:

1) Some Program Director that is reading this (I'm sure many have been told about this thread by now) should definitely do some type of poll either informally or through the ADROP to determine what the pass rate actually was.

2) It seems like the rad bio fail rate was probably abnormally high, but do we necessarily have evidence that the physics fail rate was higher than the norm?
 
I'm still trying to find out more info about physics, but the failure rate seems high there as well. Some things I know for sure:

1. A score of 3,3,2,2,1 led to a failure. As did my 3,3,2,1,1

2. At one institution 2 of 3 residents failed physics. From the sound of things having anyone fail was odd for them, let alone two people.
 
Asking for a re-grade by hand is not going to be fruitful. Even if they were to grant it, the problem was with the questions and the "would know" Angoff b.s. criterion setting the bar ridiculously high. It's virtually impossible for there to be an isolated data corruption event that recorded the answer wrong.

Agree with javk that our efforts are best focused on continuing to demand more info about the test including pass rates and their rationale for trying to make the test harder and increase fail rates.

Did they really take heat in 2014 for the 80% fail rate? How? What was the outcome?

If it's true that ADROP also wanted to increase difficulty, it's unlikely they will be any help with a survey to go behind the ABR's back and figure out the fail rate. We could do an informal poll here, but the results would be unreliable -- even with >50% responding, people could look at it and say the rate was too high or too low depending on which bias you want to focus on. The only real option would be in ARRO's hands, with a centralized way to contact everyone, verify responses came from those they were sent to, and anonymize them. I wouldn't hold my breath on that one.

While bio was the more ridiculous test, I think it's pretty clear the fail rate on physics was high too.
So what were the numbers, ABR? Why are you hiding them? Why did you raise the bar and change the questions? What's your rationale? What are you all talking about behind closed doors?

I really feel for those unlucky souls who failed for a second time this year. Sounds like the test started to change last year, and then you have to come back and repeat it, and that happens in a year when everybody was hit out of nowhere with that ugly monster in July. Really, really sucks. While it would be nice to get some answers from the ABR, I don't think we can count on it. We need to continue to look out for ourselves collectively as residents, spread the word that the ABR is out to get us, try to somehow fix in-program curricula to match whatever the hell that was we saw in July, and start making a study plan for the next year now -- I certainly have.

It is pathetic that we have to waste our last year in residency trying to memorize literally every molecular pathway and enzyme in the literature to the finest detail at the expense of time we should be spending focusing on consolidating essential clinical knowledge and preparing for independent practice. All because the ABR is either unable or unwilling to properly determine what we as competently trained PGY-4s "would know" if they pulled us out of clinic one day and asked us about radiobiology (but certainly not what we "should know" because that would be wrong!!! -- I still can't get over that). They fail to understand or care about the difference between true knowledge (retained conceptual understanding of fundamentals) and binge-purge crammed factoids for exams. The latter is used for weed-out purposes. To say that we "would know" that is simply a lie. The exam prep cottage industry is cheering.

If there was any remaining doubt, radiation oncology has officially joined that sad list of professions that eat their young.
 
Yes, I fully realize that asking for a re-grade or a mid-year exam is a grasping at straws maneuver, but in all honesty I feel desperate to do something to get back on track after failing twice.

Here is the text of the email reply I got from Dr. Wallner himself (likely canned):

Your query to the ABR office regarding your performance on the recent physics and cancer and radiation biology qualifying examinations has been referred to me for response. I can understand your disappointment with your performance and certainly regret the outcomes, but I can assure you that the scoring was fair and unchanged from previous years. The ABR uses a criterion-referenced standard similar to almost all high-stakes professional testing organizations. The physics and cancer and radiation biology standard is set by panels that include physicians both in academic and private practice, and at various career stages.

I can understand your concern regarding the examination, but a significant majority of test takers did score above the cut score and passed all sections of the exams.

There is no appeal process for failure, as all grades are checked and re-checked multiple times before publication.

We have had requests for additional guidance regarding possible study references, and working with the various committee chairs, that material will be published in the near future. Performance results for the entire group of test takers will be published on the ABR website in the near future.

There will not be another exam date this year. You will receive an invitation for next year’s exam 3 months prior to the exam.
 
RO28, thank you for posting this.

I can't believe what I just read:

"The physics and cancer and radiation biology standard is set by panels that include physicians both in academic and private practice, and at various career stages."

Q: Since when did this evolve from the physics and radiobiology exam to the physics and cancer AND radiation biology exam?
A: This past July. Without our knowledge.

We walked into an exam that we thought was a radiation biology exam. What we found was a cancer biology exam. I don't know about the rest of you, but my program taught me classical radiobiology, not modern advances in cancer biology.

He just admitted it in his own words. And then went on, of course, to patronize you. "Significant majority?" Quantify that.

This whole thing stinks so much.
We got screwed by a surprise change in the test. They know it. Let's hope they deliver on their promise to tell us what we need to study to pass.
 
I don't think anyone has mentioned this yet but PearsonVue is also fraught with problems. They have been challenged tons of times for errors and problems with their testing, resulting in measures ranging from class action lawsuits to entire states firing them from administering the standardized tests for the state. They make a lot of money from us having to retake exams.
 
Could someone please explain to me (simply) how the Angoff Scoring Method works? I am under the impression that each question which is created is reviewed by experts in the field to predict how many qualified candidates would answer the question correctly. And from this a 'cut off' score is created. My real question is... if the exam this year had new questions then the ABR set a cut-off based on how many people they thought should get these new questions right (if they were qualified). If for some reason a HIGHER THAN NORMAL number of individuals failed this year, then maybe they were off with their assessment of these questions? If someone could clarify I would appreciate it, because the ABR keeps essentially saying there is no way the Angoff method could have been wrong (essentially implying that the issue lies with the residents not being as intelligent or as prepared for the exam). However, in the article he wrote in PRO he specifically says:

'The essence of the Angoff standard setting procedure is to opine the number of individuals who would answer a specific question correctly (not the number who should answer it correctly). Every item (question) is judged individually on this basis, with no preordained intent for outcomes. Therefore, the scoring "target" moves, in any given year, based on this judgement in relation to the difficulty of the unique set of items on a particular exam.'

Couldn't this mean that they were just off this year with their judgement on the difficulty of the questions on the exam?
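To illustrate what I mean, here is a crude simulation (every number is invented) of how even a modest, systematic overestimate of what candidates "would know" can inflate the fail rate while the procedure itself stays "unchanged":

Code:
# Crude simulation (all inputs invented) of how a biased Angoff panel can
# raise the fail rate even though the averaging procedure never changes.
import random
random.seed(0)

N_ITEMS, N_EXAMINEES = 200, 180

# True probability that a borderline candidate answers each item correctly.
p_borderline = [random.uniform(0.50, 0.85) for _ in range(N_ITEMS)]

def simulate(bias):
    # The panel's estimates, inflated by `bias`, are averaged into the cut score.
    cut = sum(min(p + bias, 1.0) for p in p_borderline) / N_ITEMS
    fails = 0
    for _ in range(N_EXAMINEES):
        margin = random.gauss(0.08, 0.05)   # most real examinees beat "borderline"
        score = sum(random.random() < min(p + margin, 1.0)
                    for p in p_borderline) / N_ITEMS
        fails += score < cut
    return cut, fails / N_EXAMINEES

for bias in (0.00, 0.03, 0.06):
    cut, fail_rate = simulate(bias)
    print(f"panel overestimate {bias:+.2f} -> cut {cut:.0%}, fail rate ~{fail_rate:.0%}")

# With these made-up inputs, overestimating "would know" by just a few points
# per item is enough to dramatically raise the fail rate.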


I emailed Dr. Wallner directly asking what the scores were on Rad Bio and Physics this year. I stated the previous pass rates from 2014-2016 (92% for physics) and the data we uncovered from 2004-2015 (88.9% for physics). I asked him specifically what the scores were for this year and when they would be releasing them. I have copied the response I received below:

'We can understand your dissatisfaction with your performance in the exams this year. Overall results will likely be published in the near future and we are working with the physics and biology committee chairs to develop improved source material and study guides.

Additional “clarification” would be of little benefit since the exam development and scoring process did not change. The only change was the lower pass rates for physics and biology, which we believe may be significantly related to factors outside of the control or purview of the ABR.'


I don't want to add fuel to the fire and I don't want to read into this reply too much. However, this does seem to admit that there were lower pass rates for physics and biology this year and that they believe it was due to factors outside of them (the ABR), nothing to do with their test.

It wouldn't be right for us to just call the ABR the villain in this situation. If they provide us with information saying that the pass rates were unchanged from prior years, then there isn't much more to say. However, I don't want us to be the victims in this either. If failure rates were significantly higher, then COLLECTIVELY (ABR, ARRO, ADROP... whoever needs to be involved) we should be reviewing this issue and finding a fair resolution. It shouldn't just mean that we all suffer because of their mistake.
 
His statements are incongruous.

He is adamant that the "exam development and scoring process did not change," but he then says that the lower pass rates must be due to something else (i.e., lower resident quality or poorer teaching from the academics I guess -- which is what we have been hypothesizing as their motive).

It is overwhelmingly obvious that something DID change with the ABR's bio committee. Sure, you can say the scoring process didn't change because you used the "Angoff method" again. But by nature of this method, there is nothing stopping the content writers from deciding on their own that residents "would know" highly detailed cancer biology that is not taught in the traditional radiobiology curriculum. If you want to change the emphasis and content of the test, fine. But this is not the correct way to do it - to slap a quarter or more of trainees with a board failure scarlet letter and waste their time and money. It is ridiculous to out of the blue decide that residents "would know" this material by PGY-5 without communicating this to programs years in advance. On this thread we have a radiobiology educator admitting to being left in the dark.

Additionally, there was no ASTRO study guide this year for the first time in over a decade. How is that not a change? By his own admission, there was a lack of instruction available for adequate preparation. There was no indication that this was now a predominantly cancer and cellular biology exam.

It's disgusting how they are trying to use technicalities to wash their hands of this mess rather than conduct a proper investigation. If they want to make the claim that the lower pass rates were related to factors out of their control, they are going to have to prove that. Without this proof they need to admit their content writers erred when they decided what residents "would know" (the implication is that the Angoff method is not susceptible to this kind of error or bias from the "experts," which I think we all can see is ridiculous). If their system is so correct and proper as they say, then they should be able to bring in third party experts and repeat the Angoff method and have the cut score be reproducible. If there was a major drop in rad bio not consistent with the 92% mean with 4% standard deviation (e.g., if it were something like 70% which is the rumor), then I think they would rightly have to reassess the validity of their test by proving the cut score was reproducible. Without doing this, it is presumptuous, to say the least, to imply that dramatically lower scores are simply the result of lower resident quality.
 
It looks like he is saying that the exam did not really change from prior years, and that the only change was the lower pass rate. So he is acknowledging a lower pass rate, which he attributes to "outside" factors. Without directly saying it, he is saying that it was the examinees' failure and not the ABR's. Unfortunately, there is probably no way to determine if the exam has or has not changed, but at a minimum (as others have said) the ABR should provide a guide to the content of the exam (as they should have done in past years).
 
radiaterMike -- Correct; because the exam content is not available for public review, they can only ask us to take their word on faith that the content did not change. I think most of us were very surprised by the amount of molecular biology minutiae on the exam, and I don't see many reports from prior years about this concern. And we can look at prior ASTRO study guides, see that that stuff wasn't there, correlate with historical failure rates, and really have to question the claim that "the exam didn't change." Without empirical evidence that resident aptitude was dramatically lower for the class of 2019, it is irresponsible of them to simply ignore a failure rate 4-5 standard deviations away from the historical average. That would indicate a problem on their end, no? 1-2 standard deviations, OK. 4-5 or more? Somebody sucks at stats. We will have to wait. I think we have called them out enough for trying to hide the data. They need to publish it and we can go from there.
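To put actual numbers on that last point, using the historical figures quoted upthread (≈92% mean pass rate, ≈4% SD) and the rumored ~70% this year, which is unverified:

Code:
# Historical pass rate quoted upthread: ~92% mean, ~4% standard deviation.
# Rumored 2018 pass rate: ~70% (unconfirmed).
hist_mean, hist_sd = 0.92, 0.04
rumored = 0.70

z = (rumored - hist_mean) / hist_sd
print(f"{z:.1f} SD from the historical mean")   # -5.5

# If annual pass rates really behaved like a normal(92%, 4%) process, a value
# 5+ SD out would be expected far less than once in a million exam cycles --
# i.e., "the residents just got worse" is an extraordinary claim.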
 
I think there needs to be a concerted effort on the part of all program directors, as well as ARRO, to get some answers, because it is really not fair to have so many more people fail this year. It is obviously not as simple as "don't worry, you can just take it next year" -- it means spending tons more time studying physics and bio again, time that should be spent on things more relevant to our careers. We also need to take the clinical writtens, and having to take all three exams, instead of just the clinical exam (since these people really should have passed physics and bio, as they would have in previous years), probably hurts people's chances on the clinical exam!
 
I do feel for you guys. This sucks. I would have a lot of frustration to vent if I were in this situation as well. I think the ABR dropped the ball. I am skeptical of the nefarious conspiracy theories being floated here. I think the quality of their exams has been getting worse over the last few years. For years I heard the major difference between ABR and in-service exams was that ABR exam questions are heavily vetted. Not so on the exams I took. In addition to minutia, there were also just bad questions. I remember one on the clinical writtens asking if the 5 year DFS for X cancer was 8, 10, 16, or 25% according to X study. That is ridiculous. Especially since three similar RCTs have been done on the topic and all of them fell within the range of the first 3 options. Phrasing a question this way implies that one of them is the "true" answer when in fact 75% are reasonable based on large, well-done RCTs.

In regards to rad bio, I have a few comments and suggestions based on my experience with the exam and that of several colleagues and residents who have taken it over the last few years. I recall feeling very fortunate that I had a PhD in pharmacology and cell biology because there were a number of detailed questions that I would probably never have come across otherwise. Things like where is X protein located in the cell. I am not here to judge the value in these kinds of questions. But the reality is they are here and you have to adapt to them if you want to do well.

So what are we to do as educators and test takers? I would suggest that the specific proteins and pathways being tested are not as random as they seem at first blush. Through my own experiences, word of mouth, and other forums I have come across specific recalls from recent tests (I specifically deny actively seeking them out and I have not shared specifics of said information). Virtually all of these seemingly random proteins are directly related to something being evaluated clinically in early phase trials. The specific questions are actually very straightforward. The main reason they are difficult is that a lot of these agents/trials are not specific to radiation and many residents have never heard of them. The questions might as well be written in Russian for those poor souls.

At our program, we are going to start more carefully reviewing the classes of small-molecule inhibitors making it into clinical trials and make sure our residents have at least heard of them and have a big-picture understanding of the general pathways involved. If I were preparing to take the boards next spring, I would peruse abstracts from ASCO and ASTRO, take note of the different classes of targeted therapies being tested in the clinic, and develop some peripheral understanding of how they work. It's not practical to expect that you can or should familiarize yourself with all of them. But if you have at least seen them, you can hopefully move from a complete blind guess to a 50/50 shot.
 
As one of the first people to post in this thread after learning my fate, I just want to thank everyone for the discussion. Now that it's been a few more days, I'd like to share my thoughts. Feel free to disagree, I'm just one resident...

1) We need to learn the pass rate for each exam at a minimum. If the pass rates were close to historical averages, then we need to enjoy our serving of humble pie (in addition to an updated rad bio study guide). If it was very different, then we deserve more answers. As another poster mentioned, it is unlikely that the quality of the trainees just drastically dropped one year. It would be nice if we could also learn what qualifies as a 'pass'. Is there a mandatory minimum score in each section? Can a poor performance in a single section be the cause of a fail?

2) I'm skeptical about the ABR, but I am also worried that as residents, we may be undermining our own efforts through some of the language we're using. I've sat on enough committees in medical school and residency to know that older physicians enjoy making cracks about "entitled millennials", so I think it's on us to remain professional with relatively straight-forward questions (e.g. what were the pass rates?) and requests (e.g. provide us with updated study guides) moving forward. Until we have more info on the pass rates, I would kindly suggest we hold off on discussions of starting our own board and/or suing the ABR. Just my $0.02.

3) I'm still not sure how to study for next year. I took the advice of other people on this forum and my co-residents in designing my study plan, and was scoring well on Raphex (>75%). As for rad bio, I reviewed study guides going back 10 years and read Hall. I worry about a repeat performance in July 2019 if I don't change something up, but I'm not sure how to change it up. Maryland course? Osler course? Commit Hall to memory?

4) Everything will (probably) work out. If you look at the PRO article in which Amdur and Lee review pass rates, it looks like there is a see-sawing quality to scores year to year. A low-scoring year is followed by a higher scoring year. Does this make sense? No. Should we rest on our laurels? No. But I sincerely doubt there is some cohort of non-board certified radiation oncologists who somehow couldn't hack it due to rad bio and physics, and I don't think that will be our fate, either. It's just a matter of spending more time, energy, and money than any of us anticipated. This sucks, but we'll get through it.
 
I appreciate your post and thank you for starting the conversation. I will add, though, that if the pass rates were indeed very different from previous years, it doesn't just suck to have failed, but it is also a truly unfair situation that will negatively impact the remainder of our time as residents. If people who didn't pass, but would have in previous years, now need to spend a significant amount of time reviewing biology and physics, that is time that would be better spent studying clinical radiation oncology and growing in ways besides a re-review of physics and biology. As we all know, studying for these exams takes a lot of time and effort, and to have to unfairly do that again is a serious problem. Again, this is assuming that the fail rates were substantially higher this year compared to other years. I agree that I don't imagine there is a long-term effect on our futures, but there would certainly be a short-term effect on the quality and value of an entire final year of residency. If for some reason something was different this year, maybe poor question choices or insufficient study material provided by the ABR, then nobody on this thread is at fault and we shouldn't be penalized as a result.
 
And btw, I have heard of program directors reaching out directly to the ABR which I think is a good concerted effort. I would encourage anyone who was unhappy with the outcome of their test to have their program directors similarly get in touch with the ABR so that they take these concerns seriously
 
This concept of programs instructing students with exam recalls is outrageous. Really. Per documents available on the ABR's own website, ~40% of exam questions are recycled from previous years. It would therefore not be surprising if a program with an explicit policy of having residents collect recall items and pass them on, formally or informally, to the next year -- building a database with a decade or more of exam recalls -- ends up with residents passing the test simply by memorizing specific questions while having no idea about the fundamental concepts the rest of us studied.

The end result is a test that has become extraordinarily difficult to pass without cheating or a PhD in molecular biology.
They have lost any semblance of what we "would know" because what we "would know" or "should know" is totally corrupted by the percentage of students studying with large databases of recalled questions, which, by the ABR's own admission are recycled. As a result, the people with the recalls miss the new questions but continue to make the cut score because they are acing enough of the recycled ones to pass.
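Just to make the recall advantage concrete -- the ~40% recycled figure comes from the ABR documents mentioned above, but the 70% cut score is purely an assumed round number:

Code:
# How much do verbatim recalls help? (~40% recycled per the ABR documents
# referenced above; the 70% cut score below is assumed, for illustration only.)
recycled_frac = 0.40
cut = 0.70

# Someone with a full recall bank gets essentially all of the recycled items.
# Accuracy needed on the remaining NEW items to still reach the cut:
needed_on_new = (cut - recycled_frac * 1.00) / (1 - recycled_frac)
print(f"{needed_on_new:.0%} on new material")   # 50% -- near coin-flip territory

# Someone without recalls needs 70% across the board, new and recycled alike.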

The ABR has already been burned on this on the diagnostic radiology side.
Now it seems like radiation oncology could be the new scandal.

Good work ABR. Investigative journalists are being contacted.
 
Long time lurker; grateful for the discussion on this thread and on this forum over many years.

We've seen a couple of strategies that have and have not worked (+/- the random chance of it all) and I wanted to share mine for next year's test takers. I do not have a background in biology or physics and passed both. My program does have dedicated annual courses on both topics. To the best of my knowledge none of our faculty are test writers for these two exams. We don't use recalls, and I haven't known our seniors to sit people down to go over or recapitulate test questions even casually. I did hear from many of them, though, that last year's tests were very easy, and I was anticipating this year's exam to be more difficult. As for the seesaw swings in the pass rates on these exams, I think part of it might be that people who do not pass and come back to retake have a higher chance of passing, so every other year's pass rate is somewhat "inflated" by second-time test takers.

1. Rad Bio: To me the questions on this exam were of two varieties -- immediately obvious, or esoteric and unknowable unless you happened to have read the trial that had a radiobiological correlate or something of that nature. The immediately obvious ones were out of the Hall book or similar in concept to what was reviewed in the study guide, and could be answered within a few seconds. I'd say a slight majority of the exam content was covered by the widely used study materials. The remainder were, as other people have said, random factoids. I surmise that if you got every single one of the "immediately obvious" questions correct you had a decent chance of passing, and I only say that because I don't remember encountering a question where some sort of special knowledge or expertise saved the day for me. By the end of my studying I was scoring in the high 80s to low 90s in percent correct on the study guide (many with repeated questions).

2. Physics: This felt like a more fair exam to me and was similar to Raphex with a few trick questions thrown in that I had to guess on. I used RAPHEX, the Chang book and McDermott. Did not read Khan except to try to remember some of the finer points about QA. I actually felt decently prepared for physics and passed comfortably. By the end I was scoring in the high 80s on my RAPHEX's.

I commiserate with many of you who are wondering how you are going to study for this a second time, because to be honest, if I had not passed, I'm not sure what other materials I would have used, particularly for biology as I felt that I had a pretty solid grasp of radiation biology. I think asking residents to review ASCO/ASTRO abstracts is unreasonable; these aren't even peer-reviewed to be honest and should not be making an entrance into standardized testing. In any case maybe you'd spend a whole year doing this, actually remember, and maybe get just a couple of questions right that you might have missed otherwise.

Good luck to everyone. I am anticipating and hoping that next year's examination will be easier.
 
I certainly hope that the opportunity to address the larger points is not lost here. Do we really need three separate board exams AND orals? What exactly does one need to know to be a competent CLINICAL radiation oncologist? Some of the views that have been expressed are silly. People either double down on nonsense, invalidating anything they say, or negotiate with themselves and triangulate to a useless, achieve-nothing-about-the-big-picture "moderate" view: happy to just be able to pass what they know are already useless tests once again and move on, because it is no longer their problem.

The individual responding to emails is telling you exactly who he is: a patronizing, out-of-touch individual who will never do the right thing. I don't get the reverence towards these “experts”. The answer for situations like this is to clean shop. It's time to stop being at the mercy of baby boomers. These people will never be on our side. Make rad onc great again.
 
I think asking residents to review ASCO/ASTRO abstracts is unreasonable; these aren't even peer-reviewed to be honest and should not be making an entrance into standardized testing.

The whole reason this thread exists is because this ship has already sailed. It is not reasonable to change the formula and expectations without letting educators and test takers know, but it looks like that is exactly what happened. I took it several years ago and can guarantee there were more than a handful of questions related to targets and drugs that have not made it beyond phase 1/2 testing.

Look, I'm in the dark like the rest of us as to what the best thing to do is. But if you're hovering at the cut line, a couple of questions can be the difference.
 
Totally agree. I think it's very reasonable to be familiar with the big basic science abstracts from ASTRO and ASCO as well as understand all of the hot new systemic agents, namely targeted agents and their pathways. I disagree that every question was either unknowable or "immediately obvious." There were lots of questions testing a basic concept but with some weird twist, or otherwise poorly written in a way that made you doubt yourself (I don't know if this was intentional or not). It's not enough for this exam to say that you are just going to be an expert at Hall and not worry about the low-yield studying to try to get you the unknowable stuff (well, unless you're from one of these so-called recall programs). You could probably pass this way, but there was so much "out there" stuff on the test that it leaves little room for error (and I'd wager there were a LOT of people who passed just barely above the cut score).

Bottom line, while we can and should continue to fight the good fight here and offline, we can't count on the test being easier/more relevant next year and have no choice but to go forward and need to dedicate extensive time to memorizing the low-yield out there stuff including the molecular bio "alphabet soup" and the myriad new systemic agents. Since the exam is apparently scored based on "expert" opinion of what we "would know" they should take into account the stupid number of failures this year causing a more intensive study effort, thereby raising the bar of what we "would know" (can't we see how stupid this "would know" "should know" stuff is yet?). If the ABR virgins want to roll the dice next year and study for 2 weeks before the exam with Hall and the guides, go for it. Sadly the stakes are much higher for me 🙁
 

I agree, we should keep fighting the good fight. It is not going to change any other way, and like many I have to take this again next year. I'm tired of being screwed by baby boomers harkening back to the good ole days and ready for change. It's taking all of my willpower not to send the strongly worded email that I wrote to the old white guy in charge. I would be willing to stick my neck out, unanonymized, and write a PRO editorial if that would help the effort (since my job prospects are not on the line).
 
I actually did send him an email, in response to the one I got from him through the ABR contact email. I expressed my own and others' surprise and disappointment, how the answer does not lie in this being a year of less qualified residents, and how the ABR needs to look at this very carefully and do something about it. Just saying "oh, don't worry, you can take it next year" is not a solution. In our last year of residency, we should not have to wrongfully re-dedicate all that time and effort to studying bio and physics again, when this is actually our last chance to learn everything we can about clinical radiation oncology while we are still residents. I even made some suggestions on what could be done, in short: conceding that it was a bad test this year and just passing everyone, holding a mini makeup test halfway through the year so we don't have it weighing on us all year and getting in the way of clinical boards studying, or even offering a course/module that we have to take, with quizzes or assessments, so that they can be confident we know what we should know without having to study all over again for the tests.
I encourage everyone to, in a respectful way of course, write to the ABR and Dr. Wallner, and importantly have your program director email them. Mine already did.
 

I like all those ideas, I’ll send an email reiterating those suggestions today too.
 
To get answers from the ABR you have to think like the ABR. Jump into Drs. Wallner and Kachnic's mindset. At the slightest bit of public questioning by Amdur and Lee, they wrote a word salad of a mostly incomprehensible rebuttal (I have read it several times and actually still have no idea what they are saying). So their currency is justifying the ABR's existence, and they are hypersensitive to public criticism.

So with that in mind, how do you get answers? You write about your dreadful experience with the exam -- how you felt when you saw those questions out of left field, how it impacted you personally and professionally, the cost of the retake, the lack of answers from the ABR, missing opportunities for learning what matters, missing family time, etc. Then you send it to KevinMD, Doximity, ZDoggMD, and others for publication. These sites live off content that physicians produce. If you have seen these sites, they thrive off Doctors versus The Man stories. This story is perfect. Then sit back and watch the ABR start to release statements, failure rates, and how they will address the situation (see Wikipedia: public relations 🙂 )

I did not personally take the exam this year. I am certainly sympathetic with those that did and happy to help with strategy and writing. Would work best with several people involved. PM me if interested.
 
I agree that a robust effort should be made. I wrote to Dr. Wallner and we have been going back and forth. Essentially, he is justifying the Angoff statistical method as infallible, saying that the questions come from real doctors so they are certainly appropriate and relevant and they are looking into what happened. He did say they are working with subcommittees of the ACGME to improve study material and curricula moving forward.
What he DID NOT address, and which I am continuing to press him on, is this: beyond what is done in the future, what are we doing about THIS year's test? We want to see improvement for next year and every year after, but our primary concern is what seems like an unfair experience this year. If there is a large percentage of people who failed who did not deserve to fail, they should not be condemned to an awful and distracted final year of training. It IS bad for training, for emotional wellbeing, and for our careers (indirectly). What are they going to do to right this wrong?
 
If rad onc programs are collecting and using recalls, they really should be ashamed of themselves, especially given the ass-kicking radiology took a few years back for this same unethical behavior. Sadly, I know exactly why and how this practice starts though: nobody, least of all the educators, knows exactly what sorts of things are going to show up on this exam and in what depth, and a scant ABR outline - assuming you even know where to find it - that says things like "residents need to know about DNA damage and repair pathways" ain't exactly specific enough.

My own approach is to ask residents who just took the boards a single question: "Was there anything on this exam that was completely unfamiliar to you, and that I don't already discuss in our course?" And if so, let's say for example, suddenly there were a bunch of questions on cancer nanotechnology and I don't have a lecture on that topic, I either find somebody familiar with the topic to give such a lecture the next time the course is offered, or else I sit down, research the topic myself, try to figure out what I think is the important content that a resident reasonably should know, and put together a new lecture myself. (Note that what I think is the important content isn't necessarily what the ABR thinks is the important content, although luckily, given my years of experience and that I'm a clinical radiobiologist by training, I'm usually fairly close.)

Sure, a lot of the more senior educators like myself do not keep up as well as they should (if at all) with more recent developments in radiation/cancer biology, let alone promptly incorporate this material into their courses like I do, because basically everybody's busy doing research and fighting for almost-non-existent grant funding, so the responsibility of teaching residents is WAY down on the priority list for most. So how are they supposed to know what the next big thing is they should be teaching, particularly if their own research interests aren't particularly clinically oriented? And it's not as if the ABR even has a clue who most of these people are, let alone bothers to find out and contact them when they decide to arbitrarily change the curriculum. Not really making excuses for my colleagues, mind you, because if they took the responsibility of teaching seriously enough and had all the time in the world, they'd make the effort to seek out this information. Nor am I letting program directors off the hook either, because they sometimes do glean additional information about the boards...and never bother to pass it on to their own educators.

Admittedly, my situation is rather unique though in that I am not encumbered by an active research program, constant grubbing for money and supervising a lab full of people, so I do have the time to devote to keeping my course current. In fact, most people don't realize exactly how much time and effort it takes. Answer: a lot.

And who pays the price for all of this mess? The residents. Definitely not fair.
 

Sounds like a good idea. Also here’s to hoping there is a member of the resistance in Wallner’s office willing to write an anonymous op-ed in the New York Times


 
How many others have written to the ABR/Wallner about this issue? Can everyone who was surprised by their results, and even others who were satisfied with how they did but appreciate the major problem going on, send them an email, first of course being polite and respectful and thanking them for the time and effort they are putting into looking into this, and then expressing your concern and surprise at this year's outcomes? Finally, ask specifically what is being done for THIS YEAR'S people. The responses I and my PD got were mostly about them looking into better study guides for the future, etc., but ignored us PGY-5s. I don't want to ruin my PGY-5 year with this over my head if I really shouldn't have to! Please, as many as can, write to them, being kind first and foremost, but then strongly pressing on what they are doing to right the wrong that was done to our year. It's a matter of time, energy, cost, emotional wellbeing, and possibly even career.
After you've done so, post here letting us know, so we can see approximately how many people sent him messages.
 

I wrote a pretty respectful email earlier, and disappointingly but not surprisingly received a carbon copy of the response you (and I think someone else?) received. Crafted a sarcastic reply about how I don’t blame them for not individualizing the responses given how many “fairly” graded residents failed the exam this year, but glad I never hit send.

Since everyone will get the same reply, maybe a well crafted rebuttal representing all of us (perhaps by PDs) addressing each of Dr. Wallner’s (easily refutable) points with data would serve us better.

Wouldn’t hold your breath about getting to retake the exam earlier. I know ARRO is requesting it, but I think the best we can do is ensure a fair exam next year, which frankly is all I want. Not holding my breath on that one either, though.

 
Wow, checking back in after many years to see my thread from 2013 still raging strong (albeit for all the wrong reasons). After reading through the last few pages of this thread, it sounds as if the ABR has outdone themselves in their ability to write absurd tests. My heart goes out to all of you who were negatively affected by this and I know the prospect of spending the entirety of your PGY5 year worrying about this stupid test is awful.

I would say, however, that the odds of the ABR reversing course and offering a blanket pass for everyone this year are essentially zero. That would require an acknowledgement of total failure on their part and would entirely undermine their status as the sole arbiters of our "readiness" to practice radiation oncology.

As unfair as it may be, I would advise everyone to move past the anger and denial phase and into the acceptance phase so that you can move along with life. I certainly support reaching out to the ABR, asking for clarifications, etc., but I would not expect anything above a cursory "explanation". What I will say is that I know plenty of excellent physicians and physicists who have failed some part of these silly exams, and all of them have moved on with their lives and are happily practicing in this great field. Almost all of them can look back and laugh at the silliness of it all.

I guess my best recommendation, unsolicited as it is, is not to internalize any of this.

Good luck to all.
 
yo drop that number then, why are you trying to be part of the coverup?

what was it - 25% fail?
 
The field is so rotten with corruption, nepotism, connections, and agendas. Some guy up there already heard the pass rate yet covered it up himself, lmao. Think about any time anybody writes anything against the powers that be, like the Red Journal “blood bath” or the recent Wallner article: the agenda is overwhelmingly against anybody wanting a change for the better. They think most programs aren’t even worth existing. I know for a fact “top” programs have GIGABYTES of recalls going back decades for boards. Some of you know what I’m talking about and know I’m correct. I’ve seen it myself. We need these people out. Being around academics makes me so sick. So many rotten people in academics win teaching awards yet are truly horrible, disasters for their departments as people and physicians. We all know these people; they are a mess, can’t even handle having more than 7 people on treat, and people dread working with them. SAD.
 
So I stumbled across this article written by Wallner in 2015, which is rather eye opening:

https://www.redjournal.org/article/S0360-3016(15)00051-6/fulltext

I don't see how he can look at us with a straight face and tell us the exam hasn't changed and that the problem is due to our competence. It's simply a lie.

I think this adds a lot of credence to what we have all suspected. There was a concerted effort to change the rad bio exam into a cancer biology exam. So why didn't the ABR tell us, tell PDs, or tell rad bio instructors? I think it's fairly obvious that this is a controversial issue (whether residents need to know molecular and cellular cancer biology at the PhD level in order to be competent clinicians), and their solution was to sneak content changes into the exam while concealing the pass rate, rather than tell us beforehand that the exam had changed. It's shameful.

Educational issues: Radiation oncology currently has an apparent advantage over other oncology specialties in the strength of its science and its high ratio of trained physician scientists (MD/PhD) to primarily clinically-oriented (MD only) providers (9). We must capitalize on this through creation of innovative training programs that enhance that ability of clinical scientists to build research-oriented careers. In this regard, departmental priorities and funding patterns may require modification. We must expand and reinvigorate our scientific base and build on this foundation to train young clinicians in translational radiation oncology. We need our mid-level and senior academic faculty to be better informed regarding the latest biological advances, which will support high quality clinical trials and grant applications. There must be a greater emphasis on the molecular and cellular basics of cancer biology in radiation oncology departments.

We walked into an exam that our teachers had spent the first three years of residency preparing us for, the same way they had for decades. And what we found was this new PhD-level cancer biology exam emphasizing precisely the kind of molecular signaling pathways that Dr. Wallner references in this article. This was not by chance. It is just a LIE to say that the exam was the same. How were we supposed to know these pathways in such detail without being told we needed to know them? How could he possibly think we "would know" these things without our radiobiology instructors even knowing that they would be tested?

Dr. Wallner, not every patient can be treated by a physician-scientist radiation oncologist at a major academic research center on the coast. There is an enormous need for quality oncologic care in small and rural communities throughout the country. We are doing a huge disservice to the field if we are now emphasizing translational and basic science training over clinical training. We do not need to know the finer details and trivia of every signaling pathway in existence in order to be a competent clinician, which is what your exam is supposed to assess. You might as well say that we need to know the finer details of quantum physics and throw in questions asking us to calculate Fourier transforms on the physics exam next year. Understanding the fundamental mathematics of electromagnetics and particle physics is, after all, crucial for a doctor to treat a patient with a linear accelerator, right?

You created an exam that was only passed by people with a prerequisite cancer biology background from a PhD or extensive basic science research in residency, or those with access to recalls to ace recycled questions. For the remaining 1/3 of us, you have wasted our time and impeded our clinical development in our final year of residency. Stop lying to us and telling us this exam hasn't changed. Stop patronizing us and telling us we are simply dumber. Your generation is quick to call us all complainers when we say something isn't fair. That doesn't preclude a situation where something really, objectively, is not fair. Which is what has happened here.
 
And furthermore, I have to believe that the extremely low physics pass rate is not by chance either. Artificially raising the bar on the physics exam so that there was a high number of failures on both exams certainly would support the conclusion that the failures were due to dumber/lazier residents rather than a dramatic change in the content of the biology exam.
 

Here is the second letter I sent to Wallner, still no response yet. Maybe certain bits pushed too hard, but I’m a darn stubborn person and not ready to give up to some dinosaur. I have also reached out to ARRO, who told me they received about 100 replies to a tweet they posted about a week ago. I’m a freaking attending now and have finished residency; I’m so over dealing with this crap.

Dear Dr Wallner,

With all due respect, I do not think you understand my concern and situation. I currently cannot sign prescriptions or sign off on IGRT imaging and will not be able to do so for another year now. My state has an additional X-Ray Operator/Supervisor Certificate that I am required to obtain through the Department of Public Health and I cannot get this unless I have my ABR exams passed. I have spoken extensively with the Department of Public Health and there is no way I can take a distinct exam or obtain this license in another way. I am working in a smaller private practice and it places significant burden on my colleagues and my practice that I will not be able to do this for another calendar year.

I know I am not alone in expressing these concerns, and I do not believe that simply allowing us to re-take the exam an entire year from now addresses the underlying situation. At this time I do not know how close I was to passing, and I do not know how to address my weaknesses when I am able to retake the exam. The sections in which I need to improve include basic physics and advanced dosimetry. These categories are sufficiently vague that they do not allow me to focus on the material I do not know. If I do not know what I don't know, how can I become a better radiation oncologist?

Therefore, I have several suggestions as to how to improve the situation going forward. The ABR could concede that this year's exam was flawed and pass all the residents due to poor testing material. I do not think that the poor performance on this exam is explained by a declining quality of residents in the field, as your PRO editorial suggests, and I think judging resident quality solely on exam performance is a myopic view of current residents and recent graduates. A good physician is defined by many qualities other than exam scores. Another option I would like the ABR to strongly reconsider is offering an earlier re-exam date, coupled with a course/module for both radiation biology and physics with built-in self-assessment, so we are able to gauge our performance and pinpoint areas for improvement. We should be confident that if we dedicate adequate time to preparing for a repeat exam, we will know the material well and will succeed.

I understand that the majority of candidates pass the exam, but what exactly is that percentage? How can you define a majority without numerical data? As a physician who has now failed a portion of the exam twice I realize I am in the minority of candidates, but I truly felt confident with my performance after taking the exam in July and do not know where I missed the mark. I want to work with the ABR going forward to make sure I pass when I have to re-take this exam and that this process is improved for those who follow me.