Physics & Radbio

This forum made possible through the generous support of SDN members, donors, and sponsors. Thank you.
It's probably the same issue in RO as well from a clinical standpoint. I trained at a smaller academic program in the middle of nowhere but it was well established with plenty of pathology including peds, srs/sbrt, interstitial gyn/h&n/prostate brachy etc.

What has happened with some of these newer smaller programs, afaik, is that they aren't getting the same breadth of cases, leading to "fellowships" in things that many of us did during residency (igrt, palliation, srs etc). Brachy, proton, and peds seem defensible as fellowships, and that's it imo

But is there evidence for that? Is it really residents from newer smaller US programs that are going into 'unnecessary' fellowships, proving that their experience is crap? I don't have a way to globally evaluate, for active and historical fellows, what residency programs they graduate from. Most RO fellowship websites don't list their fellows (in the way they list their residents).

My n=1 of somebody who did an 'unnecessary' fellowship (not brachy, proton, or peds) was a U of Wisconsin grad. Is that considered a 'newer smaller program'? They've had alumni since 1990.

I'm not saying it's impossible, but this continual "small programs don't have the breadth of pathology" line is kind of silly, especially for bread-and-butter cases (igrt, srs/sbrt, etc.). If there are any scenarios where volume is an issue, it's likely brachy, proton, and pediatrics.

 
The whole idea of residents taking a rad bio exam, and then telling their instructors "hey you really should cover these additional 5-10 topics, since they were on the test this year and we didn't learn about them" is in the same spirit as a recall, right? Why is that felt to be OK?

I still want to see the data stratified by the presence of a faculty member on the exam committee. I imagine that will also be positive. Obviously that doesn't fit their narrative of 'hurr hurr small programs need to get geud', but it's what anybody who knows anything about research would ask about their 'conclusions'.
To establish that this is actually what is occurring, i.e. better teaching on pertinent topics at larger programs, you'd expect residents from these larger programs to do better than residents from smaller programs on both old and new questions. Is that data being shared at some point? If that data shows that residents from smaller programs performed as well as or better than larger programs on new questions, what would the conclusion be in that case?
 
It's probably the same issue in RO as well from a clinical standpoint. I trained at a smaller academic program in the middle of nowhere but it was well established with plenty of pathology including peds, srs/sbrt, interstitial gyn/h&n/prostate brachy etc.

What has happened with some of these newer smaller programs, afaik, is that they aren't getting the same breadth of cases, leading to "fellowships" in things that many of us did during residency (igrt, palliation, srs etc). Brachy, proton, and peds seem defensible as fellowships, and that's it imo
I really think any department
To establish that this is actually what is occurring, i.e. better teaching on pertinent topics at larger programs, you'd expect residents from these larger programs to do better than residents from smaller programs on both old and new questions. Is that data being shared at some point? If that data shows that residents from smaller programs performed as well as or better than larger programs on new questions, what would the conclusion be in that case?
There is no way in hell that would ever be shared.
 
To establish that this is actually what is occurring, i.e. better teaching on pertinent topics at larger programs, you'd expect residents from these larger programs to do better than residents from smaller programs on both old and new questions. Is that data being shared at some point? If that data shows that residents from smaller programs performed as well as or better than larger programs on new questions, what would the conclusion be in that case?

It was mentioned that Kachnic had slides on it, but took them out "for the sake of time" and then asked her ABR handler during the Q&A if they could share that info.

I'm not holding my breath for an in-depth explanation of differences in answer rates between old and new questions given the response that was given at ASTRO.

If the bolded is the case, then the only difference could have been on older questions... which means that recalls (either formal, or informal in the sense of telling the rad bio instructors what to cover) in the bigger programs are the reason they passed and small-program residents didn't.

Before this year, I thought the curriculum for rad bio was simple: know Hall, do the study guide, study the rad bio practice exams. Know those things cold and you'll pass. I didn't realize I was supposed to be bugging my more senior residents about what things were not taught and how I would go about getting that knowledge.
 
There is no way in hell that would ever be shared.

Exactly. Which is why Kachnic looked over at the ABR censors when the question came up at the ARRO meeting and asked if she was allowed to answer it. They said they would have to look into it, which means they have to make sure the data doesn't show that people from big and small programs missed the new material equally and that the only difference was on the old material.

It is common sense that the large programs are benefiting from prior resident recall, whether or not they have explicit strategies for producing recalls or if it's just residents casually mentioning what they saw.

Recycled questions need to go away.

If you're not absolutely infuriated by the recall situation, then I have to wonder if it's because you have benefited from them or expect to benefit from them.

You're losing it. Your only recourse is to take her and the ABR to court. That's it. Present the yearly trends, show the discrepancy in 2018 and the retracted article; you will win in court. If not, at least you stood up for yourself the right way. Otherwise, study harder and forget this media campaign nonsense.

By all means, try and sue the ABR and get your $640 back. I don't think it's going to happen, and this isn't a good use of time and won't change anything. Unless someone can get a RICO suit into federal court (and if this isn't a racket, then what is?), which I have already been ridiculed for mentioning, lawsuits are not going to do anything.

If you want to do something, leak the recalls to CNN and put them online so the ABR can't use recycled test information that unfairly benefits large programs.
 
Exactly. Which is why Kachnic looked over at the ABR censors when the question came up at the ARRO meeting and asked if she was allowed to answer it. They said they would have to look into it, which means they have to make sure the data doesn't show that people from big and small programs missed the new material equally and that the only difference was on the old material.
In that case, hopefully the old/new question data is released, since arguments are being made regarding small vs large programs that would require support from said data. What was the purported rationale for not releasing it? I understand there may be some need to verify the analysis, but it seems to be in everyone's interest to ensure this issue is addressed as soon as possible, given that so many final-year residents need to retake this exam.
 
The difference in old vs new question percentage cannot be used to prove recall beyond reasonable doubt, unfortunately.

You can make the argument that nobody attempted to teach to the new questions at all, while “better” didactics at bigger programs give residents at larger programs an edge on the old questions.
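The two explanations above make different testable predictions, which can be illustrated with a quick difference-in-differences check. This is only a sketch: every count below is invented, and the variable names are mine, not from any actual ABR data.

```python
# Hypothetical illustration of the argument above. If large programs benefit
# mainly from recalls, the large-vs-small gap should show up on recycled
# ("old") questions but not on brand-new ones; if they simply teach better,
# the gap should appear on both. Every count below is invented.

def rate(correct, total):
    """Fraction of questions answered correctly."""
    return correct / total

# made-up aggregate counts: (correct answers, questions asked)
large = {"old": (820, 1000), "new": (600, 1000)}
small = {"old": (700, 1000), "new": (590, 1000)}

gap_old = rate(*large["old"]) - rate(*small["old"])  # large-program edge on old items
gap_new = rate(*large["new"]) - rate(*small["new"])  # large-program edge on new items
interaction = gap_old - gap_new  # excess gap concentrated on old questions

print(f"gap on old questions: {gap_old:.2f}")
print(f"gap on new questions: {gap_new:.2f}")
print(f"excess gap on old questions: {interaction:.2f}")
```

With these invented numbers, the edge is concentrated almost entirely on old questions, which is what the recall hypothesis predicts; if the real data showed roughly equal gaps on old and new items, the "better didactics" explanation would look stronger. A formal analysis would put a significance test or confidence interval on the interaction term rather than eyeballing it.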
 
Couple things I heard second-hand:

- Wallner wasn't there.
- Kachnic defended the position as described above. There was apparently some discussion of potentially moving the re-take somewhat earlier than the clinical exam (so the two aren't on back-to-back days), but nothing was finalized.
- Kachnic is apparently frustrated that she's getting blasted for this, as she's just the second in command.
- No plans for a mid-year re-take.
- Kachnic shut down suggestions to "simply increase the pass rate" to 85-90%.
- Only small programs (<6 residents) had statistically significant differences compared to large programs (>16 residents). Residents were split into 4 columns; only the first and 4th columns were 'significantly' apart.
- Some better study resources might come out. However, apparently when they state that 'cancer biology' or 'immunology' are testable concepts, they mean ALL of cancer biology and immunology.
- The Angoff method may have been influenced by increased clinical physician input into the exams.
- No analysis of what percentage of old exam questions (40-60% in any given year) were correctly answered.

My takeaways:
- What a joke.
- Apparently we're all only supposed to train at MSKCC, MDACC, HROP, Stanford, Emory, or other places with 16+ residents. Or expand programs until they hit 16 residents, at which point we'll just osmosis all the rad bio and physics we need from each other.
- Kachnic is likely just the fall gal, given Wallner didn't even have the decency to show up.
- Better have a PhD in both cancer biology AND immunology AND whatever other topics are listed in the 'study guide' if you wanna be sure you know enough material to pass.
- Wish somebody had discussed the hit piece Wallner and Kachnic co-authored in PRO as a conspiracy.

If your PD went to ASTRO, I encourage you to ask them directly what the proposed solution to any of this will be, as they can potentially post more details.

I heard direct accounts of the ADROP meeting confirmed by 3 different people who were there. The PDs were very supportive of the residents' positions; Bob Amdur in particular lit into Kachnic and ended up storming out of the meeting (not an exaggeration).
 
I heard direct accounts of the ADROP meeting confirmed by 3 different people who were there. The PDs were very supportive of the residents' positions; Bob Amdur in particular lit into Kachnic and ended up storming out of the meeting (not an exaggeration).

So what is the next step? Is this just over?
 
PMID: 29907504

Another commentary/editorial on Amdur and Lee, from UPenn. Mostly in the PW and LK camp; unsure what the point even really is.
 
So what is the next step? Is this just over?

The ABR has been presented with a number of requests and suggestions, including:

-retroactively re-visiting the passing score threshold such that the fail rate is reduced to 10% (unlikely)
-providing mid-year exam availability, like radiology (unlikely in the near future, as the ABR has a contract with Pearson and we are small potatoes compared to other licensing organizations that use them for standardized testing)
-providing an improved study syllabus (unlikely unless a faculty committee agrees to do this)
-providing more detailed exam feedback, e.g. something more useful than "advanced dosimetry" or "special topics" (slightly less unlikely)

I think there were some others. Whether LK, PW, and the new fish get the rest of the ABR on board with any of this is up for grabs.


The time is ripe for revolution my friends....who wants to raise some capital and start a competing board certification????
 
So they shut down even the most modest proposals, and we aren't even getting a good study guide. WOW, this is ridiculous and offensive. These people must be forced out. I hope it wasn't just one guy mad enough at ADROP (one walkout), otherwise this is going nowhere.

We need a new board and to break away from ABR.
 
So they shut down even the most modest proposals, and we aren't even getting a good study guide. WOW, this is ridiculous and offensive. These people must be forced out. I hope it wasn't just one guy mad enough at ADROP (one walkout), otherwise this is going nowhere.

We need a new board and to break away from ABR.

People have complained about the ABR and the board certification process in our field for some time (4 qualifying exams). I wonder if there has been any discussion of moving away from the ABR entirely and creating a new certifying board. Why are we still under their control? They have demonstrated no willingness to work with us on this-- the biggest fear for all of us should be that this was not a one-and-done process. The ABR is clearly going to do whatever they want to us. Maybe it will continue to just be the class of 2019 that gets screwed (higher failure rates on clinicals, tougher oral boards). Or maybe this is a trend the ABR plans on keeping up for future classes...

If they aren't willing to work with us, why haven't our leaders (such as Dr. Amdur who is repeatedly the name brought up for his support of residents) considered breaking away? Now would be the time... Fix the issue with the ABR, fix the board certification exams and give power to the leaders of our field who deserve it.
 
I have a close connection to a well-known CNN news reporter. I'm happy to leak a story and see where it goes. I still think continuing with litigious pursuits is valuable.
 
Leak it, but they must look into 21st-century corruption and how their guy is in charge of our boards. The whole story reeks of corruption.
 
Just found this article on ABR quarterly newsletter, The Beam. It was written for the Diagnostic Rads intro exam, but you will see highly consistent parallels with Rad Onc.

“Focus on Residents

Conjecture and the Core Exam

by ABR Trustee Donald J. Flemming, MD

2018;11[3]:45-46

The purpose of this column is to discuss issues raised by program directors and others following the release of the June 2018 Diagnostic Radiology (DR) Core Exam results. Concerns surfaced when candidates and program directors started comparing program pass rates with each other. There was speculation on social media outlets that more than one third of the candidates had failed the exam. There was further conjecture that the exam is purposely made more difficult in order to increase failure rates. Even erroneous graphs, based on incorrect assumptions about the distribution of results for test takers, were constructed and published on blogs. On the other end of the spectrum, one program had internally congratulated one of its residents for having the "highest score in the country" based on a "statistical analysis" of the results. Unfortunately, the assumptions made to run this analysis were not valid.

In actuality, the pass rate for the June 2018 Core Exam was slightly lower than average, but it was not statistically different from the mean pass rate since the first administration in 2013. The failure rate for 2018 was 13.0 percent, which is almost identical to the failure rate of 12.9 percent in June 2015. It is important to recognize that the passing score for all ABR exams is based on a criterion-referenced standard, and therefore every examinee could pass the test if they achieve the standard. A panel of volunteers that includes program directors reviews each exam question to create this criterion-referenced standard. They are asked to score the likelihood that a minimally competent third-year resident would correctly answer the question. The cumulative results of this process are then used to establish the criterion-referenced standard for passing the exam.

It is important to remember that the Core Exam content is carefully crafted to make certain that it is appropriate for residents finishing their third year of DR training. Unusual diagnoses or uncommon presentations of common diseases are purposely avoided. The exam questions are vetted by volunteers at the time that a particular category selects its questions for that year's exam. Exam questions are subsequently reviewed at a test assembly meeting by a different panel of volunteers with representation from each category in diagnostic radiology. Content that is felt to be inappropriate can be, and often is, removed at each of these review steps. There is no organized effort by --- or direction from --- ABR leadership to make the exam more difficult. Content is chosen solely because it is felt to be reasonable and relevant information that a future board-certified radiologist would know.

Finally, the importance of engaging our critical scientific minds when we evaluate exam results cannot be overstated. The emotions associated with poor outcomes can allow cognitive biases to overwhelm the rational mind. Creating conclusions based on sampling bias (talking to a few colleagues) or the inappropriate use of statistics (generating graphs based on erroneous assumptions) is unacceptable in a science-based discipline. We should be as critical, or even more so, of information that we read in social media outlets as we are of that in our peer-reviewed literature.

Essentially, the Core Exam is thoughtfully constructed and reviewed multiple times by more than 100 volunteer radiologists before it is administered. The exam is appropriate for candidates completing their third year of residency training and the results show that it clearly and objectively identifies individuals who meet or exceed the criterion-referenced standard. It is natural to want to blame the exam when failure occurs. However, it is much more constructive, for the sake of both the candidates and their future patients, to focus on a rational solution to correct identified deficiencies.”


Sent from my iPhone using SDN mobile
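For readers unfamiliar with it, the criterion-referenced (Angoff-style) standard-setting process the quoted article describes can be sketched roughly as follows. The judges, ratings, and exam size are all invented for illustration; actual ABR panels and item counts are of course much larger.

```python
# Rough sketch of Angoff-style standard setting, as described in the quoted
# article: each judge estimates, per question, the probability that a
# minimally competent examinee answers it correctly; averaging across judges
# and summing over questions yields the passing threshold. All numbers
# below are invented.

def angoff_cut_score(ratings):
    """ratings: one list of per-question probabilities per judge.
    Returns the cut score as an expected number of correct answers."""
    n_judges = len(ratings)
    n_items = len(ratings[0])
    # mean estimated probability per item, averaged across judges
    per_item = [sum(judge[i] for judge in ratings) / n_judges
                for i in range(n_items)]
    return sum(per_item)

# three hypothetical judges rating a five-question exam
ratings = [
    [0.9, 0.8, 0.6, 0.7, 0.5],
    [0.8, 0.7, 0.5, 0.6, 0.4],
    [1.0, 0.9, 0.7, 0.8, 0.6],
]
print(f"cut score: {angoff_cut_score(ratings):.2f} of 5 questions")
```

Because the threshold comes from these ratings rather than from a curve over the candidate pool, in principle every examinee could pass (or every one could fail), which is the point the article and later posts about there being "no curve" are making.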
 
In the main room, LK managed to say "for those who don't have the courage to come up front, there is an overflow room". It's one piece of the LK puzzle. Pretty sure she is the bully and a poor leader. Reminds me of Trump's attitude, to be honest.
Pretty sure you don't know Lisa Kachnic. I trained with her many years ago. One of the few attendings who cared about residents and took care of them, making sure they learned something. Perhaps you should act more professionally. You don't know her, so you can't be 'pretty sure' she is a bully. Perhaps you should re-evaluate your career objectives.
 
I have a close connection to a well-known CNN news reporter. I'm happy to leak a story and see where it goes. I still think continuing with litigious pursuits is valuable.
What is there to leak? The ABR uses a validated system to grade their exam. This is developed by psychometric techniques. There is no curve. The intent is to set a passing threshold. In theory all can pass or all can also fail. This is how they serve the public. Like it or not, most professional exams use similar systems.
The public does not like doctors. There is no scandal here. The ABR will win in public opinion as their job, like a licensing agency, is to protect the public from unsafe physicians.
 
The ABR has been presented with a number of requests and suggestions, including:

-retroactively re-visiting the passing score threshold such that the fail rate is reduced to 10% (unlikely)
-providing mid-year exam availability, like radiology (unlikely in the near future, as the ABR has a contract with Pearson and we are small potatoes compared to other licensing organizations that use them for standardized testing)
-providing an improved study syllabus (unlikely unless a faculty committee agrees to do this)
-providing more detailed exam feedback, e.g. something more useful than "advanced dosimetry" or "special topics" (slightly less unlikely)

I think there were some others. Whether LK, PW, and the new fish get the rest of the ABR on board with any of this is up for grabs.


The time is ripe for revolution my friends....who wants to raise some capital and start a competing board certification????
Good luck....you will never replace the ABR with a startup. You would need significant support from the senior radiation oncology community to do this. Remember, they all passed the ABR exams, and while the process has flaws, it's a necessary evil to prove you are a safe practitioner, and it's well established in the US. Even the NBPAS is only focusing on MOC, not initial certification. It's a tough process. You'll study hard. But you'll pass if you put the solid effort in.
 
Can't even imagine how malignant the oral boards will be in a couple years....traditionally that was the one test most of us worried about.

At this rate, they might as well just keep baseball bats in the examination hotel rooms for the examiners... would be a lot quicker for the examinees.
It's not malignant. Examiners are turning over all the time. In my experience it was fairly practical. Sure, it's stressful....but at the end I felt I learned a ton and turned it into a positive experience to do better for my patients.
 
Can anyone offer any updates in terms of things that are going on behind the scenes? Is anyone working on more articles, ways to leak information, are there program directors or chair people who are interested in taking over leadership of the ABR? Are there any other outlets on which more articles are coming out? Or are we all conceding that it's time to throw up our hands, give up and dedicate all this time again to studying what we thought we already knew?
 
Pretty sure you don't know Lisa Kachnic. I trained with her many years ago. One of the few attendings who cared about residents and took care of them, making sure they learned something. Perhaps you should act more professionally. You don't know her, so you can't be 'pretty sure' she is a bully. Perhaps you should re-evaluate your career objectives.

Defending LK because 'you know her' and then trying to suppress discussion under the guise of 'being professional' is not going to work on this site.
 
Defending LK because 'you know her' and then trying to suppress discussion under the guise of 'being professional' is not going to work on this site.


This is important. I don't "know" LK, but anyone can make mistakes (including her and the ABR). As physicians, when mistakes are made, we come clean. We don't cover-up or deceive. That's why there is distrust. I have yet to see a statement from the ABR, LK or PW accepting responsibility for this year's blunder. On top of that, folks are getting agitated that people are expressing their right to free speech on a public forum.

Yeah, that's really a field I will tell medical students to be a part of.
 
ARRO is continuing to work with the ABR, ASTRO, and ADROP. We met with Dr. Wallner and the ABR separately during ASTRO and feel that we made some positive progress. We will be in contact with them and hope to update residents soon. Thank you all for your patience and please email [email protected]
 
Thanks for the update, ARRO. If you are able to help us get some tangible results in this giant mess through your continued work with the ABR, I/we will be forever grateful and indebted!
 
Long time lurker, first time poster. This is from the latest red journal. (emphasis below is mine)...

Whither Thou Goest, I Will Go
Paul E. Wallner, Stephen M. Hahn, MD, Anthony L. Zietman, MD

The title of this editorial is from Ruth I:16 (King James version of the Bible), and as with Ruth and Naomi, the disciplines of diagnostic radiology (DR) and radiation oncology (RO) were inexorably linked from their inceptions through the post–World War II era. In the late 1940s, Juan del Regato, MD, one of the early proponents of RO in the United States, corresponded with the American Board of Radiology (ABR) to request a separation of the disciplines within radiology; however, it was not until more than 3 decades later, in 1982, that the ABR administered its last general radiology examination. Qualifications for that examination required a minimum of 2 years of postgraduate training in DR and 1 year in therapeutic radiology (TR) (as it was known until 1987), or vice versa. It is important to note that the ABR did offer a certificate in TR when it initiated certification in 1934, but between 1934 and 1974, fewer than 20 certificates in TR were awarded in any given year, with most years seeing fewer than 10 being awarded (1). This is in contrast to the almost 200 RO certificates currently awarded annually by the ABR.

This issue of the journal is devoted to the incredible advances in RO enabled by adoption of increasingly sophisticated diagnostic imaging modalities. Unlike the early days of the introduction of equipment such as computed tomography (CT) simulators, which were often shared by the 2 disciplines, newer devices are more typically housed within the confines of the therapy department without immediate access to diagnostic colleagues and their input. Ironically, in the era that saw the introduction of CT, magnetic resonance imaging, highly selective interventional radiography, and computerized volumetric reconstruction of 2-dimensional imaging into the diagnostic armamentarium—as well as the very early exploration of the use of these modalities for therapy—the 2 disciplines were actually growing farther apart rather than becoming closer partners in clinical care.

The practice of RO is increasingly dependent on precise delineation of target versus nontarget tissues. At the same time, anatomic variations, patients with multiple comorbidities, many with imaging implications, and changes in motion and technique can cause significant alterations in image perception. We are increasingly concerned that training in image analysis, as is currently performed in many RO programs, is insufficient for present and future excellence in the practice of RO.

Before 1993, the Accreditation Council for Graduate Medical Education (ACGME) had a single residency review committee responsible for both DR and RO. Requirements for diagnostic imaging training in the RO program requirements, as promulgated by that final joint committee, specified only “access to diagnostic radiology,” without further details regarding actual curriculum content (2). Since 1993, accreditation of training programs and specification of program requirements for RO became the responsibility of the RO review committee. Some of this committee’s requirements, such as the precise number of new patients, simulations, and pediatric patients that each trainee must see before completing accredited training and becoming eligible for initial ABR certification in RO, have been prescribed specifically. By contrast, program requirements for training in all aspects of diagnostic imaging have no such specificity, indicating only that “the program must educate resident physicians in adult medical oncology, pediatric medical oncology, oncologic pathology, and diagnostic imaging in a way that is applicable to the practice of radiation oncology,” and that “there are multiple ways to meet this requirement,” including to “provide . . . a one-month rotation in both oncologic pathology and diagnostic imaging. . . ” (3). Practicing radiation oncologists routinely rely on pathology reports and must understand the elements and nuances of these reports, but many will rarely review actual pathology specimens for patient care decision making. Alternatively, information obtained from diagnostic imaging procedures is, by definition, integral to the routine daily practice of image-guided radiation therapy and all other RO procedures, except for clinical placement of cutaneous applicators.

The ABR is progressively adding additional image-related questions to its qualifying (computer-based) examinations in clinical oncology, including definition of normal and pathologic anatomy and appropriateness of provided treatment contours. As has always been the case, the certifying (oral) examinations are case management based, with an emphasis on image evaluation for usefulness in decision making regarding extent of disease, treatment planning, and follow-up. Over many exam administration cycles, it has become apparent that poor performance in both sets of examinations is often related to a lack of adequate knowledge in interpretation of the provided images, poor understanding of the added or diminished value of additional images and/or studies, and an inability to recognize differences in how images and disease processes are visualized based on technical alterations of the images employed for decision making. This concern has been previously noted (4).

In the absence of a formal ACGME-mandated curriculum for education in imaging for RO trainees, and with a clear understanding of the essential need for this knowledge set, the ABR plans to continue to add suitable imaging content to its RO qualifying and certifying examinations and the maintenance of certification online longitudinal assessment tool currently under development, with an anticipated rollout in 2020. In effect the ABR, through changing the examination, is, of necessity, forcing a change in curriculum. That being the case, we believe it is reasonable for appropriate stakeholders to develop a basic curriculum in the imaging education and skills necessary for radiation oncologists. The additional curriculum content could be provided in a variety of ways: (1) Requirements could be met by institutionally or departmentally established conferences, rotations, and/or didactic programming that would include all elements of the curriculum. (2) Regional teaching programs that would provide elements of the curriculum less likely to be available in smaller program-sponsoring institutions could be developed through shared resources. (3) “Crash courses” or “boot camps” could be established in centralized locations to provide some or all curriculum elements. This model has been tested for RO trainees in Canada with significant success (5). A similar model has been employed in the United States since the early 1960s to train DR residents in various aspects of their specialty not always easily available in their host institutions. Initially, these courses were provided at the Armed Forces Institute of Pathology, but following a reduction in federal funding for the institute’s programs, the course venue was shifted to the American College of Radiology through the American Institute for Radiologic Pathology. More than 20,000 DR residents have received training in the pathologic-radiographic imaging aspects of their discipline in this manner (6, 7).

Any of the approaches noted above could be undertaken by individual programs or groups of programs without ACGME-accredited program implications or external approvals. Other, more innovative programming, such as a side-by-side DR/RO oncologic imaging rotation with a shared core curriculum, could also be developed for cross-fertilization in both disciplines and offered in the postgraduate year 2 or 3 years. Even more intriguing would be to consider the more distant future of our specialty and try to imagine what the needs of our patients might be in the 2030s and 40s. No clinical specialty can be frozen in time; all experience evolutionary change, enlarging, branching, or shrinking. Interventional cardiology now eclipses cardiac surgery, a formerly dominant surgical specialty. Dermatology has evolved into a cosmetic as well as a clinical medical specialty. Primary care has demonstrated a progressive decline over the past several decades. It seems likely, if not inevitable, that some degree of merging between RO and other imaging and/or oncology-based specialties will, over time, evolve. This merging will not be a “top-down” revolution led by the ACGME, ABR, or the Society of Chairs of Academic Radiation Oncology Programs but rather a “bottom-up” revolution led by larger, more flexible programs, which will develop creative pilot approaches. The less creative or impractical pilots will wither, while those that fill a real need will thrive and propagate, to be later adopted into the mainstream. These changes could begin with cross-discipline fellowships, such as brachytherapy for interventional radiologists, interventional radiology for ROs, or nuclear medicine or cross-sectional scanning for ROs. These would not bring full certification but would allow practitioners to widen their practice and “diversify their portfolios.” Prostate brachytherapists, for example, might practice better if they could insert their own fiducial markers and spacers, offer additional energy-delivery treatments such as high-intensity focused ultrasound or cryosurgery, and biopsy the glands to assess response. Cross-fellowships would represent a “blurring of the edges” between specialties and a recognition that contemporary practice takes place in disease-focused cancer centers rather than “siloed” departments.

True hybrid programs blending DR and RO training from the outset have also been proposed. RO clinical training can now be completed in 27 months, as on the Holman Research Pathway. It may thus be possible to couple this abbreviated training with perhaps 16 months of nuclear medicine, or the 3-year DR core training, the latter being “thinned out” to reduce nononcologic work. It will be for individual institutions to define the program details and make proposals, but careful coordination with the ACGME and ABR will be essential to ensure that all requirements for program accreditation by the ACGME are met and that all requirements for initial certification eligibility by the ABR are fulfilled. Overall training would, of necessity, be longer than that for nonhybrid specialists, and it is likely that initially this will be a path chosen by few but with the potential for future growth.

Many RO programs have 6 or fewer residents, and we recognize that these suggestions may represent a challenge to many departments; some might even see them as an existential threat. Additional time devoted to image-based training may, realistically, require a reduction in time commitments to other elements of the program, entail time away from RO clinical responsibilities, likely produce a need for some time away from host programs, necessitate dialogue and rapprochement with our DR colleagues, and, potentially, add some cost to host departments. In an era of reliance on image guidance for RO, how can we not accept these modest burdens? A question beyond the scope of this editorial, but of intense current discussion and consideration, is how advances in artificial intelligence might affect any of these issues, and indeed, the clinical practices of RO and DR. A recent conference sponsored by the National Academies of Sciences, Engineering, and Medicine suggested that greater collaboration would also benefit DR trainees and providers (8). As did Ruth and Naomi, we must once again walk together.
 
Wow, very well written piece and that is what I expect from these guys. Let me translate:
We agree that it is totally obvious that the future is really f---ed up here because of continued residency expansion, bundling/hypofractionation/hospital consolidation. (Let's do the math together: less than one linac per 100,000 people, 15 patients, 200 residents a year, a 40-year career, 330 million population.) So, let's try to adapt and shut down some programs and give our trainees skills other than external beam radiation so they can remain employed. While it may be logical to integrate with medical oncology, we are under the same board as radiology, so it would be more practical to fold some diagnostic or IR skills into the field. In the meantime, places like UPenn, which has the largest number of patients on beam: why should they have fewer residency spots than MSKCC or MDACC? Better add more.
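The parenthetical arithmetic above can be made explicit. A minimal sketch, taking the post's figures (roughly 200 graduates a year, 40-year careers, one linac per 100,000 people, 330 million population) at face value; these are the poster's assumptions, not verified data:

```python
# Back-of-envelope workforce math using the figures quoted in the post above
# (illustrative assumptions, not verified data).
population = 330_000_000
people_per_linac = 100_000        # "less than one linac / 100,000 people"
grads_per_year = 200              # "200 residents a year"
career_years = 40                 # "40 year career"

linacs = population / people_per_linac            # ~3,300 machines nationwide
steady_state_ros = grads_per_year * career_years  # ~8,000 practicing ROs at steady state

print(linacs)                      # 3300.0
print(steady_state_ros)            # 8000
print(steady_state_ros / linacs)   # ~2.4 radiation oncologists per linac
```

On those assumptions, steady-state supply works out to more than two radiation oncologists per machine, which is the oversupply concern the post is gesturing at.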

I guess it is gratifying that there is some confirmation that not everything is fine, and maybe the "haters" have a point?
 
Many RO programs have 6 or fewer residents, and we recognize that these suggestions may represent a challenge to many departments; some might even see them as an existential threat.

Paul Wallner can't stop talking about programs with 6 or fewer residents. Wonder if he ever thought that medical students that are the best at standardized exams tend to go to programs that have higher number of residents as these are more "prestigious" programs?

It always amazes me how radiation oncologists can dissect a clinical trial line by line and find flaw after flaw but lose all critical thinking ability when it comes to supporting a theory that they hold close to their heart.

The performance gap between small and large programs will widen significantly over the next few years. Small programs will be the first hit by the declining pool of applicants. They will get more of the SOAP students and students that may have self-selected out in previous years. It will take a few years longer for the big name, large resident number programs to feel the effect.

It was an effective strategic decision to prove his point to those of us who cannot think critically. Unfortunately, in the process he helped hasten the decrease in competitiveness of this specialty. The tidal wave of negative sentiment about this specialty significantly increased after this fiasco and will significantly affect applications for the next cycle.

Despite the uproar about this exam, there have been no meaningful changes. Residents are still not even sure what to study. Future applicants are aware of this lack of action as well.

Academia has proven that they cannot solve problems caused by academia. The cure is often worse than the disease.
 
Wallner is regulating the market. Now he can hire a radiation oncologist at 21st century for 60k. Price of an oncologist just went down.

 
So scandalous and true. There are no problem solvers in “leadership.” It's crazy that the next exams are approaching in 6 months and people have no idea what to study. The case seems to be closed with no resolution.
 
Well, HAS anything happened? ARRO, it has been a few weeks since your last encouraging comments. Any news you can update us with?
 
Got this e-mail today. Overall, I think it's a step in the right direction. I think the concept of potentially allowing PGY-3s to crank out rad bio/physics is good (why force it only on PGY-4s?), but it only serves to further highlight how these board exams are more of a check box than anything useful, if we're saying that somebody with less than 2 years in Rad Onc residency can take a board exam that defines 'clinical competence' per the ABR's mission.

Second Letter from ARRO to ABR: letter summarizing this discussion
Second ABR response to the above letter: ABR's reply

For anyone who can't open the links, major content (in its entirety) found below:

Relevant bits from letter from ARRO to ABR:
1. Eligibility for the Examinations
• Dr. Wallner agreed to propose to the board to expand eligibility to sit for the radiation biology and physics qualifying examinations to include PGY-3 and PGY-4 residents. This would allow for program directors and residents to agree upon the best time for a candidate to sit for these examinations. This would be particularly helpful for the programs that alternate didactic courses yearly. In addition, should a candidate fail an examination, it would be possible for the resident to retake the examination a subsequent year without conflicting with the qualifying clinical exam taken after graduation.
2. Providing Feedback to Examinees
• ARRO has heard from numerous residents that they are unclear how to better prepare for the re-examinations. Dr. Wallner and Dr. Kachnic both agreed to provide more granular feedback regarding examination performance in specific topics to examinees and program directors. We believe this information is critical in order for residents and educators to understand the knowledge gaps that may be present regarding what is considered to be “foundational knowledge.”1
3. Transparency and Statistical Validity
• We appreciate the ABR’s willingness to provide greater transparency regarding examination development and scoring. Both Dr. Wallner and Dr. Kachnic stated they would release data of examinee performance on both the validated “old” questions and the “new” questions utilized in this year’s qualifying examinations subgrouped by residents who passed and failed the overall exam.

ABR's response to ARRO:
Development of core curricula for Physics and Radiation and Cancer Biology: The ABR has no responsibility or authority to develop curriculum and does not provide educational content, but we certainly have a critical interest in this issue. At an ASTRO-ABR Leadership meeting in San Antonio an agreement was reached that ASTRO would convene a panel to update the existing Physics Core Curriculum and a second panel to develop a Radiation and Cancer Biology Core Curriculum. These panels will include appropriate stakeholders and we have encouraged ASTRO to include ARRO representatives in the process. Our understanding is that steps to accomplish these goals have begun, with an anticipation of rapid completion and availability. The consensus-developed curricula will be linked on the ABR website.

• Improvement in the ABR basic science study guides: Our Physics and Radiation and Cancer Biology Committee chairs have committed to a timely review of the available guides with an eye to improving their usefulness to candidates by providing greater specificity and granularity to the material. We anticipate completion of that process shortly after completion of the core curriculum development noted above. The revised study guides will be posted in full on the ABR website as will reference sources from which exam items will be developed.
• Timing of administration of the 2019 Physics, Radiation and Cancer Biology and Clinical Oncology qualifying exams: The ABR understands that administering the exams on successive days may increase candidate anxiety, but the current dependence on the nationally disseminated PearsonVUE test centers severely limits flexibility with regard to scheduling. As we indicated in San Antonio, we did approach PearsonVUE for consideration of a greater interval between the exam administration in 2019, and this option was rejected by them. We have rescheduled the 2020 exams to provide two weeks separation between the basic science and clinical exams. The potential to administer all (or selected) qualifying exams at the ABR Test Center in Tucson is being investigated. This centralization of administration under the ABR roof could provide greater flexibility in scheduling, but as we noted, might cause significant hardship to candidates, requiring travel to, and lodging in Tucson. We await further insight from ARRO and its constituents before we proceed any further with consideration of this option.
• Development of greater granularity of basic science categories to optimize ability of candidates to understand their strengths and weaknesses: As we indicated, we are committed to improving candidates' ability to improve performance. We will provide additional granularity in our method of reporting basic science category performance, but as we explained previously, the exam blueprint is such that some of these categories will have so few items as to make analysis of individual performance impossible. Re-classification of the categories will be available on the ABR website before Dec. 31, 2018.

• Analysis of candidate performance on old versus new basic science items: As is always the case, some exam items have been previously used and have available performance statistics. This review is being undertaken and results should be available before Dec. 31, 2018. The results of this analysis will be provided to the respective executive committees through their senior volunteer leaders.


Of course, one final dig at SDN (bold emphasis mine):

We encourage appropriate dialogue to improve transparency of the exam process going forward, but would encourage all ADROP, ARRO and SCAROP constituents to carry out this dialogue through their respective organization representatives. Do not hesitate to contact us if you desire additional clarification.
Sincerely,
Brent J. Wagner, MD
President, ABR Board of Governors
Lisa A Kachnic, MD
Immediate Past President, ABR Board of Governors
 
TL;DR- Meetings, committees, acronyms, self-congratulation, further analysis, hours and hours of lost productivity and no changes or solutions.

Exactly what would be expected from academic radiation oncology.
 
So practically speaking, for us in 2019, what we can expect is:
1. Better study guides
2. References for the sources from which the material will come. This can be really helpful given that a lot of us are feeling in the dark about what to study.
3. Possible option of an early test date in Tucson, which is definitely not ideal but may be helpful.
4. More data from our tests. Can also be helpful in terms of identifying weaknesses.

Not all of us are up to date on when these things are coming out (I didn't know about these letters). Whenever someone finds out something, can you please post it to the group?
Thanks for all of this.
 
Oh, of course this all still means we waste a ton of time studying all this stuff again. Huge waste of my life.
 

Despite the date on the letter being 11/14/18, I just got it in my e-mail inbox today.
 

Sounds like the usual circle jerk. More committee chairs, more honours. No practical solutions. I don't read it as positively as another poster does. There is no specific indication that things will be available or that solutions will be offered to those affected. I wonder, what is a reasonable time to put out the material to allow people to digest it? Six months? A month? A week before the exam? The fact that it is less than a year until the next exam, and they changed the exam without telling people, screwing a whole class, is so rotten!
 
Pretty sad that one man, one old portly man, can cause a massive disruption to the future of our field. I've heard from 4 students who will no longer apply because they think they will fail boards after investing 4 years. What is his term limit? Why does he have no term limit? Who will hold him and Kachnic accountable for jeopardizing the entire field?
 
Guys, it's clear that you are unsafe to treat cancer patients with radiation unless you can recite the biomolecular basis of immunology verbatim. I use this knowledge at least 20 times per day in the clinic. Forget about appropriate target volumes, management of toxicities, and common dose constraints. None of that matters. What really matters is ingrained memorization of every oncogene pathway and the PFS percentages of some random RTOG trial designed 20 years ago that has dubious relevance today.
 
I agree. Good thing we have all those fellowships out there to fill in some of those knowledge gaps. Maybe they should look into adding a rad bio and physics fellowship.
 
Dr. Wallner and Dr. Kachnic,

I know you are reading these messages, so I'd like to address a couple of questions to you directly:

1. Approximately 45% of residents taking physics and bio failed at least one exam this year. Your attempts to minimize this number by saying things like "most residents passed," only reporting the numbers for first-time test takers, and saying that "most residents who failed, failed both exams" are disingenuous. We know that the number of residents who failed at least one exam is not in the 25-30% range, as you are trying to spin it. Statistically, it's in the mid-40s. Why won't you release the detailed examination performance and the true numbers?

2. Do you really believe that ~45% of current PGY-5 residents are not minimally competent in bio and/or physics to safely practice clinically? Do you really believe that your exam content and scoring tests for minimal competence? Do you not trust the ACGME to appropriately accredit residencies? These exams should be formalities. The only people failing these exams should be people who lack fundamental knowledge and don't understand basic things like how to perform a BED calculation or know how to select an appropriate electron energy to treat a given superficial tumor. These exams as written are essentially a memorization contest. Virtually nobody in clinical practice can off the top of their head tell you the difference between DNA ligase III and DNA ligase IV. So why are you testing minutiae like this and using it to prevent people from becoming board certified? This really demands explanation. I believe you have set the passing standard unreasonably high, and I think that anybody looking at this situation objectively would reach the same conclusion.

3. Do you understand, recognize, and/or care about the impact a board exam failure has on current trainees? Your statements to date seem to indicate that this isn't really a big deal and just a minor annoyance. Your lack of empathy is incredibly frustrating. It's a scarlet letter. When we have to explain to our colleagues and employers that we failed a board exam, there is absolutely no way we can convey the unique and complicated context involved in a 2018 exam failure. Allegations and suspicions of incompetence will follow us forever. This is like a judge giving someone who gets a speeding ticket a class 1 misdemeanor and just saying "You know, the reality is that you're not a safe driver. Just pay a $2000 fine and spend a weekend in jail, go to driver re-education next year and get on with your life -- no biggie." Wrong. Your license will be restricted for a year and now you've got a permanent criminal record. The premise for the punishment is fallacious, and the severity of the punishment and collateral consequences is not just.

4. Why do you think it's ok to use recycled test questions? Do you not understand how this engenders cheating in larger programs? Do you not understand how this could influence pass rates at small vs. large programs? Do you not understand how recall use could lead to falsely elevated expectations among test makers regarding what test takers "would know" and lead to creep in question difficulty with each passing year, ultimately increasing failure rates at programs without recalls? Why don't you simply create an exam with new content each year and prevent this problem?

5. Please offer a solution for correcting this other than having to take the exams at the same time as the clinical exam. The fact that this was done in the past isn't really an acceptable answer. Lots of things were done in the past that were less than ideal. We learn and evolve and make the process better as we go along. This is the nature of humanity. Just because something was done in the past doesn't inherently mean that it was ok. Additionally, pass rates in the past were >90%, as you had a more reasonable bar set back then.

Looking forward to, but not expecting, your response.
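For context, the "basic" BED calculation referenced in question 2 really is a one-liner. A minimal sketch using the standard linear-quadratic formula, BED = n · d · (1 + d/(α/β)); the fractionation schemes below are illustrative examples, not from the post:

```python
def bed(n_fractions, dose_per_fraction, alpha_beta):
    """Biologically effective dose under the linear-quadratic model:
    BED = n * d * (1 + d / (alpha/beta))."""
    return n_fractions * dose_per_fraction * (1 + dose_per_fraction / alpha_beta)

# Conventional fractionation: 60 Gy in 30 fractions, alpha/beta = 10 Gy
print(bed(30, 2.0, 10.0))   # 72.0 Gy10
# SBRT-style scheme: 54 Gy in 3 fractions, alpha/beta = 10 Gy
print(bed(3, 18.0, 10.0))   # ~151.2 Gy10
```

This is exactly the level of working knowledge the poster argues an exam of minimal clinical competence should probe, as opposed to memorized minutiae.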
 
I don't really have any personal or emotional stake in this year's tests. But to me it's clear that somewhere along the way the underlying purpose of the physics and biology board exams has been lost. They are not accurately assessing minimal clinical competence.

You can use a well validated process like the Angoff method, all the volunteers can be perfectly competent and well-meaning, etc etc...but a validated process and good intentions does not automatically equate to an acceptable result.

No rational individual can defend that this year's exam accurately reflects the capability of examinees to be in clinical practice. Therefore by definition the implementation of the Angoff method is completely wrong. That's the bottom line and that's what the ABR should be aiming to fix. Better study guides and more detailed score reporting are completely and utterly missing the point.
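For readers unfamiliar with the procedure being criticized, the Angoff method can be sketched in a few lines. The ratings below are hypothetical, purely to show the mechanics: each judge estimates the probability that a minimally competent candidate answers each item correctly, and the cut score is the sum of the per-item averages:

```python
from statistics import mean

# Hypothetical Angoff ratings: for each item, each judge's estimate of the
# probability that a minimally competent candidate answers it correctly
# (toy numbers, not ABR data).
ratings_by_item = {
    "item_1": [0.80, 0.70, 0.90],
    "item_2": [0.50, 0.60, 0.40],
    "item_3": [0.90, 0.85, 0.95],
}

# Cut score = sum over items of the mean judge rating.
cut_score = sum(mean(judge_ratings) for judge_ratings in ratings_by_item.values())
print(round(cut_score, 2))  # 2.2 (out of 3 items needed to pass)
```

This illustrates the poster's point: the process can be run exactly as specified and still yield an unreasonable cut score if the judges' per-item estimates are miscalibrated for the actual candidate population.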
 
Very well said. I also have no personal stake, and I completely agree. I do not think other individuals without a personal stake understand the significance of the fallout over this exam.

There is an old saying "What you fear, you create". Paul Wallner had an irrational concern about declining resident quality. This exam under his watch will unquestionably result in declining resident quality. If some are concerned about the 20% decline in resident applications this cycle they will be shocked by next year's numbers.
 
Agree that exam content is the most serious problem. Theoretically, someone deficient in core rad bio knowledge could have passed because the exam was focused on irrelevant minutiae rather than fundamental key concepts (e.g., someone with a PhD in molecular biology or someone who memorized recalled questions -- although shouldn't our accredited residency programs prevent such a rare incompetent resident from even graduating, rendering these exams pointless?). However, the ABR addressed this by defending the content: the question writers were actual clinicians, and the questions were vetted through a process in which everyone agreed on the degree to which they assessed minimal competence. Again, they are basically washing their hands of the whole thing and saying that because the rules of the process (that they created) were followed, the result is valid.

Paul Wallner has lamented repeatedly that "nobody wants to hear" that small programs have deficiencies in resident quality and teaching. Isn't it funny how he and the ABR don't want to hear that the exam content may not be an appropriate assessment of minimal competence and is unjustly failing competent residents? I mean, they won't even consider the possibility.


That said, it still doesn't mean there aren't legitimate concerns from those of us who are left with no idea how to prepare for the exam and who seek transparency in examinee performance and scoring criteria.

My understanding is that the exams are created 1.5 years in advance. I.e., the 2019 exam had already been generated with passing standards already established before the 2018 exam was even administered. I have little faith they will take any corrective action on exam content for this year.
 
Just heard about the class action against the ABIM for monopolization of initial certification and maintenance of certification. Rings familiar....
 
Is that wishful/hopeful speculation, or is it coming from somewhere?
 
Can someone send out information on the class action lawsuit against the ABIM? Are there any articles on this? I would be interested in knowing what the specifics are, what law firm is carrying it out, etc.
 