FoughtFyr said:
Really, an important study? That many ignore? It shows that of thirteen readers, five MDs and eight DCs, interrater agreement was as low as 0.44 (among the five chiropractors not identified as chiropractic radiologists). Now, a kappa of 1.0 means complete agreement, and they scored a 0.44; and since kappa measures agreement beyond what chance alone would produce, a 0.44 is only moderate agreement, even weaker than the raw "agreed on 44% of the films" it is sometimes misread as. Second, look at the study design itself: 13 people looked at 300 x-rays to detect an abnormality (present in 50 films). So what! I have said that I do not necessarily doubt a DC's skill in NMS conditions, but I do in non-NMS conditions. Besides, with thirteen participants, I really question the power of the study.
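To make the kappa point concrete, here is a minimal sketch of Cohen's kappa for two raters. The film ratings below are hypothetical, not counts from the study; the point is only that kappa discounts chance agreement, so it is not the same thing as a raw percentage of films agreed on.

```python
def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters' categorical ratings."""
    n = len(ratings_a)
    categories = set(ratings_a) | set(ratings_b)
    # Observed agreement: fraction of items on which the raters match.
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Expected chance agreement from each rater's marginal frequencies.
    p_e = sum((ratings_a.count(c) / n) * (ratings_b.count(c) / n)
              for c in categories)
    return (p_o - p_e) / (1 - p_e)

# Two hypothetical readers rate 10 films: 1 = abnormal, 0 = normal.
reader1 = [1, 0, 0, 1, 0, 0, 1, 0, 0, 0]
reader2 = [1, 0, 0, 0, 0, 1, 1, 0, 0, 0]
print(round(cohens_kappa(reader1, reader2), 2))  # 0.52
```

Note that the two readers here agree on 8 of 10 films (80% raw agreement), yet kappa comes out at only 0.52 once chance agreement is discounted, which is why a kappa of 0.44 signals even weaker agreement than "44% of the films" would suggest.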
And lastly, let's look at some conclusions here. "The intraobserver agreement showed mean kappas of 0.58, 0.68, and 0.72, respectively. The difference between the chiropractic radiologists and medical radiologists was not significant. However, there was a difference between the chiropractors and the other professional groups." {emphasis added} "The medical radiologists were more specific than the others." "Good professional relationships between the professions are recommended to facilitate interprofessional consultation in case of doubt by the chiropractors." {emphasis added} Yep Rooster, this is an important study that we ignore. The AMA now has a hit out on you for bringing it to our attention!
- H
A companion study:
Spine. 1995 May 15;20(10):1147-53; discussion 1154.
Interpretation of abnormal lumbosacral spine radiographs. A test comparing students, clinicians, radiology residents, and radiologists in medicine and chiropractic.
Taylor JA, Clopton P, Bosch E, Miller KA, Marcelis S.
Department of Radiology, University of California, Medical Center, San Diego, USA.
STUDY DESIGN. Controlled comparison of radiographic interpretive performance based on training and experience.
OBJECTIVES. This study compared each of these groups in medicine and chiropractic by testing abilities to interpret abnormal plain film radiographs of the lumbosacral spine and pelvis.
SUMMARY OF BACKGROUND DATA. Low back pain is a common and costly problem that is evaluated and treated primarily by medical physicians, orthopedists, and chiropractors. Although radiology is used extensively in patients with low back pain, the radiographic interpretations of students, clinicians, radiology residents, and radiologists have never been compared.
METHODS. Four hundred ninety-six eligible volunteers from nine target groups completed a test of radiographic interpretation consisting of nineteen cases with clinically important radiographic findings. The nine groups included 22 medical students, 183 chiropractic students, 27 medical radiology residents, 13 chiropractic radiology residents, 66 medical clinicians (including 12 general practice physicians, 25 orthopedic surgeons, 21 orthopedic residents, and 8 rheumatologists), 46 chiropractic clinicians, 48 general medical radiologists, 55 chiropractic radiologists, and 36 skeletal radiologists and fellows.
RESULTS. The test established a high level of internal consistency reliability (0.880) and revealed that, in the interpretation of abnormal plain film radiographs of the lumbosacral spine and pelvis, significant differences were found among professional groups (P < 0.0001). Post hoc tests (P < 0.05) revealed that skeletal radiologists achieved significantly higher test results than did all other medical groups; that the test results of general medical radiologists and medical radiology residents were significantly higher than those of medical clinicians; that the test results of medical students were significantly poorer than those of all other medical groups; that the performance of chiropractic radiologists and chiropractic radiology residents was significantly higher than that of chiropractic clinicians and chiropractic students; that no significant differences were revealed in the mean performance of chiropractic clinicians and chiropractic students; that the test results of chiropractic radiologists, chiropractic radiology residents, and chiropractic students were significantly higher than those of the corresponding medical categories (general medical radiologists, medical radiology residents, and medical students, respectively); that no significant difference in test results was identified between chiropractic radiologists and skeletal radiologists or between chiropractic and medical clinicians; and that the length of time in practice for clinicians and radiologists was not a significant factor in the test results.
CONCLUSIONS. These data demonstrate a substantial increase in test results of all radiologists and radiology residents when compared to students and clinicians in both medicine and chiropractic related to the interpretation of abnormal radiographs of the lumbosacral spine and pelvis. Furthermore, the study reinforces the need for radiologic specialists to reduce missed diagnoses, misdiagnoses, and medicolegal complications.
PMID: 7638657 [PubMed - indexed for MEDLINE]
The diagnosis of what is causing back pain is often made by interpreting x-rays. But the type of health care provider interpreting those films may have a great influence on what is found (or missed). When various providers were tested for the ability to interpret abnormal lumbosacral spine radiographs, here is how they placed:
[Figure: horizontal bar graphs, drawn as x-ray graphics in different colors, showing average correct scores by provider group]
Provider Group Average correct
Chiropractic Radiologist - 71%
Skeletal Radiologist & Fellows - 70.18%
Chiropractic Radiology Residents - 61.54%
General Medical Radiologists - 51.64%
Medical Radiology Residents - 44.64%
Medical Clinicians - 31.26%
Chiropractic Clinicians - 28.38%
Chiropractic Students - 20.45%
Medical Students - 5.74%
While one would expect some difference between the provider groups, the gap is striking: the radiologist groups averaged well over twice the scores of the clinician and student groups.
----------------------------- SOURCE: Taylor JAM, et al. Interpretation of abnormal lumbosacral spine radiographs. Spine 1995;20:1147-1154.
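A rough arithmetic check of that gap, using the percentages listed above. The split into "radiologist" and "clinician/student" lists is my own grouping for illustration, not one made by the study.

```python
# Average scores from the table above: radiologist groups (including
# residents) versus clinician and student groups.
radiologist_groups = [71.0, 70.18, 61.54, 51.64, 44.64]
clinician_student_groups = [31.26, 28.38, 20.45, 5.74]

mean_rad = sum(radiologist_groups) / len(radiologist_groups)
mean_other = sum(clinician_student_groups) / len(clinician_student_groups)
print(round(mean_rad, 1), round(mean_other, 1))  # 59.8 21.5
print(round(mean_rad / mean_other, 1))           # 2.8
```

So on these numbers the radiologist groups averaged roughly 60% correct against about 21% for clinicians and students, a factor of nearly three rather than merely two.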