MacGyver said:
The AMA has virtually NO CONTROL over the creation of new med schools. As long as a proposed med school meets the established LCME criteria, the AMA doesn't have anything to say on the matter.
The LCME and AMA are out of the loop. These med schools are going forward, and the AMA has no power to stop any creation or expansion of med schools.
That may be true today, but in its infancy in the US, the AMA did indeed shape medical training, incorporating the Hopkins/Osler philosophy and ideology. Note that this took place about 120 years ago, in the late 19th century. At that time there were many medical schools in the US and no licensing requirements in the states and territories: anyone who could afford to pay for the education could get a seat in one of the many medical schools of the day. Which brings up another question: who controls the LCME? When AUC wanted to open a domestic campus in Cheyenne, the LCME shot them down. Wyoming has very little political clout and no medical school. The LCME is decidedly NOT out of the loop, since if you do not graduate from an LCME-accredited school, you are not eligible to sit for the USMLE in most states. You have to go through the ECFMG, and they require government recognition.
The AMA pushed, first in the Eastern states, for the creation of licensing for physicians. Once several of the states established licensing requirements, it pushed, around the end of the 19th century, for accrediting medical schools, choosing the Hopkins/Osler model as its template. Then, once a few colleges were "accredited," the AMA lobbied with increasing effectiveness to limit licensing of physicians in the states and territories to graduates of "accredited" schools. This created a major conflict between the non-AMA and the AMA medical schools, and many of the non-AMA schools disappeared rapidly. It also created the major rift between osteopaths and allopaths that persists to this day.
As the AMA schools, by virtue of the licensing laws, began to exert increasing influence, the non-AMA schools tried to obtain accreditation but were denied. By 1910, the AMA was bold enough to say it had to limit seats not only for quality of education but to ensure that the market was not flooded and wages diluted. Then came internships, then residencies of increasing length. Residencies and residency positions were not paid for by Medicare for many years; they were paid for by the hospitals.
In the 1940s, during WW-II, there was a wage freeze on all workers in the US. Industry, in an attempt to attract talented workers, began offering "fringe benefits" outside of wages. They were called fringe benefits because they operated at the fringes of the wage-price controls in effect during the war. After the war, these benefits, including paid vacation and notably health insurance, became the expected norm in addition to salary.
In the late '50s and early '60s, insurance companies originally paid claims to policyholders, not doctors. Then Blue Cross came up with the idea of paying doctors directly, to avoid the delay a policyholder experienced between paying the doctor and being reimbursed by the insurance company. About a decade later, the insurance companies realized that since they now paid docs directly, they could negotiate better rates with individual doctors, and thus began the insurance company practice of discounting. Since they no longer had to deal with irate policyholders wondering why the "reasonable and customary" rate paid a doc only a third of what he billed, and since docs didn't want to alienate their longstanding patients, the insurers got away with, and are still getting away with, increasingly outrageous practices.
In the mid-1960s, Lyndon Johnson proposed a massive social program called the Great Society, which ushered in Medicare. Initially, Medicare did not pay for any residency training. As more people became beneficiaries, Medicare faced an increasing disparity between revenue collected as part of the Social Security tax (the Federal Insurance Contributions Act, FICA) and its expenses, and faced the real prospect that the underfunded program would have to cut costs. So it capped payment rates to hospitals, which helped some and staved off the more serious problems until the end of the 1970s. Medicare Part B, the physician payment benefit, came along later, and doctors who participated agreed to a cap on payments for services.
Prior to this, hospitals and doctors had in many cases provided much of the charity care for indigents as overhead. As Medicare began paying hospitals less, hospitals began shifting the costs of indigent care and Medicare underpayments to those with private insurance and to cash payers. The insurance companies quickly saw that this was a problem and balked, responding by offering hospitals increasingly smaller fractions of their increasingly larger bills.
This brought about the DRG scheme of the mid-1980s. Under the DRG system, Medicare pays only for a diagnosis-related group, at a flat fee: say, $3,000 for an uncomplicated pregnancy, covering prenatal care, imaging, and L&D. If your costs went over that, too bad; if you came in under it, you made a profit. This ushered in another form of health insurance, the HMOs. They worked on a similar scheme: a capitated payment of $X per covered life, with the physicians (usually the primary care docs) responsible for covering all of the medical costs out of that pool. If you spent less, you profited; if you spent more, you lost money.
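To make the incentive explicit, here is a minimal sketch of the flat-fee arithmetic behind both schemes. The $3,000 DRG figure comes from the example above; every other dollar amount is an illustrative assumption, not a real rate:

```python
# Flat-fee payment arithmetic under DRGs and capitation.
# All dollar figures except the $3,000 DRG fee are illustrative assumptions.

def drg_margin(flat_fee, actual_cost):
    """Hospital profit (+) or loss (-) on one diagnosis-related group."""
    return flat_fee - actual_cost

def capitation_margin(annual_rate_per_member, members, total_medical_costs):
    """Physician pool profit (+) or loss (-) under capitated payment."""
    return annual_rate_per_member * members - total_medical_costs

# An uncomplicated pregnancy reimbursed at a flat $3,000:
print(drg_margin(3000, 2400))   # costs under the fee: profit of 600
print(drg_margin(3000, 3500))   # costs over the fee: loss of 500

# A primary-care pool paid an assumed $480/covered life/year for 1,000 lives:
print(capitation_margin(480, 1000, 430_000))  # spends less than the pool: 50000
```

Either way, the payer's risk is shifted onto the provider: the margin depends entirely on keeping actual costs under the fixed payment.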
Hospitals adapted by cutting costs and cutting charity care. Teaching hospitals argued that since they could no longer cost-shift, residency training had become a severe expense, and that it wasn't fair that they couldn't charge for this hidden "cost" in the DRGs. Congress responded by allowing teaching hospitals to recoup the "costs of teaching," which turned out to be a big boon to teaching hospitals and universities, and thus began the era of Medicare paying for residency positions.
The specialty societies began to worry about competition and its ability to drive down their members' wages, so accreditation and board certification became the norm over the past 100 years or so. Today, without board certification it is difficult or impossible to earn a living in medicine. There are only so many seats in medical schools and in residencies, and the ABMS/COTH/LCME/ACGME all have a substantial say in how many programs, how many seats, and how many years. And the programs can destroy a resident on a whim, if they are so inclined.
So, what if the seats were available? What if, instead of Medicare paying for residency positions, residents paid for their spots? Do you think the cap on seats would be raised? What is the true cost of a residency position, if you assign a fair market value to the labor a resident produces for a hospital?
And so, with that background, moving on to your next question:
MacGyver said:
One adjustment is that Medicare would go bankrupt trying to fund unlimited residency positions at 100k per resident per year. Politicians would never let that stand.
Why does Medicare pay for it at all? IF a residency program has intrinsic value, let those who will profit from it make the investment. IF society as a whole profits by it, then yes, the government (i.e., we the people) should indeed fund a portion or all of it. But controlling the number of seats is a "managed" economy, reflective of the failed economies of Eastern Europe and the USSR. The specialty guilds (boards) keep a tight grip on the number of residency seats because it insulates them from the ravages of the market.
If there is intrinsic public benefit, let Medicare pay for some positions, or a portion of all positions, and why not put any additional positions up for grabs to the highest bidders? IF there were not sporadic shortages of physicians and specialists, would not Medicare ultimately pay less for services? And wouldn't that benefit society? (Except for those members of society who are cardiothoracic surgeons.)
If a resident truly wanted to be a dermatologist, and all the "free" spots were taken, and the resident was willing to borrow enough money to buy a spot, knowing that his $200k/year salary would pay it back in short order, why not let him? Of course, salaries would then drop until a cost/earnings equilibrium point was reached, which might be 40% less than as things stand now.
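A rough amortization sketch shows why the payback could indeed be quick. The slot price, interest rate, and payment fraction are all assumed; only the $200k salary comes from the scenario above:

```python
# How long to pay off a purchased residency slot out of salary?
# Assumptions: a $200k price for the slot, 7% interest, and $60k/year
# (about 30% of the $200k salary cited above) devoted to repayment.

principal = 200_000
rate = 0.07
annual_payment = 60_000

years = 0
balance = principal
while balance > 0:
    balance = balance * (1 + rate) - annual_payment  # accrue interest, then pay
    years += 1

print(years)  # 4: paid off in four years under these assumptions
```

Even doubling the assumed price roughly doubles the payback period, which is still short relative to a career, so the borrowing-for-a-spot scenario is at least arithmetically plausible.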
The hospitals lobbied for compensation for residency programs as a result of the revenue shifting brought on by the DRGs and other later prospective payment systems, and they cleaned up.
MacGyver said:
Doubtful. Medicine is not a free market. There is no competition based on price. Medicare sets the de facto industry standard reimbursement, because it controls 50% of all healthcare dollars spent in the US. If medicine was a pure free market, then yes salaries would drop. But as it stands, government bureaucrats set salary rates, so there is no competition based on price. The medicare reimbursement is the same whether there are 10 doctors or 10 million doctors.
And why is that? We've briefly tried a "managed" economy at various times in the past, during WW-II and the energy shocks of the '70s, and people always found a way to let the market work in the consumer marketplace. Even in the USSR and China, underground economies have worked around the managed economies of their day.
Remember, Franklin said, "Our republic will endure only so long as our politicians do not discover that the people can be bribed with their own money" (at least I think it was Franklin). In this case, they are bribing the people with our money.