Not saying that it is, just asking. Here are some reasons why I sometimes think so:

1. All doctors (other than surgeons) seem to do is push pills to patients. Taking a history, doing exams, etc. is basically all done to figure out which medicine to prescribe.

2. Drug companies throw money at doctors, trying to sway them toward their drug. Why is it that college athletes get punished for accepting a few gifts here and there, but it's fine for doctors to get free basketball tickets paid for by drug companies?

3. Doctors don't really know if a medicine will work. It's like, "Here, take this and tell me how you feel later. If something bad happens, I'll give you a different pill."

4. When none of the drugs work, doctors just send patients to surgeons to do the real work.

I admit these reasons aren't well thought out and may be wrong, but when I asked my older sibling (who is a physician) whether medicine is a scam, she tended to agree that, in a way, it is. What are your thoughts?