As you know, people on both sides are complaining. Doctors complain that people treat them as though they're obligated to treat every patient regardless of the cost to the doctor. They feel that people go to the ER to take advantage of "free care", and some people even think doctors should treat them for free because "doctors shouldn't be in it for the money, they should want to help people". Likewise, many patients complain that medicine is becoming more and more of a business: that doctors and hospitals turn away unprofitable patients, and that healthcare should be free or paid for by taxes, like the fire department or police.
So, opinions?