In preparation for interviews this fall, I've been asked this question: What role should doctors play in health care reform? I've read a few articles and seen opinions all over the map, but I still feel a bit lost. How would you approach that question? Are there pitfalls to being too political in your answer? Can medical professionals actually steer the national debate in a meaningful way? Do hospitals, HMOs, and companies have strict protocols regarding activism?
Thank you.