I've read some threads where people urge pre-meds to learn more about the negative aspects of medicine in America. Does anybody have suggested readings that really show what you'll have to deal with as a physician? (I've seen the recommended reading list, but I could only spot a few books on this topic.)