I'm looking for some good books (fiction or non-fiction) that capture the nature and spirit of the medical profession. More specifically, books that show why doctors love the career and, in turn, what sort of people should consider a career in medicine.
EDIT: "this is what it's like to be a doctor" books would be great.
– Thanks!