I apologize if this is not the right place for this post, but I was wondering if there are any actual doctors here who graduated and are happy with their jobs. I recently (within the past year) decided to go to medical school. I have a degree in business and have been in the workforce for the past few years, but I decided to pursue medicine for a number of different reasons. Since I don't have any friends who are doctors, I started doing a lot of research online to find out what the job is really like.

I understand all the downsides of the job (or I think I do): the crazy hours, the bureaucracy, the faulty system, unfriendly peers, ungrateful patients, insurance companies, etc. In fact, about 95% of what I read online is from doctors who hate their lives, hate their jobs, and feel stuck in the only career that will pay off their immense loans. I've started getting so depressed reading all these negative comments on the internet!

My question is: are there any doctors out there who actually love their job and are happy with their decision to go to medical school? If so, what field are you in, and how long have you been out of medical school?
Thank you!!