Was having a conversation with my friend about this the other day, so I was curious what other people thought.
Even though derm is one of the most competitive fields to get into, my impression is that it's one of the least "respected" fields in medicine among the general public.
Do you think most of the public even knows that dermatologists are doctors? I wouldn't be surprised if some people thought of them the same way they think of a dentist.
What are your thoughts? How do you think they are thought of?