So I always thought that becoming a doctor meant becoming a 'professional.' But as I've started to notice, everyone is a 'professional.' People in all kinds of jobs call themselves professionals: professional writer, professional pest inspector, professional caregiver. And the definition at dictionary.com seems to confirm this. I guess it takes away from what I perceived as the professional world. Or am I being elitist? What do you all think about the word 'professional' as it applies to employment?