I was wondering if there are any dental schools that pride themselves on their graduates working in rural areas. I know that some Midwestern medical schools have rural seminars, and I have been told they may even favor applicants from rural backgrounds who intend to return to rural areas. Are there any dental schools that do similar things? I am a small-town kid who fully intends to return, so I was just wondering.
