Don't even worry about this. What you learn in undergrad will most likely not help you much. Plenty of people open and run businesses without any formal education. Besides, I don't think I know a single business major who graduated from college knowing how to open, let alone manage, a business. It's something you'll learn on your own, and real-world practice is far different from what you'll read in a textbook. Also, most dentists hire managers to run the business side so they don't have to deal with all the headaches like coworker drama, etc.