I was pretty set on dentistry, but reading articles about the real possibility of corporate dentistry taking over makes my stomach churn. I wanna be my own boss. This has already happened in fields like optometry, where Walmart optometry, Costco optometry, etc. have popped up everywhere, which in my opinion has hurt the profession. What are your thoughts on this? Do you think dentistry is still a lucrative and viable career option? Honestly, reading about this makes me want to change careers. What career would you suggest that has good pay, autonomy, and work-life balance?