Is it really that bad? Are we really all looking down the barrel of a rifle? Or are we all kind of exaggerating the severity of healthcare / America's future? I'm referring to the overall negative and hum-drum tone nearly everyone has about the future of anything in this country or the health field. What's the deal!? Why is everyone acting like it's the end of the world and there's just no reason to pursue anything, what with the student loan bubble, the lack of quality education, the impending government downfall, the declining American dollar, our debt ceiling? Isn't there always negative stuff like this to deal with? And just because we're now the generation dealing with it, it feels worse than ever, but isn't that just because it's 'us' dealing with it now? Everything always seems worse when it happens to you. We can look at starvation and feel sad about it, but if it happens to us personally, it's an epidemic. Aren't we all being a little over the top? Maybe not... ?