Why all the sudden concern about public health in the US? (General)
The subject line says it all.
US ranks last in healthcare among 11 wealthiest countries despite spending most
Millions of Americans don't have health insurance and can't get basic health care. Millions more have health insurance that's essentially useless. The last time I lived in the US, I paid something like $650/month for a policy with a $6,500 deductible for each of us, myself and my spouse. $6,500. That means the policy, which I was paying nearly $8,000/year for, paid nothing until I had paid $6,500 out of pocket, and that doesn't count whatever my spouse may have had to pay toward a separate $6,500 deductible.
And I was only making $60K/year! Under that plan it was entirely possible we'd have to shell out about $21,000/year before the insurance even kicked in: roughly $7,800 in premiums plus a $6,500 deductible for each of us, and even past that point it only covered part of each bill. That's roughly 50% of my take-home pay.
So, basically, we didn't go see doctors, because every visit came out of our own pockets. For everyday care the insurance was useless; it was only there for catastrophic events.
And yet, now, the government seems to care so much about our health. Why is that? Why is it now so important that I get vaccinated for the well-being of myself, my family, and my community? If I fear I have cancer or a chronic ailment such as high blood pressure or diabetes, why doesn't the government care at all? Those ailments are far deadlier than COVID.
But nobody is asking these questions in the media.