Insurance does nothing to actually ensure your health. It is intended to protect your wallet. A better way to protect your health is to take care of yourself so that you stay healthy. The discussion in the US about "the healthcare crisis" always focuses on the financial piece. But there are differences between Europe and the US that impact health and have nothing to do with who pays the medical bills: Europe is generally more pedestrian-friendly, it has a different food culture, and so on. All of those things impact health. I get tired of seeing "health insurance" held up as a) the only meaningful difference in health costs between the US and Europe and b) the only important lever in the US debate over how to control medical costs.
That's why I bring it up: because to me the two things are related.