Global Healthcare
2 April 2012
The United States Healthcare System

“Healthcare” is defined in the dictionary as the field concerned with the maintenance or restoration of the health of the body or mind: the diagnosis, treatment, and prevention of disease, illness, injury, and other physical and mental impairments in humans. Most countries around the world that have a healthcare system have decided that healthcare is a “right” and that everyone should have it, no matter what. In the United States, that has not been the case; our system of healthcare has been one of privilege. If you have a job, it usually comes with the benefit of health insurance to cover trips to the doctor’s office or hospital. Health insurance was originally devised to cover catastrophic events such as hospitalizations or car accidents. Over time it has evolved, and today it is expected to cover the day-to-day maintenance of health as well. With advancing technology and expensive medications, it is almost impossible to get by without some sort of health insurance. If you don’t have health insurance and something bad happens to you, does the emergency room turn you away? The answer is no. But who pays for that person’s $2,000 overnight stay? The answer is all of us who are insured. That leaves Americans with the question: should healthcare in the U.S. be a right or a privilege? Though I consider myself mostly neutral, I believe that healthcare in this country should be considered a privilege.

First, let’s look at the rights that all humans enjoy today: free speech, free religion, free peaceful assembly, and so on. These are things that humans are born with. We are not granted these rights; we simply have them. Corrupt governments can take them away, but nobody can just “give out” rights. Overall, rights are inherent within individuals; they do not involve the goods or services of other people. In my opinion, the American public is spoiled and needs to start recognizing