This 1922 photograph shows an anesthesiologist preparing a patient for surgery. As medicine has evolved and improved over the years, costs have risen. (Source: U.S. Library of Congress)
Young Gordon Isaacs was the first patient treated with a medical linear accelerator, in 1957, for a tumor in his right eye. Cancer had previously taken his left eye. (Source: National Cancer Institute/Stanford University)
(RNN) - The practice of medicine became a profession in the 19th century, as scientific advances expanded medical knowledge so much that specialization became a necessity. With advances in care, the medical field also became a money-maker.
Before then, families were pretty much on their own, relying on contradictory advice from books and other sources of information, with marginal success. People struggled to survive, faced with a number of medical dangers, mostly infectious diseases.
Human costs of poor healthcare
Before germ theory, developed in the late 1800s, doctors didn't truly understand how diseases were transmitted. As a result, hospitals were initially established in the U.S. as a method of isolation, to keep infected people away from healthy ones.
According to the University of Pennsylvania School of Nursing, Benjamin Franklin helped found the nation's first treatment hospital, Pennsylvania Hospital, in 1751. Still, most middle- or upper-class people continued to be cared for at their own homes for most of the 19th century, with surgeries even performed at home.
Epidemics claimed scores throughout American history. For instance, a diphtheria epidemic in New England in the early 1700s killed about 2.5 percent of the population, including 30 percent of the children, according to a University of Oregon study. The disease remained a major killer of children, with outbreaks occurring at regular intervals until widespread inoculation of children began in the 1920s.
In the 1800s, the U.S. was subjected to various infectious disease outbreaks. Cholera, now a disease associated with developing countries, ravaged the U.S. from 1830 to 1851 and again from 1866 to 1873, causing thousands of deaths.
The last major outbreak of polio occurred in 1952, when some 52,000 cases were reported, paralyzing more than 21,000 and killing 3,000 others.
"The late 1800s and early 1900s were full of medical advances, from the identification of infectious agents to the development of antitoxins, vaccines and new medical technologies such as X-ray radiography and blood pressure meters," according to the Yale Journal of Medicine and Law.
Between 1865 and 1925, hospitals evolved into expensive, modern facilities as they began serving more paying middle-class patients and were subject to the associated financial pressures, according to the University of Pennsylvania School of Nursing report.
Hospital operating rooms, equipped with advanced technology and skilled personnel, developed into the ideal setting for surgery.
Rise of insurance
A National Public Radio report stated that in 1900, people in the U.S. spent an average of $5 a year on healthcare, the equivalent of $100 in today's money. The low cost of healthcare made health insurance virtually unnecessary.
According to a 1918 Bureau of Labor Statistics survey of 211 families in Columbus, OH, families paid only 7.6 percent of their average annual medical costs toward hospital care.
The chief cost linked with illness was not the cost of the treatment. Rather, the sick suffered a loss of income from not being able to work. Therefore, most people who bought insurance purchased sickness insurance to replace lost income.
The earliest form of health-related insurance was accident insurance for railroad and steamboat travel, offered in 1850 by the Franklin Health Assurance Company of Massachusetts, according to the Yale Journal of Medicine and Law.
As innovations like antitoxins, vaccines, X-rays and blood pressure meters improved the quality of healthcare, the medical profession became more regulated, with governing bodies putting higher standards in place.
In 1904, the American Medical Association created the Council on Medical Education to set standards for medical licensing, and in 1913, the American College of Surgeons was founded, going on to set standards for hospital care.
With rising standards and quality, the American public began to see the value of professional healthcare. Rising consumer demand, coupled with a limited supply of physicians and hospitals, raised prices.
According to a report by Melissa Thomasson of Miami University, published on the Economic History Association website, nationalized health insurance was proposed in the U.S. as early as the 1920s, after many European nations had adopted some form of it. But the plans put forward by the American Association for Labor Legislation never took root in the states where they were proposed, for a number of reasons: demand for health insurance was still low, and physicians, pharmacists and insurance companies all opposed the plans, fearing they would make less money.
One study during the 1920s noted that while healthcare costs were low for urban families when no one was hospitalized, costs ballooned with hospitalization, averaging $261 for families with incomes between $2,000 and $3,000 a year.
The prototypes of prepaid group health insurance began in 1910, with members paying a premium to receive medical service through selected providers, according to the Blue Cross Blue Shield website. Blue Cross plans started at Baylor University in Dallas in 1929 as a nonprofit plan offering prepaid hospital care to a group of Dallas teachers. Within 10 years, enrollment grew from the initial group of more than 1,300 to 3 million.
Blue Shield plans, offering reimbursements for physician care, started in 1930. Physicians crafted the non-profit plans to protect their interests, Thomasson reported, warding off a perceived threat of Blue Cross entering the physician care insurance realm.
The tie between health insurance and employment strengthened after the passage of the National Labor Relations Act in 1935, which, among other things, barred discrimination against union employees. Thus protected, labor unions had the strength to negotiate for health insurance, which began to be included in union contracts as a valuable benefit for workers.
Limited by wartime wage controls during World War II, employers used health insurance as a way to sweeten the pot for potential employees, the Yale Journal of Medicine and Law said, further cementing the tie between employment and health insurance.
Commercial health insurers followed in the footsteps of the successful Blue Cross and Blue Shield plans, catering to the American workforce. They were seen as the best market for health insurance because people who are employed are typically healthier and relatively young, according to Thomasson.
By 1955, 70 percent of Americans had some form of health insurance coverage, up sharply from 10 percent in 1940.
From the 1950s through the 1970s, intensive care units expanded and machines became prevalent in hospitals, according to the University of Pennsylvania School of Nursing.
With these technological advances came the need for more savvy nurses. As a result, nursing education moved from "three-year, hospital-based diploma programs to four-year baccalaureate programs in colleges and universities," the University of Pennsylvania School of Nursing stated.
Medicare and Medicaid
The focus on insuring employed Americans left a gap in insurance coverage for the poor and older Americans. In the 1960s, the government took action to address this need, with healthcare funded from payroll taxes, income taxes, trust fund interest and, for those with Medicare Part B, enrollee premiums.
Medicare and Medicaid passed in 1965, providing a healthcare safety net for the aged and the poor. In 1966, Medicare served 19.1 million senior citizens, and Medicaid served 10 million poor in the U.S.
Medicare expenditures grew with growing enrollment in the 1970s.
An effort to control costs in 1983 led to a change in the way reimbursements were made.
"Instead of reimbursing according to the 'usual and customary' rates, the government enacted a prospective payment system where providers [were paid] according to set fee schedules based on diagnosis," the Yale Journal of Medicine and Law stated. This change helped keep expenditures level until eligibility requirements were broadened in the 1990s.
By 1999, enrollment had grown to 39.5 million people served by Medicare and 37.5 million served by Medicaid. By 2001, Medicare and Medicaid accounted for 32 percent of all U.S. healthcare expenditures.
Laws further refined patients' rights and cost-management protocols, including the Health Maintenance Organization Act of 1973. HMOs were originally meant to be nonprofit organizations that would slow the growth of healthcare costs, but they were rapidly dominated by for-profit companies, the Yale Journal of Medicine and Law stated.
Responding to the quickly growing cost of healthcare, President Bill Clinton proposed a universal healthcare system in 1993, which was rejected by Congress.
However, some reforms were enacted in that decade: the Health Insurance Portability and Accountability Act (HIPAA) of 1996, designed to safeguard patients' privacy, and the Children's Health Insurance Program (CHIP) of 1997, which provides health insurance to moderate-income children whose families don't qualify for Medicaid.
In recent times, healthcare in the United States has ranked as the most expensive among first-world countries.
For instance, basic procedures and office visits cost more in the U.S. In 2012, a routine doctor visit cost an average of $30 in Canada, according to the International Federation of Health Plans, compared with an average of $95 in the U.S. Hospital care, drugs and physicians' services also cost more in the U.S. than in other countries.
Experts tout various theories on why this price inflation exists. An article in The Atlantic blamed the lack of governmental controls and the dependence on for-profit care for ballooning costs.
Matthias Rumpf of the Organisation for Economic Co-operation and Development cited a variety of factors in a PBS interview, including underdeveloped primary care, which harms people's health and raises the price of healthcare to the point that people without health insurance may face financial ruin if they get sick.
"It is far cheaper for a family doctor to check that people are following their treatment properly and that it is appropriate, than for things to go wrong and someone to be admitted to hospital as an emergency," Rumpf said. "Greater attention to the primary care system is urgently needed in the United States."
According to the Department of Health and Human Services, about 15 percent of Americans do not have health insurance. To counter this, the Patient Protection and Affordable Care Act was signed into law in 2010. Among its reforms, it provides state-run and federal-run health exchanges where people may shop for insurance, offers subsidies as an incentive, and ensures that people with pre-existing conditions can obtain coverage.
Copyright 2013 Raycom News Network. All rights reserved.