
OT: Medical Care in the USA


Hey y'all,

 

This article is a reprint from the May/June 1992 issue of the American

Heritage Magazine. It's old .. but it is an interesting historical guide

to the evolution of American medicine. There are no references for the

stats given but I have no way of questioning them .. perhaps the writer

tweaked them a bit to justify his position. Conclusions at the end

appear logical but might not be .. because the article looks like a

well-researched college term paper. Conclusions such as those presented

are often simple and logical recommendations to solve complicated and

totally illogical problems.

 

In any case .. I think it's a long but good read for most folks.

 

Y'all keep smiling. :-) Butch http://www.AV-AT.com

 

 

HOW AMERICA'S HEALTH CARE FELL ILL

 

As modern medicine has grown ever more powerful, our ways of providing

it and paying for it have gotten ever more wasteful, unaffordable, and

unfair. An explanation and a possible first step toward a solution.

by John Steele Gordon

 

Perhaps the most astonishing thing about modern medicine is just how

very, very modern it is. Ninety percent of the medicine being practiced

today did not exist in 1950. Just two centuries ago medicine was an

art, not a science at all, and people—whistling past the graveyard—joked

that the difference between English doctors and French ones was that

French doctors killed you while English ones let you die. Even sixty

years ago there was usually little the medical profession could do once

disease set in except alleviate some of the symptoms and let nature take

its course.

 

When the distinguished physician and author Lewis Thomas was a young

boy, in the 1920s, he often accompanied his physician father on house

calls, and his father would talk with him about the patients he was

seeing and the medicine he was practicing.

 


 

" I'm quite sure, " Thomas wrote years later, that " my father always hoped

I would want to become a doctor, and that must have been part of the

reason for taking me along on his visits. But the general drift of his

conversation was intended to make clear to me, early on, the aspect of

medicine that troubled him most all through his professional life; there

were so many people needing help, and so little that he could do for any

of them. It was necessary for him to be available, and to make all

these calls at their homes, but I was not to have the idea that he could

do anything much to change the course of their illnesses. It was

important to my father that I understand this; it was a central feature

of the profession. "

 

But as Lewis Thomas prepared to enter medical school himself, this

age-old central feature began to fade away. Around 1930 the power of the

doctor to cure and ameliorate disease started to increase substantially,

and that power has continued to grow exponentially ever since.

 

One popular almanac gives a list of milestones in the history of

medicine. The list is eighty items long, stretching back all the way to

2700 B.C., but of those eighty milestones, twenty-nine were achieved in

the last sixty years. In other words, more than 36 percent of medicine's

most noteworthy triumphs have occurred in just the last 1.3 percent of

medicine's history. This new power to extend life, interacting with the

deepest instinctual impulse of all living things—to stay alive—has had

consequences that human society is only beginning to comprehend and to

deal with.

 

Some of these consequences, of course, are trivial. Perhaps the most

trivial is the disappearance of the house call itself. Dr. Thomas, Sr.,

could carry virtually the full armamentarium of the medicine of his day

in a single black bag, and very nearly all of medical knowledge in his

head. Today the general practitioner is a vanishing breed, and

specialists cannot use their time effectively traveling from patient to

patient. Other consequences, however, present some of the most profound

moral dilemmas of our time. Since ancient Greece, doctors have fought

death with all the power and passion at their disposal and for as long

as life remained. Today, while the passion to heal remains as great as

ever, the power has now become so mighty that we increasingly have the

technical means to extend indefinitely the shadow, while often not the

substance, of life. When doctors should cease their efforts and

allow—perhaps, in some cases, even help—death to have its always

inevitable victory is an issue that will not soon be settled, but it

cannot be much longer evaded.

 

Somewhere in between lies the problem of how to pay for modern medicine,

whose costs are rising faster than any other major national expenditure.

In 1930 Americans spent $2.8 billion on health care—3.5 percent of the

gross national product. That amounted to only $23 per person. The

average physician that year earned less than $10,000.

 

In 1990 the country spent 235 times as much on medical care, $666.2

billion. That amounted to $2,566 per person and 12.2 percent of the

gross national product. The average physician that year earned $132,550.

 

Netting inflation out of those figures, the country’s per capita medical

costs have risen ten times in sixty years while the average physician's

earnings have risen less than three times. There is no end in sight.

According to the Health Care Financing Administration, health

expenditures will almost triple in the next ten years if current trends

continue.
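
For anyone who wants to check that arithmetic, here is a minimal sketch, in
Python, that recomputes the nominal multiples from the figures quoted above
and converts them to rough "real" multiples. The cumulative price-level
factor is my placeholder assumption, not a number from the article;
substitute whatever deflator you prefer.

    # Figures quoted in the article (nominal dollars).
    spending_1930_total = 2.8e9        # $2.8 billion, 3.5% of GNP
    spending_1990_total = 666.2e9      # $666.2 billion, 12.2% of GNP
    per_capita_1930 = 23               # dollars per person
    per_capita_1990 = 2566
    physician_income_1930 = 10_000     # "less than $10,000"
    physician_income_1990 = 132_550

    # Nominal multiples implied by those figures.
    total_multiple = spending_1990_total / spending_1930_total          # about 238x
    per_capita_multiple = per_capita_1990 / per_capita_1930             # about 112x
    physician_multiple = physician_income_1990 / physician_income_1930  # about 13x

    # Assumed cumulative rise in the general price level, 1930-1990.
    # This is NOT from the article; it is a placeholder for whatever
    # deflator you choose.
    price_level_increase = 10.0

    print(f"nominal total spending multiple:   {total_multiple:.0f}x")
    print(f"nominal per-capita multiple:       {per_capita_multiple:.0f}x")
    print(f"nominal physician-income multiple: {physician_multiple:.1f}x")
    print(f"real per-capita multiple:          {per_capita_multiple / price_level_increase:.1f}x")
    print(f"real physician-income multiple:    {physician_multiple / price_level_increase:.1f}x")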

 

There are many reasons for this huge and accelerating increase in

medical costs. Consider just the last decade. In the 1980s medical

expenses in the United States increased 117 percent. Forty-three

percent of the rise can be accounted for by the general inflation the

country underwent, mostly in the early years of the decade. Ten percent

can be attributed to population changes, for the American population is

growing both larger and older. Fully 23 percent of the increased cost

went to pay for technology, treatments, and pharmaceuticals that were

simply not available in 1980, a measure of how very fast the science of

medicine is advancing. But that leaves 24 percent of the increase

unaccounted for. Unfortunately that remaining 24 percent is due solely

to an inflation peculiar to the American medical system itself.
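
As a rough illustration of how that decomposition works, the sketch below
applies the four shares to a spending index. The index base of 100 is an
arbitrary choice for illustration; only the percentages come from the
article.

    # The article attributes the 117 percent rise in U.S. medical spending
    # during the 1980s to four sources. This sketch applies those shares
    # to an illustrative index; the base figure is not data from the article.
    shares = {
        "general inflation":           0.43,
        "population growth and aging": 0.10,
        "new technology and drugs":    0.23,
        "medical-sector inflation":    0.24,  # the "unaccounted for" residual
    }
    total_increase = 1.17          # 117 percent over the decade
    base_spending = 100.0          # illustrative starting level (index, 1980 = 100)

    added = base_spending * total_increase
    print(f"spending index rises from {base_spending:.0f} to {base_spending + added:.0f}")
    for source, share in shares.items():
        print(f"  {source:30s} contributes {share * added:5.1f} index points")
    assert abs(sum(shares.values()) - 1.0) < 1e-9  # the four shares cover the whole rise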

 

It is clear that there is something terribly wrong with how health care

is financed in this country. Doctors, who are drowning in paperwork,

don't like it. Insurance companies, which write most of the checks for

medical care, don't like it. Corporations, whose employee medical

benefits now just about equal their after-tax profits, don't like it.

And the roughly thirty-seven million Americans who live in terror of

serious illness because they lack health insurance don't like it.

 

Is there a better way? How do we fund the research that has alleviated

so much human suffering and holds the promise of alleviating so much

more? How should society care for those who cannot provide for

themselves? How do we give them and the rest of us the best medicine

money can buy for the least amount of money that will buy it?

 

Finding the answers to these questions will be no small task and will

undoubtedly be one of the great political battles of the 1990s, for vast

vested interests are at stake. But the history of medical care in this

country, looked at in the light of some simple, but ineluctable,

economic laws, can, I hope, help point the way. For it turns out that

the engines of medical inflation were deeply, and innocently, inserted

into the system fifty and sixty years ago, just as the medical

revolution began.

 

That first medical milestone in the almanac, dating to 2700 B.C., is the

Chinese emperor Shen Nung’s development of the principles of herbal

medicine and acupuncture. (Emperors, it would seem, were more gainfully

employed back then than they are today.) But the practice of medicine

is far more ancient than that. And while rather a different line of work

is often called "the oldest profession," medicine has at least an equal

claim.

 

Even the most technologically primitive tribes today have elaborate

medical lore, often intertwined with religious practices, and there is

no reason to doubt that earlier peoples had the same.

 

It was the Greeks—the inventors of the systematic use of reason that two

thousand years later would evolve into science—who first believed that

disease was caused by natural, not supernatural, forces, a crucial

advance. They reduced medicine to a set of principles, usually ascribed

to Hippocrates but actually a collective work. In the second century

after Christ, the Greek physician Galen, a follower of the Hippocratic

school, wrote extensively on anatomy and medical treatment. Many of

these texts survived and became almost canonical in their influence.

 

After classical times, therefore, the art of medicine largely stagnated.

Except for a few drugs, such as quinine and digitalis, and a

considerably improved knowledge of gross anatomy, the physicians

practicing in the United States at the turn of the nineteenth century

had hardly more at their disposal than the Greeks had had. In some ways

they were worse off, for while the followers of Galen believed in one

theory of disease—they thought it resulted from imbalances among four

bodily " humors " — early-nineteenth-century doctors were confronted with

dozens. One list drawn up as late as 1840 numbered no fewer than

thirty-eight schools of thought, including German Christian

Theosophists, Magnetizers, Exercisers, and Gastricists.

 

Needless to say, whenever there are that many theories, none of them is

likely to be much good. Indeed, partly because of this theoretical

confusion, there were no standards by which to judge the qualifications

of those who entered the profession. Anyone could become a "doctor," and

many did. In 1850 the United States had 40,755 people calling

themselves physicians, more per capita than the country would have as

recently as 1970. Few of this legion had formal medical education, and

many were unabashed charlatans.

 

This is not to say that medical progress stood still in that time.

Indeed, there was more progress than there had been in two thousand

years. The stethoscope was invented in 1816. The world's first dental

school opened in Baltimore in 1839. But the most important advance was

the discovery of anesthesia in the 1840s. Until anesthesia, surgery was

extremely limited, and the surgeon's single most desirable attribute was

speed. The most skilled prided themselves on being able to amputate a

leg in less than one minute. But while anesthesia made extended

operations possible, overwhelming postoperative infections killed many

patients, so most surgery remained a desperate, last-ditch effort.

 

Another major advance of the early nineteenth century was the spread of

clean water supplies in urban areas. This great "public health" measure

actually had at first little to do with public health and much to do

with the rising standard of living brought about by the Industrial

Revolution. But with clean water piped into the cities from distant

reservoirs and waste water disposed of by sewer systems, the epidemics

of waterborne diseases, such as typhoid and cholera, which had ravaged

cities for centuries, rapidly, if largely fortuitously, abated.

 

Still, for all the recent improvements in living standards, death was no

stranger even in the better parts of town. In the 1850s children under

five years of age accounted for more than half of all the mortality in

New York City, and it was common even for people in their prime suddenly

to sicken and die while no one knew why. Every day the newspaper
obituary columns were filled with commonplace tragedies.

 


 

McMahon—On Saturday, January 2 [1869], at half-past six P.M. Mary F.

McMahon, the beloved daughter of William and Sarah F. McMahon, after a

painful illness, aged 2 years, 10 months and 6 days.

 

Darling—Suddenly, on Friday, January 1 [1869], Clarence A., second son

of A. A. and G. W. Darling, aged 11 years and 2 months.

 

Reilly—On Friday, January 1 [1869], Margaret R., the beloved wife of

James W. Reilly, in the 21st year of her age.

 

Then finally, in the 1850s and 1860s, it was discovered that many

specific diseases were caused by specific microorganisms, as was the

infection of wounds, surgical and other. The germ theory of disease,

the most powerful idea in the history of medicine, was born, and

medicine the science was born with it. For the first time it became

possible to study individual diseases systematically, and one by one

their etiologies and clinical courses came to be understood.

 

Still, while there was now a solid scientific theory underpinning

medicine, most of its advances in the late nineteenth and early

twentieth centuries were preventive rather than curative. The fact that

serum from the relatively harmless disease cowpox could immunize a

person against the disfiguring and often deadly smallpox had been

empirically discovered in the eighteenth century by Edward Jenner. Now

Louis Pasteur and others, using their new knowledge of microorganisms,

could begin developing other vaccines. Rabies fell in 1885, and several

diseases once the scourges of childhood, such as whooping cough and

diphtheria, followed around the turn of the century.

 

Pasteur also demonstrated that heating milk and other liquids to a

little below boiling for a certain period killed most of the

microorganisms present in them and helped protect against a number of

diseases, including tuberculosis. When the pasteurization of milk was

widely mandated, beginning around the turn of this century, the death

rate among young children plunged. In 1891 that rate for American

children stood at 125.1 per 1,000. By 1925 it had been reduced to 15.8,

and the life expectancy of Americans began a dramatic rise.

 

In 1900 a newborn child could expect to live only 47.3 years. By 1920

it was 54.1 years, in 1940, 62.9, and by 1950 the average American lived

to the age of 68.2 years. Life expectancy rose more than 44 percent in

fifty years. Still, in the first three decades of this century, the two

great victories over disease were, first, the increasing understanding

of " deficiency diseases, " illnesses caused not by microorganisms but by

diets lacking in what are now called vitamins—a word coined only in

1912—and, second, the discovery of insulin, in 1921. Juvenile diabetes

had once killed swiftly, but with regular injections of insulin it was

possible for patients to live for many years and lead near normal lives.

 

Then a revolution in medicine began, changing it beyond recognition in

less than two decades. This revolution continues, and there is no end

in sight. Actually it is really four revolutions at once, each having a

powerful, synergistic effect on the other three.

 

The first is the pharmaceutical revolution. Sulfa drugs, introduced in

the 1930s, and ever-proliferating antibiotics, which began appearing in

the early 1940s, were able to attack pathogenic organisms within the

body. For the first time doctors had weapons to use against infectious

disease itself, not just against the symptoms such diseases produced.

Dreaded ills like pneumonia, once one of the leading killers of young

and old, now could often be controlled.

 

Vast new categories of drugs, including cortisone, psychotropic drugs,

and synthetic hormones, became available in ensuing decades. Today such

new technologies as recombinant DNA assure that the pharmaceutical

revolution will continue with undiminished vigor. Antibiotics remain

the most important drugs of the new medical era, however. They not only

have had a major impact on medicine but have transformed surgery as well.

 

Although the antiseptic surgical procedures pioneered by Joseph Lister

in the 1870s greatly reduced the incidence of postoperative infection,

they by no means eliminated it. Surgery, while it advanced rapidly

after Lister, remained to some extent a measure of last resort.

 

But now any postoperative infection could be dealt with promptly, and

antibiotics could be given prophylactically to surgical patients to

assure that no infection started. Thus surgeons became much more

willing to operate, and the number of medical conditions that could be

cured or ameliorated by surgery grew swiftly.

 

The heart-lung machine, first used in 1953, opened a great new territory

to surgeons, while lasers, miniaturization, fiber optics, and such

advanced materials as resins, plastics, and ceramics revolutionized

surgical technique, making it far less traumatic for the patient. The

surgeon had once been among the lowest on the totem pole of medical

science (English surgeons to this day are called mister, not doctor),

but the new weapons of surgery changed everything, and surgeons swiftly
became the dashing heroes of modern medicine.

 

The second revolution is in microscopy. The light microscope, in use

since the seventeenth century, can enlarge no more than two thousand

times and essentially cannot produce images in three dimensions. The

electron microscope, widely available by the 1950s, can enlarge more

than a million times. This made it possible to investigate cellular

structure and metabolism in depth for the first time. Our understanding

of the basic machinery of life itself, and of precisely how disease

disrupts that machinery, took a quantum leap forward. The more recent

scanning electron microscope can produce three-dimensional images, and

it has now become possible to see structures at the atomic level and

even, miraculously, to manipulate those atoms one by one.

 

The third revolution is in chemical analysis. Whereas not long ago we

were limited to detecting chemicals at levels of parts per million,

today, thanks to such techniques as radioimmunoassay, gas

chromatography, and mass spectroscopy, our ability to detect chemical

substances has grown by several orders of magnitude. Now substances

both natural and introduced can be detected in body fluids and tissues

at levels of only a few parts per trillion. Our comprehension of

biochemistry, as a result, has become deeper and wider by an equal order

of magnitude.

 

The fourth revolution is in imaging. The stethoscope was the first

technological means doctors could use to learn what was going on within

a living body and thus diagnose disease using information from within.

The X-ray technology developed at the turn of the century increased this

ability considerably. But today CAT scanners, magnetic resonance

imaging, PET scanners, and ultrasound, all technologies made possible

only by computers, yield extraordinarily precise images of physical,

chemical, and electrical processes within the living body. It is no

exaggeration to say that the difference between the images obtainable

from X rays and those from magnetic resonance imaging is the difference

between the projection of shadow puppets and a Steven Spielberg movie.

 

One fundamentally important consequence of the germ theory of disease,

one not foreseen at all, was the spread of hospitals for treating the

sick. Hospitals have an ancient history, and India had an extensive

network of them as early as A.D. 400. Usually operated by charitable or

religious organizations, they were intended for the very poor,

especially those who were mentally ill or blind or suffered from dreaded

contagious diseases, such as leprosy. Anyone who could afford better

was treated at home or in a nursing facility operated by a private

physician.

 

Worse, until rigorous antiseptic and later aseptic procedures were

adopted, hospitals were a prime factor in spreading, not curing,

disease. Childbed fever, for instance, which killed as many as one

woman in five who gave birth in a hospital, was far less prevalent among

those who gave birth at home. Thus, until the last quarter of the

nineteenth century, hospitals were little more than a place for the poor

and the desperate to die, and in 1873 there were only 149 of them in the

entire United States. Fifty years later there were 6,830, and they had

become the cutting edge of both clinical medicine and medical research.

 

But hospitals had a financial problem from the very beginning of

scientific medicine. By their nature they are expensive to operate and

extremely labor-intensive. Moreover, their costs are relatively fixed

and not dependent on the numbers of patients served. To help solve

these problems, someone in the late 1920s had an idea: hospital

insurance. The first hospital plan was introduced in Dallas, Texas, in

1929. The subscribers, some fifteen hundred schoolteachers, paid six
dollars a year in premiums, and Baylor University Hospital agreed to
provide up to twenty-one days of hospital care to any of the subscribers
who needed it.

 

While this protected schoolteachers from unexpected hospital costs, in

exchange for a very modest fee, the driving purpose behind the idea was

to improve the cash flow of the hospital. For that reason the scheme

had an immediate appeal to other medical institutions, and it quickly

spread. Before long, groups of hospitals were banding together to offer

plans that were honored at all participating institutions, giving

subscribers a choice of which hospital to use. This became the model

for Blue Cross, which first operated in Sacramento, California, in 1932.

 

Although called insurance, these hospital plans were unlike any other

insurance policies. Until then insurance had always been used only to

protect against large, unforeseeable losses. But the first hospital

plans didn't work that way. Instead of protecting against catastrophe,

they paid all costs up to a certain limit. The reason, of course, is

that they were instituted not by insurance companies but by hospitals

and were designed, first and foremost, to guarantee a regular cash flow

and generate steady demand for hospital services.

 


 

The result was rather as if the local auto mechanic, in search of a

steady income, had offered, for $10 a month, to make any needed repairs

on your car up to $120 a year. That scheme works well if a windshield

wiper breaks, but it's not much help if your transmission blows up.

 

In the early days of hospital insurance, however, this fundamental

defect was hardly noticeable. Twenty-one days was a very long hospital

stay, even in 1929, and with the relatively primitive medical technology

then available, the daily cost of hospital care per patient was roughly

the same whether the patient had a baby, a bad back, or a brain tumor.

 

Today this " front-end " type of hospital insurance simply does not cover

what most of us really need insurance against: the serious, long-term

illness that can be diagnosed and treated only with very sophisticated

and expensive technology. In the 1950s major medical insurance, which

does protect against catastrophe rather than misfortune, began to

provide that sort of coverage. Unfortunately it did not replace the old

plans in most cases, but instead supplemented them.

 

The original hospital plans also contained the seeds of three other

economic dislocations, unnoticed in the beginning, that have come to

loom large. The first dislocation is that while people purchased

hospital plans to be protected against unpredictable medical expenses,

the plans paid off only if the medical expenses were incurred in a

hospital. Cases that could be treated on either an inpatient or

outpatient basis therefore became much more likely to be treated in the

hospital, in order to take advantage of the insurance. This, of course,

tended to maximize the use of hospitals, the most expensive form of

medical care.

 

The second dislocation was that hospital plans did not provide indemnity

coverage. Again, let's look at auto insurance, which indemnifies the

holder against loss. If a policyholder's car is wrecked, the insurance

company sends him a check for the value of the car, and the insured then

shops for the best way to be made whole. He might have the car

repaired, having looked around for the cheapest competent mechanic. He

might use the money as the down payment for a new car, selling the wreck

for what he can get. He might even put the money in the bank and walk

to work thereafter.

 

But medical insurance provides service benefits, not indemnification. In

other words, the insurance company pays the bill for services covered by

the policy, whatever that bill may be. There is little incentive for

the consumer of medical services to shop around. With someone else

paying, patients quickly became relatively indifferent to the cost of

medical care. This suited the hospitals perfectly because they

naturally wanted to maximize the amount of services they provided and

thus maximize their cash flow, smoothing out their financial problems.

Doctors liked this arrangement for precisely the same reason that

jewelry-store owners like vain women with rich and adoring husbands. If

patients are indifferent to the costs of the medical services they buy,

they are much more likely to buy more of them, even those that are of

marginal utility or duplicative. Why not? After all, the nice thing

about wearing both suspenders and a belt is that your pants hardly ever

fall down.

 

None of this is to libel doctors or hospital administrators, but only to

note their humanity. They pursue economic self-interest just like

everyone else, and arrangements that maximize income are always going to

be looked upon favorably by those whose incomes are maximized. One

result was that the medical profession began to lobby in favor of this

system. In the mid-1930s, for instance, as Blue Cross plans spread

rapidly around the country, state insurance departments moved to

regulate them and force them to adhere to the same standards as regular

insurance plans. Specifically they wanted hospital plans to set aside

reserve funds in order to handle unexpectedly large claims, a necessary

but expensive part of the insurance business.

 

Had hospital plans come to be regulated like other insurance companies,

it is likely that they would have begun acting more like insurance

companies and the economic history of modern American medicine might

have taken a very different turn. But they were not. Doctors and

hospitals, by and for whom the plans had been devised in the first

place, moved to prevent this from happening. The American Hospital

Association and the American Medical Association worked hard to exempt

Blue Cross plans from most insurance regulation, offering in exchange to

enroll anyone who applied and to operate on a nonprofit basis. The

Internal Revenue Service, meanwhile, ruled that these plans were

charitable organizations and thus exempt from federal taxes.

 

Because they were freed from taxes and the need to maintain large

reserve funds, Blue Cross and Blue Shield (a plan that paid physicians’

fees on the same basis as Blue Cross paid hospital costs) soon came to

dominate the market in health-care insurance, with about half the

policies outstanding by 1940. In order to compete at all, private

insurance companies were forced to model their policies along Blue Cross

and Blue Shield lines.

 

Thus hospitals came to be paid almost always on a cost-plus basis,

receiving the cost of the services provided plus a percentage to cover

the costs of invested capital. Any incentive for hospitals to be

efficient and thereby reduce costs vanished. In recent years hospital

use has been falling steadily as the population has gotten ever more

healthy and surgical procedures have become far less traumatic. The

result is a steady increase in empty beds.

 

Did the hospitals shrink in size as a result? Not nearly enough.

Because of the cost-plus way hospitals are paid, they don't compete for

patients by means of price, which would have forced them to retrench and

specialize. Instead they compete for doctor referrals, and doctors want

lots of empty beds to ensure immediate admission and lots of fancy

equipment, even if the hospital just down the block has exactly the same

equipment. The inevitable result, of course, is that hospital costs on

a per-patient-per-day basis have soared.

 

Doctors, meanwhile, came to be reimbursed for their services according

to what were regarded as "reasonable and customary" charges. In other

words, doctors could bill whatever they felt like as long as others were

charging roughly the same. The incentive to tack a few dollars onto the

fee became strong. The incentive to take a few dollars off, in order to

increase what in crasser circumstances is called market share, ceased to

exist. As more and more Americans came to be covered by health

insurance, doctors could no longer even compete with one another for
patients on the basis of price, let alone have an incentive to.

 

The third dislocation that lay hidden in the early hospital plans was

that they paid off for illness but not to maintain health. Imagine an

automobile insurance company writing a policy on a very expensive car,

guaranteeing to pay for any needed repairs by any mechanic of the

owner's choice, regardless of cost, but not requiring the owner to have

the car regularly maintained, inspected, or used in a prudent manner. If

the owner never changed the oil or checked the radiator hoses, if he

added dubious substances to the fuel—or to himself—to improve his

immediate driving pleasure, the company would have to pay for the

eventual results. Needless to say, such a company would be out of

business in short order. But that is precisely the way many health

insurance policies are written even today. The result in today's

high-tech, high-capacity, high-expense medical world is economic lunacy.

 

The cost of bringing a single seriously premature baby out of danger,

for instance, would pay for the prenatal care of thousands, sometimes

tens of thousands of babies and would prevent many such premature

births. But many poor parents cannot get prenatal care, so society

often has to spend as much as a quarter of a million dollars to rescue a

child from a tragedy that could have been prevented in many cases for

one-tenth of one percent of that sum.

 

Most company health plans in this country charge no more to insure

smokers and heavy drinkers than they do their more sensible co-workers.

Couch potatoes are insured for the same amount as their regularly

exercising friends. And because the system is weighted heavily in favor

of acute care, acute care has become where the money is and thus where

doctors tend to concentrate. Surgeons and surgical subspecialists earn

six or seven times as much as primary-care physicians (the family

doctor, in other words) in this country, even allowing for the

difference in skills and training. It is not hard to see why the United

States has more surgeons, per capita, than any other country in the

world and, no surprise, more surgery.

 

During World War II there arose another feature of the American

health-care system with large financial implications for the future:

employer-paid health insurance. With twelve million working-age men in

the armed forces during the war years, the American labor market was

tight in the extreme. But wartime wage and price controls prevented

companies from competing for the available talent by means of increased

salaries. They had to compete with fringe benefits instead, and free

health insurance was tailor-made for this purpose.

 

The IRS ruled that the cost of employee health-care insurance was a

tax-deductible business expense, and in 1948 the National Labor

Relations Board ruled that health benefits were subject to collective

bargaining. Companies had no choice but to negotiate with unions about

them, and unions fought hard to get them.

 

Businesses could now pass on a considerable portion of the cost of

health insurance to the government via the tax deduction. (The deduction

currently costs the federal government about $48 billion a year.) But a

private individual buying his or her own policy could not. Thus a

powerful incentive for employer-purchased health care was built into the

system. By 1950 as many as 54.5 million Americans were covered by

employer-provided health plans out of a population of 150 million.


 

But the company plan increased the distance between the consumer of

medical care and the purchaser of medical care by one more layer. When

individuals have to pay for their own health insurance, they at least
have an incentive to buy the most cost-effective coverage available,
given their particular circumstances. But beginning in the 1940s a rapidly

increasing number of Americans had no rational choice but to take

whatever health-care plan their employers chose to provide.

 

There is another effect of employer-purchased health insurance,

unimagined when the system first began, that has had pernicious economic

consequences in recent years. Insurers base the rates they charge,

naturally enough, on the total claims they expect to incur. Auto

insurers determine this by looking at what percentage of a community's

population had auto accidents in recent years and how much repairs cost

in that community. That is why they charge someone living and driving

in Manhattan far more every year than they would someone living in

Frozen Sneakers, Iowa.

 

This system is known as community rating. Needless to say, the company

also looks at the driver's individual record, the so-called experience

rating. If someone has had four crackups in the last year, he will have

at the least a lot of explaining to do and is likely to be charged a far

higher rate—if he can get insurance at all. Most insurance policies are

based on a combination of the two rating systems.
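
For illustration only, here is a toy sketch of how a premium might blend
the two ratings. The blending formula, the credibility weight, and the
dollar amounts are all my assumptions; the article describes the two
concepts but does not give a pricing rule.

    # A toy illustration of combining "community rating" with "experience
    # rating." The formula and numbers are hypothetical.
    def blended_premium(community_rate: float,
                        experience_factor: float,
                        credibility: float) -> float:
        """Weight an individual's own record against the community average.

        community_rate: average expected annual cost for the rated community
        experience_factor: multiplier implied by the individual's own claims
            history (1.0 means average, 2.0 means twice the average claims)
        credibility: weight between 0 and 1 given to individual experience
        """
        return community_rate * ((1 - credibility) + credibility * experience_factor)

    # A driver with a clean record vs. one with several recent crack-ups,
    # both rated against the same community average of $1,000 a year.
    print(blended_premium(1000.0, 0.8, 0.5))   # clean record -> $900
    print(blended_premium(1000.0, 2.5, 0.5))   # poor record  -> $1,750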

 

But in most forms of insurance, the size of the community that is rated

is quite large, and this eliminates the statistical anomalies that skew

small samples. In other words, a person can't be penalized because he

happens to live on a block with a lot of lousy drivers. But

employer-based health insurance is an exception. It can be based on the

data for each company's employees, which are right at hand.

 

In order to compete with Blue Cross and Blue Shield, other insurers

began to look for companies whose employees had particularly good health

records and offer them policies based on that record at prices far below

what Blue Cross and Blue Shield—which by law had to take all

comers—could offer. This tactic, known in the trade as cherry picking,

concentrated the poor risks in the Blue Cross and Blue Shield pool and

began to drive up their premiums alarmingly. It also meant that small

companies had a much harder time buying affordable health insurance,

because a single employee with a poor health record could send their

experience rating way up. The effects of this practice are clear: 65

percent of workers without health insurance work for companies with

twenty-five or fewer employees. (Another consequence of cherry picking

is that many people with poor health are frozen in their jobs, for they

know they would never get health insurance if they left.)

 

By 1960, as the medical revolution quickly gathered speed, the

economically flawed private healthcare financing system was fully in

place. Then two other events added to the gathering debacle.

 

In the 1960s the federal and state governments entered the medical

market with Medicare for the elderly and Medicaid for the poor. Both

doctors and hospitals had fought tooth and nail to prevent "socialized
medicine" from gaining a foothold in the United States before finally

losing the battle in 1965. As a result of their over-my-dead-body

opposition, when the two programs were finally enacted, they were

structured much like Blue Cross and Blue Shield, only with government

picking up much of the tab, and not like socialized medicine at all.

Medicare and Medicaid proved a bonanza for health-care providers, and

their vehement opposition quickly and quietly faded away. The two new

systems greatly increased the number of people who could now afford

advanced medical care, and the incomes of medical professionals soared,

roughly doubling in the 1960s.

 

But perhaps the most important consequence of these new programs was the

power over hospitals that they gave to state governments. Medicaid

quickly made these governments by far the largest source of funds for

virtually every major hospital in the country. That gave them the power

to influence or even dictate policy decisions by these hospitals. Thus

these decisions were more and more made for political reasons, rather

than medical or economic ones. Closing surplus hospitals or converting

them to specialized treatment centers became much more difficult. Those

adversely affected—the local neighborhood and hospital workers’ unions—

would naturally mobilize to prevent it. Those who stood to gain—society

as a whole—would not.

 

The second event was the litigation explosion of the last thirty years.

For every medical malpractice suit filed in this country in 1969, three

hundred are filed today. This has sharply driven up the cost of

malpractice insurance, passed directly on to patients and their

insurance companies, of course. Neurosurgeons, even with excellent

records, can pay as much as $220,000 a year for coverage. Doctors in

less suit-prone specialties are also paying much higher premiums and are

forced to order unnecessary tests and perform unnecessary procedures to

avoid being second-guessed in court.

 

Because of all this, it followed as the night does the day that medical

costs began to rise over and above inflation, population growth, and

rapidly increasing capability. The results for the country as a whole

are plain to see. In 1930 we spent 3.5 percent of the country's gross

national product on health care. In 1950 it was 4.5 percent; in 1970,

7.3 percent; in 1990, 12.2 percent. American medical care in the last

six decades not only has saved the lives of millions who could not have

been saved before and relieved the pain and suffering of tens of

millions more but also has become a monster that is devouring the

American economy.

 

Is there a way out?

 

One possible answer, certainly, is a national health-care service, such

as that pioneered in Great Britain after World War II. In such a system

hospitals are owned by the government, and doctors and other healthcare

professionals all work for the government on salary. The great

advantage is that no one has to worry about paying for a major illness;

all are automatically protected.

 

But there are a number of disadvantages. Because medical care is free,

the demand for it always exceeds the supply. The only recourse in such

a system is to use the government's monopoly power to ration health care

and to hold down major costs such as professional salaries.

 

Thus in Great Britain people above a certain age who come down with

grave but expensively curable illnesses often are not cured, unless they

pay for their treatments privately. A twenty-year-old with lymphoma is

treated aggressively; a seventy-year-old is made comfortable. Someone

who comes down with an unpleasant but not fatal condition must wait to

reach the top of the list to be treated. Those who need their

gallbladders removed may have to endure a completely fat-free

diet—gastronomic purgatory—for as long as two and a half years while

they await their turn.

 

A second disadvantage is that because it is in the government's interest

to hold down salaries as much as possible, many would-be doctors are

dissuaded from entering the profession in the first place, and many more

emigrate to where remuneration is greater. Great Britain has one of the

lowest rates of doctors per hundred thousand of population of any

industrialized country. And that rate would be lower still if the

United Kingdom did not actively encourage the immigration of doctors

from Third World countries. One consequence is that medical research

suffers as first-class brains move out. Any system that by its nature

retards research in the age of the medical revolution is deeply flawed,

for it is at odds with the long-term interests of humankind.

 

In any event, it is very unlikely that a proposal for a national

health-care system could survive the political interests arrayed against

it in this country. Many other proposals to reform health care,

however, have been advocated in this presidential election year,

including one by President Bush, a sure sign that a serious search is

under way at last.

 

Like chicken soup, most of these proposals won't make matters worse, and

by eliminating cherry picking and other obvious abuses they can

certainly help. But none of them get to the real root of the problem:

the fact that most consumers of medical services in this country don't

pay directly for what they consume, and thus don't care what the medical

services cost.

 

Here is one very simple proposal that would go a long way to fix that.

It costs an American company about $4,500 a year to provide an

employee's family of four with medical insurance having a deductible of

$200. A major-medical policy that had a $3,000 deductible, however,

could be bought by the company for about $1,500. This policy would

protect the family against medical disaster. If the company then put the
remaining $3,000 into a "medical IRA" account for the employee's benefit,
and the employee could spend that money before retirement only on medical
care, the employee would have an immediate, powerful incentive to care
about what medicine costs. Employees would start asking questions.
Doctors and hospitals would have to start answering them.
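
Here is a minimal sketch of the arithmetic behind that proposal, using the
article's own dollar figures. The single-year accounting is my own
illustration of the incentive; the article does not spell out how such an
account would be administered.

    # The "medical IRA" proposal reduced to its arithmetic. The dollar
    # figures are the article's own (circa 1992); the single-year
    # accounting below is an illustrative guess, not the article's design.
    FULL_COVERAGE_PREMIUM = 4_500   # low-deductible family policy, per year
    CATASTROPHIC_PREMIUM = 1_500    # $3,000-deductible major-medical policy
    MEDICAL_IRA_DEPOSIT = FULL_COVERAGE_PREMIUM - CATASTROPHIC_PREMIUM  # 3,000

    def employee_year(medical_spending: float) -> dict:
        """One year for the employee under the proposal."""
        out_of_pocket = min(medical_spending, MEDICAL_IRA_DEPOSIT)
        return {
            "employer cost (unchanged)": FULL_COVERAGE_PREMIUM,
            "spent from medical IRA": out_of_pocket,
            "kept in medical IRA": MEDICAL_IRA_DEPOSIT - out_of_pocket,
            "covered by major-medical": max(medical_spending - MEDICAL_IRA_DEPOSIT, 0.0),
        }

    # A healthy year vs. a year with a serious illness.
    print(employee_year(600.0))
    print(employee_year(25_000.0))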

 

A more elaborate idea, advocated by the Heritage Foundation (no relation

to this magazine), would work as follows. First, individuals rather

than companies should be the primary purchasers of health insurance.

That could be largely accomplished if the federal government eliminated

the corporate tax deduction for medical insurance and provided an equal

deduction—or better yet a tax credit for a portion of it—to individuals.

If companies were required to pass on in the form of salary the money

they had previously spent on health insurance, companies and employees
alike would be left just as well off. The companies, in

fact, would be far better off, for they would be out of the health

insurance-purchasing business, with its mountainous paperwork. The present
system, by contrast, offers the consumer little incentive to shop around
and gives the doctor no reason to take a few dollars off the bill.

 

But in exchange for the tax credit, the government should require

individuals to carry, for both themselves and their dependents, health

insurance of the major-medical variety that protects against catastrophe

and to note the companies and policy numbers on their 1040 forms. For

those with low incomes the tax credit should be refundable, so if the

credit is larger than the income-tax liability, the person gets a check

for the difference. For the very poor the government should pay for the

major-medical insurance and also provide primary-care insurance,

designed to emphasize preventive medicine. Because public money would then
follow patients rather than flow directly from state budgets to hospitals,
this would also take away the power of state governments to dictate
hospital policy for political purposes.

 

With individuals, instead of companies, buying insurance policies,

insurance companies would have to cater to individual needs and persuade

individuals to " self-insure " against the relatively minor medical

expenses that are now paid for with " front-end " policies. This could

have two important and immediate effects. First, it would make

individuals much more conscious of preventive medicine. It is now more

true than ever that an ounce of prevention is worth a pound of cure.

Giving people a reason to act on that truism would save a lot of money

and improve both the quality and the length of human life. The second

effect would be to make people far more conscious of the relative costs

of alternative remedies. Doctors and hospitals, at long last, would have

a powerful incentive to find ways to cut costs; they would have to in

order to maintain their patient loads. The enormous waste and

duplication now built into the system would begin to be wrung out.

 

A third reform would be to require insurance companies to indemnify

policyholders rather than pay service providers. The point, again, is

to encourage " shopping around. " To be sure, the laws of the marketplace

cannot work perfectly as a cost-control mechanism for the provision of

medical services. And certainly someone with a heart attack or a

severed major artery is in no position to drive a hard bargain or

indulge in comparison shopping. But emergency medicine amounts to only

about 15 percent of the medical services rendered in this country. The

other 85 percent of the time the marketplace could powerfully influence

the cost of medical care.

 

There would also be a great benefit to doctors if insurance companies

paid patients rather than them. Thousands of tons of paperwork would

disappear, as would the increasing tendency of insurance companies to

second-guess doctors in the hope of holding down costs. Doctors could

go back to healing the sick instead of clerking for insurance companies.

 

Adopting such reforms would certainly not produce a medical utopia. No

matter how well things went, there would still be those who could have

been cured and were not. The poor and weak would still be more likely

than the rich and powerful to fall into the cracks of the system. But

to insist on a perfect system in an imperfect world is futile. And

there would be many short-term winners and losers as the reforms were

phased in and began to have their effect. Good hospitals and doctors

would prosper as never before while mediocre ones would suffer. There

would undoubtedly be a rich harvest of horror stories on the local news.

 

But when the dust settles, American medicine would be much more

economically efficient. Thus there would be that much more money to

cure the sick, comfort the dying, and search for the medical miracles

that lie ahead in undoubted abundance. The cost of the economic

inefficiency built into the medical system five and six decades ago was

at least $66 billion in 1990 alone. That money today is simply being

wasted. That is why it is not only in our national self-interest to

find a better, more economically efficient system of health care but also
our duty as moral beings.

 

John Steele Gordon’s article on the savings and loan crisis appeared in

the February/March 1991 issue.

 

Copyright 2006 American Heritage Inc. All rights reserved.
