Don’t Thank Big Government for Medical Breakthroughs

New cures come from private research, not cash dumped into the National Institutes of Health.


Americans who want better treatments for their diseases should be pleased that the lame-duck Congress passed the 21st Century Cures Act, which will promote medical innovation. They should be wary, however, of the $4 billion budget boost that the law gives to the National Institutes of Health.

The assumption seems to be that the root of all medical innovation is university research, primarily funded by federal grants. This is mistaken. The private economy, not the government, actually discovers and develops most of the insights and products that advance health. The history of medical progress supports this conclusion.

Few findings in medical science significantly improved health until the late 19th and early 20th centuries. During that period came breakthroughs such as anesthesia and antisepsis, along with vaccines and antibiotics to combat infectious diseases. The discovery of vitamins and hormones made it possible to treat patients with deficiencies in either category.

In America, innovation came from physicians in universities and research institutes that were supported by philanthropy. Private industry provided chemicals used in the studies and then manufactured therapies on a mass scale.

Things changed after World War II, when Vannevar Bush, who had led the U.S. Office of Scientific Research and Development during the war, persuaded Congress to increase federal subsidies for science. The National Institutes of Health became the major backer of medical research. That changed the incentives. Universities that had previously lacked research operations suddenly developed them, and others expanded existing programs. Over time these institutions grew into what I call the government-academic biomedical complex.


Since then, improvements in health have accumulated. Life expectancy has increased. Deaths from heart attack and stroke have radically decreased, and cancer mortality has declined. New drugs and devices have ameliorated the pain and immobility of diseases like arthritis. Yet the question remains: Is the government responsible for these improvements? The answer is largely no; if anything, Washington-centric research may slow progress.

Many physicians have never lacked motivation to develop treatments for diseases. But the government-academic biomedical complex has recruited predominantly nonphysician scientists who value elegant solutions to medical puzzles—generally preferring to impress their influential peers rather than solve practical problems. Vannevar Bush believed that basic research, unrelated to specific ends, was the best approach to scientific progress. How something works became more important than whether it works. Aspirin, for example, came into use even though researchers weren’t sure exactly what made it effective. That approach would never work today. Instead of the messy work of studying sick patients, scientists now prefer experimenting with inbred mice and cultured cells. Their results accrue faster and are scientifically cleaner, but they arguably are less germane to health.

Practical innovation requires incremental efforts. But the reviewers of grant applications for medical research are obsessed with theory-based science and novelty for novelty’s sake. They find incrementalism mundane. Consistent with that attitude, a 2003 review published in the American Journal of Medicine found that of more than 25,000 publications in prominent biomedical journals, only 100 even mentioned a medically relevant application of the research.

Academic administrators, operating under the delusion that government largess would grow forever, have become entitled. But since the 1980s, funding for the National Institutes of Health has lagged far behind the growth of an aging population in need of medical innovation. The extra $4 billion in the 21st Century Cures Act will have little effect on that financial gap.


Today, researchers compete for government grants at ever shorter intervals and with diminishing odds of success: Fewer than 1 in 5 grant applications is funded. This discourages risk taking.

By contrast, private investment in medicine has kept pace with the aging population and is the principal engine for advancement. More than 80% of new drug approvals originate from work performed solely in private companies. Such approvals come, on average, 16 years after clinical trials begin, and the process typically costs $2.5 billion from start to finish. Even if grant-subsidized academics wanted to create a new drug, economic reality prevents it.

Despite its exaggerated role, basic research in universities does advance human knowledge, train scientists, and contribute to medical advances—albeit uncommonly and inefficiently. But the system is unsustainable. A better approach would be to encourage academics to join with industry, where the financial resources and drive to innovate reside. Unfortunately, the biomedical complex demonizes corporations. If academic institutions stopped demeaning the activities needed to develop medical products, industry might take a greater interest in supporting their research.

Great advances in health care have been made, but there are still important challenges, from obesity to dementia. One step toward addressing them would be for Washington to adopt the right approach to medical innovation—and to stop simply throwing money at the current inefficient system.

Dr. Stossel, a visiting scholar at the American Enterprise Institute and professor emeritus at Harvard Medical School, is author of “Pharmaphobia: How the Conflict of Interest Myth Undermines American Medical Innovation” (Rowman & Littlefield, 2015).
