The Truth About Health Care for Veterans

I’m always looking for new (or unnoticed) healthcare blogs that might interest HealthBeat readers. Recently, I discovered “What If” (America Had a HealthCare System That Worked).

The blog is run by Georgia Berner (founder and director) and Emily Cleanth (content manager/researcher). Berner, who owns a small-to-medium-size company in Western PA where she pays the entire cost of health insurance for her 60 employees, has become familiar with the inequities in our system both as an employer and by talking to voters while running for U.S. Congress in 2006. Cleanth has a Master’s in Public Policy and Management from Carnegie Mellon.

Not long ago, “What If” took a look at healthcare for veterans. I have reprinted the post below.

I would add only that, since 2000, funding for the VA system has fallen far behind the needs of returning troops and veterans. In the 1990s, the VA was overhauled and became a very good health care system; I’ve written about it here. (The VA should not be confused with Walter Reed hospital, which is run by the Army; the VA health system is run by the Department of Veterans Affairs.) But over the past eight years, funding has not kept up with the needs of badly wounded vets returning from Iraq. Meanwhile, Vietnam vets are aging. This has led to impossibly long lines and, in some cases, has meant that the VA has not been able to hire and retain the medical staff it needs.

“What If” points out that conservatives have protested increased funding for the VA, pointing to an explosion in “entitlement programs.” Like Berner and Cleanth, I believe that veterans are fully “entitled” to timely, high-quality care.

I also agree that the cost of healthcare for wounded troops should be included in the cost of the war. Why doesn’t it show up as part of the total cost now? “Because,” they point out, “it essentially doubles the cost of the war in Iraq.”

Shellshocked: Veterans Health Care

Originally posted on “What If” . . . (America Had a HealthCare System That Worked)

According to polling in the past few months, the biggest issues troubling Americans are health care and the war in Iraq. What gets talked about less often is the point where these two issues intersect . . .

Around 12% of the 47 million uninsured people in the United States are veterans or members of their families; veterans themselves account for 1.8 million of the uninsured. These uninsured veterans—roughly 1 in 8—are typically 45-year-old men who worked in the past year and are earning from $30,000 to $40,000. Almost two-thirds of uninsured veterans were employed, and nearly 9 out of 10 had worked within the past year.

Why are they  uninsured?

Defense Department data released in late 2007 show that thousands of National Guard and Reserve members who had to give up civilian jobs when they were deployed overseas have now permanently lost those jobs, and with them their health insurance, pensions, and other benefits. (Federal laws are supposed to protect them from being penalized for leaving civilian employment for wartime service.)

Continue reading

Can Big Tobacco Snuff Out Health Care Reform?

On Monday the New York Times ran a nice story detailing how Massachusetts is the newest in a long line of states hoping to fund health care initiatives by raising tobacco taxes. The report notes that “bills to raise tobacco taxes have been active in 22 state legislatures in 2008, according to the Tobacco Merchants Association, a trade group. That follows a year in which 11 states enacted increases, according to the National Conference of State Legislatures.” In other words, taxing cigarettes to fund health care reform is an increasingly popular strategy amongst policymakers across the nation.

Big Tobacco is not happy about this. The industry is putting a lot of time, effort, and money into snuffing out health care reform proposals—at both the state and national level—that rely on tobacco tax hikes for funding. In doing so, tobacco companies are torpedoing one of the few politically feasible strategies for raising funds needed to pay for reform.

Consider California. The Times notes that the state’s recent bipartisan plan for instituting universal health care, endorsed by Republican Gov. Arnold Schwarzenegger and Democratic Assembly Speaker Fabian Núñez, “died in the State Senate in January partly because of opposition to the $1.50-a-pack increase it included.”

This wasn’t the first time cigarette taxes have been an issue in California. In 2006, California voters turned back a ballot initiative, Proposition 86, that proposed to increase the cost of a cigarette pack by $2.60. Supporters of the proposition estimated proceeds from the tax at $2 billion—which would have been used to help fund health care reforms—and forecast a $16.5 billion long-term decline in health care costs thanks to reductions in smoking. Good stuff.

Continue reading

Health Care Reformers Debate the Road to Universal Coverage, Part I

Can we reach a consensus on what we need to do to achieve meaningful health care reform in the U.S.?

This week, I have been mulling over The American Prospect’s May 2008 Special Report, “The Path to Universal Health Care.” At first glance, it might seem that the eight articles in the report take eight different roads to reform. But I’m glad to see agreement on many pivotal points.

Yet there are still major issues that could divide reformers: Should we acknowledge that we won’t be able to cover everyone unless we learn to “control costs”? Should we move directly to a single-payer system? And finally, should we try to move quickly to cover everyone, or should we aim for incremental progress while sticking, stubbornly, to first principles?

In the months ahead, I think it is crucial that would-be reformers try to hash out their differences on these issues and unite under a single banner. Only then can we divide opponents who have billions invested in preserving the status quo.

With that in mind, I decided to weave together some of the strongest insights in the Report—focusing on recurring themes—while also addressing the areas where reformers remain divided.

First, as I wrote two weeks ago on TPM Café, the high and rising cost of health care may be the greatest obstacle to health care reform. In the American Prospect report, Ezra Klein captures the problem with brilliant simplicity at the very beginning of his piece: “If health insurance were cheap, we could all buy it. If universal health care could get 60 votes in the Senate, we’d all have it. But these two imperatives—the need to control costs and the need to attract the 60 Senate votes required to overcome a filibuster—point in opposite directions. This is the central paradox of health reform.”

Congressmen are loath to vote for a plan that would rein in spending, for two reasons. First, they fear that if they call for “cost control,” voters will hear “rationing.” Second, they know how lobbyists will react to any attempt to cut the waste in our bloated healthcare system. One man’s hazardous waste is another man’s income stream.

Continue reading

Evidence-Based Mental Health Treatments: Lost in Translation

Earlier this month I attended a conference sponsored by the MacArthur Foundation spotlighting the intersection of mental health and public policy. In introductory comments for the event, Howard Goldman, Professor of Psychiatry at the University of Maryland School of Medicine and director of the MacArthur Foundation’s Network on Mental Health Policy Research, called mental illness “an overlooked crisis.” He’s right. But contrary to what you might expect, it’s not just the public that overlooks mental health. Medical practitioners are often slow to adopt well-researched, proven mental health interventions—because they’re rarely profitable.

This is bad news for America, because mental illness is a big problem. Goldman noted that 38 percent of Americans who receive Social Security Disability Insurance have a mental disorder, as do more than 200,000 adults in prison and 30 percent of the nation’s homeless.

But mental illness isn’t just an issue for the have-nots. Jon Fanton, Ph.D., President of the MacArthur Foundation, addressed the conference after Goldman and offered some compelling numbers on the fiscal impact of poor mental health. Every year, said Fanton, mental illness amounts to $80 billion in indirect costs (lost productivity due to illness, premature death, and losses for incarcerated individuals and for individuals providing family care) and another $99 billion in the direct cost of providing care. Fully half of children with mental illnesses drop out of school.

For these reasons, said Fanton, the “interests of those in trouble are not in contrast to the interests of society and or the rest of us”—though “we tend to think…[that] the opposite is actually true.” In the end, mental health is a very public concern.

Continue reading

Just How Secure Is Your Employer-Based Health Insurance?

Last week, the Economic Policy Institute released a disturbing report revealing just how many white-collar workers have lost their employer-based health insurance in recent years—even though they didn’t change jobs.

Many workers believe that if they hold onto their job, their insurance is safe. Professionals with jobs near the top of the occupational ladder are especially likely to assume that their employer is not going to cut their coverage. That may well have been true in the 1990s, when the job market was tight—but not today.

The EPI report shows that in just the first six years of this century, the share of U.S. workers with employer-provided health insurance (EPHI) fell from 51.1 percent to 48.8 percent. Moreover, workers in white-collar occupations—including executives, managers, and workers in professional specialties—were just as likely as blue-collar workers to lose their safety net.

Perhaps this shouldn’t come as a surprise, since employers typically pay a much larger share of premiums for higher-income employees, as I discussed on HealthBeat last month. So as insurance premiums soar (up 78 percent since 2001), employers are beginning to chafe under the very costly burden of providing first-class benefits to white-collar employees. (Insurance premiums rose “only” 6.1 percent in 2007, but going forward, experts expect sharper increases because the cost of medical technology continues to skyrocket.)
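The arithmetic behind those growth figures is worth a quick sanity check. Here is a minimal sketch: the 78 percent cumulative rise and the six-year window (2001–2007) come from the paragraph above, while the $7,000 starting premium is a hypothetical round number, not a figure from the report.

```python
# Illustrative premium arithmetic based on the figures quoted above.
# The $7,000/year starting premium is a made-up round number.
base_premium = 7000.0

# A 78 percent cumulative rise since 2001 means premiums multiplied by 1.78.
premium_now = base_premium * 1.78
print(f"Premium after a 78% cumulative rise: ${premium_now:,.0f}")  # $12,460

# The implied average annual growth rate over the six years 2001-2007:
annual_rate = 1.78 ** (1 / 6) - 1
print(f"Implied average annual growth: {annual_rate:.1%}")  # 10.1%
```

Note that the implied average annual increase (about 10 percent) is well above the 6.1 percent rise reported for 2007 alone, which is why a single calmer year does little to ease employers’ cumulative burden.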

Most employers will just shift more costs to employees in the form of higher co-pays and deductibles. But some will decide that they cannot continue to offer insurance.

Continue reading

Why We Don’t Need More Doctors

In a society hooked on growth, “enough” just isn’t part of our vocabulary. Thus, the latest issue of The New England Journal of Medicine reports: “Despite the fact that there are now more physicians per capita in the United States than there have been for at least 50 years, the Council on Graduate Medical Education (COGME) recently predicted a 10% shortfall of physicians by 2020…The Association of American Medical Colleges has responded with calls for a 30% expansion of U.S. medical schools and a lifting of the current cap on Medicare funding for graduate medical education so that federal dollars can support the expansion of the workforce.”

Do we really need more doctors? The boomers are aging, and we’ve been told that this will lead to a huge spike in the need for health care. But as I explained just last month, the boomers will age, just as they were born: gradually, over decades.

[Exhibit 4]

And while the boomers are, indeed, a demanding group, the fact is that many boomers have taken quite good care of themselves. It all began twenty-five or thirty years ago, when they quit smoking and switched from red meat to fish, from scotch to white wine, from tanning to jogging. In the next decade or two, they just won’t need as much healthcare as their parents did at the same age.

Moreover, as the article’s authors, Dr. David C. Goodman and Dr. Elliott S. Fisher, point out: “Physician supply varies dramatically by region of the country. COGME is concerned about a 10% shortfall at a time when the regional supply of physicians varies by more than 50%.” In other words, while more doctors may be needed in some states, in other places we have more than enough, thank you.

Continue reading

Health Wonk Review

This week, Health Beat is hosting “Health Wonk Review,” a biweekly round-up of the best of health policy blogs. Below, snapshots of posts that we found particularly interesting.

–Maggie Mahar and Niko Karvounis

News about Docs

In the past, Roy Poses has posted on Health Care Renewal (here and here) about the little-known fact that medical schools often fail to pay or otherwise reward faculty to actually teach.

Poses, who has a sharp nose for the ironies of our healthcare system, asks a fair question: “Why are medical school faculty expected to teach in their spare time, and spend their working hours …bringing in large amounts of what is euphemistically called ‘external support’” (a.k.a. $$$$) — while faculty in other schools are actually paid for teaching and other academic activities?

In his most recent post on the topic, Poses points to a story from the (Tucson) Arizona Daily Star reporting that University of Arizona Medical College Faculty are “On the Verge of Desperation.”

“Maybe it has something to do with having to do 10 hours a day of clinical work to bring in ‘external funds,’ and then being expected to teach,” Poses speculates. It turns out that U. of Arizona medical faculty are actually supposed to do three things: provide high-quality teaching, care for a full load of patients (thereby bringing in the money), and build a competitive research program—all at the same time.

“This looks like another case of mission-hostile management at a well-known medical school,” Poses observes, “albeit one that is probably representative of problems around the US.” 

Offering us a glimpse of how medical students see the world, Scott Shreeve of Crossover Health grapples with the crisis in primary care. Speaking first-hand from his experience as a medical student at the University of Utah, Shreeve notes that there were efforts to convince students that primary care was the way to go, but that “both the message and the messengers were unconvincing.”

Why? Because, as many of us tend to forget, doctors are what economists call “rational actors” just like the rest of us. (Other social scientists have some doubts on this matter, but that’s another post).

According to Shreeve, decisions about specializing “came down to what specialty can provide the best outcome in terms of attaining the quality of life, financial security, and career stability students desire at a price they are willing to pay in terms of years of training, lifestyle, and financial considerations.”

“In the end,” says Shreeve, “the current financial system we have in place creates overwhelming incentives to go into a specialty.”

What Shreeve says about weighing lifestyle issues against costs makes sense. But is anyone else distressed that having an intellectual interest in a certain part of the body (the brain, for example), a particular disease (cancer, perhaps), or being drawn to a particular type of patient (children, for instance, or the elderly) never seems to come up as a factor in the decision-making process?

Continue reading

Cost-Shifting, Gouging and Peddling False Hopes

No doubt some of you saw Monday’s story in the New York Times reporting that health insurance companies have begun forcing very sick patients to pay as much as 20 percent to 33 percent of the cost of some of the priciest drugs on the market. As a result, “patients may have to spend more for a drug than they pay for their mortgages, more, in some cases, than their monthly income.”

But this isn’t just a tale about insurers letting patients down. As with so many health care stories, there is more than one villain in this piece, and plenty of blame to spread around.

First, let me recap the background to the Times’ report. Normally, insured patients are not required to shell out a percentage of what a drug costs. Instead, they just chip in a flat co-pay of $10, $20, or $30, depending on whether the insurer lists the drug on tier 1, tier 2, or tier 3. But now insurers have invented a new category—tier 4—for hundreds of super-expensive drugs used to treat diseases like multiple sclerosis, hepatitis C, and some cancers. Since some of these drugs can cost $100,000 or more a year, a 33 percent co-pay can add up to $33,000 or more.
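The tier arithmetic above can be sketched in a few lines. The flat co-pays, the 33 percent coinsurance rate, and the $100,000 annual drug cost are the figures quoted in the post; the once-a-month prescription-fill assumption for the flat-co-pay tiers is mine, for illustration only.

```python
# Flat co-pays for tiers 1-3 versus percentage coinsurance for tier 4,
# using the figures quoted in the post above.
FLAT_COPAYS = {1: 10.0, 2: 20.0, 3: 30.0}  # dollars per prescription fill
TIER4_COINSURANCE = 0.33                   # patient pays 33% of the drug's cost

def patient_cost(tier: int, annual_drug_cost: float) -> float:
    """Annual out-of-pocket drug cost for a patient, by formulary tier."""
    if tier in FLAT_COPAYS:
        # Assume one prescription fill per month on the flat-co-pay tiers.
        return FLAT_COPAYS[tier] * 12
    return TIER4_COINSURANCE * annual_drug_cost

# A $100,000-a-year specialty drug placed on tier 4:
print(f"${patient_cost(4, 100_000):,.0f}")  # $33,000
# The same drug on tier 3 would cost the patient only a flat co-pay:
print(f"${patient_cost(3, 100_000):,.0f}")  # $360
```

The point of the sketch is the scale of the jump: moving a drug from tier 3 to tier 4 multiplies the patient’s annual bill nearly a hundredfold, with the insurer’s share shrinking accordingly.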

Insurers began this practice with the infamous Medicare Advantage plans that we wrote about on HealthBeat the other day. Eighty-six percent of Medicare Advantage (MA) plans now require whopping co-pays for tier 4 drugs, so read the fine print before signing up for an MA policy. And now, tier 4 is showing up in insurance policies that people under 65 buy on their own, or through their employers. This is the fastest growing segment in private insurance, Dan Mendelson of Avalere Health, a research organization in Washington, told the Times.

Continue reading

The Difficulty of Persuading Physicians to Change the Way They Practice Medicine Based On Medical Evidence (Or, Why Ignaz Semmelweis Went Mad)

The first controlled medical trial in world history was staged in 1601 by a British ship captain named James Lancaster, reported Professor Ajit Lalvani, head of the Infectious Disease department at Imperial College Healthcare in London, at the World Health Care conference last month.

Four ships were traveling from England to India around the Cape of Africa, Lalvani explained, when Lancaster decided to give the sailors on his ship three teaspoons of lemon juice every morning. No doubt Lancaster was acting on reports that had come from Sir Richard Hawkins during a voyage to the South Pacific, eight years earlier, when Hawkins had observed that the most effective protection against scurvy appeared to be “sour oranges and lemons.”

As the chart below shows, the hypothesis proved correct: by the end of the voyage, 40 percent of the sailors on the three ships where no one received vitamin C had come down with scurvy and perished. On Lancaster’s ship, no one died of the disease.

[Chart: scurvy deaths on Lancaster’s 1601 voyage]

Continue reading

The High Cost of Medicare Advantage

This post was written by Maggie Mahar & Niko Karvounis

On Monday, the Bush Administration announced that next year payments to private insurers who offer Medicare to seniors will rise by 3.6 percent. This is a mistake. The last thing that the Medicare Advantage (MA) program needs is more money thrown at it. Indeed, MA has turned out to be a money-eating monster—in large part because the government gave it a blank check when the program was born, under the cover of darkness, in 2003.

It’s worth pausing to remember this breech birth. The Medicare Prescription Drug, Improvement, and Modernization Act (also known as the Medicare Modernization Act) came to the House for final approval at 3:30 a.m. on November 22, 2003. It was losing, 219-215, until the House leadership, in a very unusual move, held the vote open for hours while the leaders twisted arms. At 5:50 a.m. the legislation passed the House, 220 to 215.

Representative Nick Smith later claimed that he was offered campaign funds for his son, who was running to replace him, in return for changing his vote from "nay" to "yea." He subsequently recanted this statement. Nevertheless, the House Ethics Committee and the FBI launched investigations into whether members of the House had in fact offered Smith a bribe to vote for the measure.

In October 2004, the Committee issued its report, revealing that “Majority Leader Tom DeLay admitted that he offered to endorse Smith’s son Brad, who was running for Congress at the time, in exchange for Smith’s ‘yea’ vote on the Medicare bill,” though the investigation couldn’t find out who offered Smith the money.

Continue reading