The Danger of For-Profit Hospices

Maggie recently wrote about a MedPAC decision aimed at stopping for-profit hospices from purposefully keeping patients under their care for extended periods of time in order to bill Medicare for more days of service. Medicare’s concern that the hospices might be bilking the system raises a larger question: should we be worried that so many of today’s hospices are for-profit?

The short answer is “yes,” and at the very least, we should be giving them—and hospices in general—more attention. Hospices play a bigger role in our health care system than ever before. In 2005, hospices cared for 1.2 million patients, and one-third of Americans who passed away that year did so under hospice care. According to the National Hospice and Palliative Care Organization (NHPCO), hospice admissions are rising at a rate of almost 10 percent a year.

The fastest growing segment of the hospice industry is—you guessed it—for-profit hospices. Between 1994 and 2004, the number of for-profit hospices in the US increased nearly fourfold, growing more than six times faster than the number of nonprofit hospices. According to industry estimates, for-profit hospice programs now care for about 35% of hospice patients, versus a mere 9% in 1990 (today, nonprofit groups care for 56% and the government and other types of organizations care for the remaining 9%). Little wonder, given that there’s so much money to be made in the industry: Medicare reimbursement for hospice care has grown from $68.3 million in 1986 to $8.3 billion in 2005 and is expected to hit a whopping $45.6 billion by 2030.

The bad news is that, in their quest for Medicare dollars, for-profit hospices don’t provide all the care they should in order to fulfill the hospice mission of maximizing patients’ quality of life. In fact, a 2004 Medical Care study of 2,080 patients enrolled in 422 hospices across the country found that “terminally ill patients who receive end-of-life care from for-profit hospice providers receive a full range of services only half the time compared with patients treated by nonprofit hospice organizations.” That’s because for-profit hospices like to keep costs low by skimping on services, particularly so-called “non-core” services like medications and personal care (how these count as “non-core” I don’t know—they seem pretty important to me—but there you go). For example, families of patients receiving care from a for-profit hospice received counseling services, including bereavement counseling, only 45 percent as often as those in a nonprofit hospice. Translation: when researchers controlled for differences across patients, illnesses, and conditions, those at for-profit hospices were only about half as likely to get the support routinely provided at nonprofit hospices. A 2005 follow-up study confirmed that for-profit patients receive a “narrower range of services” than nonprofit patients.

Given these results, the senior author of both studies, Dr. Elizabeth Bradley of Yale, concludes that “for-profit hospices…might not be as strongly rooted in…[the]…traditional hospice philosophy” of “psychosocial support, spiritual care, the use of volunteers and family, and symptom management” as their nonprofit counterparts.

Continue reading

The War Against Tobacco Slows

This post was written by Maggie Mahar and Niko Karvounis

2007 marked the first time in 50 years that less than 20% of Americans smoked.  This is the good news. The bad news is that, just as the battle against smoking has entered what may be its most critical, final phase, support for that battle has waned among policymakers—even though the problem is far from solved.

Tobacco use, especially cigarette smoking, continues to be the leading cause of preventable disease and death in the United States. It is blamed for 435,000 premature deaths in this country each year, and it adds more than $75 billion to annual spending on health care, according to the federal Centers for Disease Control and Prevention.

Consider the raw numbers: in 2007, an estimated 19.8% (43.4 million) of US adults were still smoking cigarettes; of these, 77.8% (33.8 million) smoked every day, and 22.2% (9.6 million) smoked some days.  That’s a lot of smoke.

Break down the demographics and you find stark patterns. Smokers are likely to have less education than other Americans: CDC research has found that adults who have a GED diploma (44.0%) and those with 9–11 years of education (33.3%) are most likely to use tobacco.  Americans with an undergraduate or graduate degree are least likely (11.4% and 6.2%, respectively). Poorer people also are more likely to smoke: 33% of U.S. adults living below the poverty level are smokers while only 23.5% of those living above that level still light up.

Given how expensive cigarettes are these days, these are striking statistics. Why do low-income people smoke? Medical research shows that being poor is extremely stressful. You have less control over your life and must cope with much more uncertainty: Will you be able to pay your rent? What will you do if you lose your job? Are your children safe walking home from school?  As anyone who has ever been addicted to tobacco knows, being anxious makes you reach for a cigarette.

Military veterans under the care of the Department of Veterans Affairs (VA) health care system are also more likely to smoke than other Americans. Indeed, a 2004 report titled “VA in the Vanguard: Building on Success in Smoking Cessation” points out that “the prevalence of smoking is approximately 43 percent higher” among these veterans than in the general population.  “Many Americans who may have never smoked prior to their military service began smoking while in the service,” the report observes.  In the past, “ ‘Smoke ‘em if you‘ve got ‘em’ was a common command, and in many cases was even encouraged as it was thought to help keep soldiers alert and awake—or to help them cope with the tedium of waiting while on watch and the stress of combat.”

Continue reading

The NIH: Past, Present, and Future

Like so many other federal agencies, the National Institutes of Health (NIH) has struggled under the Bush Administration, and today it needs to be revitalized. Not long ago, I wrote about what we can expect for the FDA when President-elect Obama comes to office; now I’d like to turn to the NIH. But to understand the agency’s future, one needs to recognize its recent past.

The Bush Years: Starving the Beast

From 1998 to 2003, the NIH enjoyed a golden age. Over that span, the agency’s budget doubled to $27 billion, an increase that Harvard University president Drew Faust has called a “transformative force for biomedical research.” But since 2003, NIH funding has remained essentially flat and, when adjusted for inflation, it has actually declined.

This has caused concern within the medical research community. While 10 percent of the agency’s budget funds in-house research, a whopping 85 percent supports biological and medical research at universities and medical centers. When the NIH has less money, it has less money to give—and more researchers on the cusp of biomedical breakthroughs miss out on the funding they need.

Last year, the Group of Concerned Universities and Research Institutions (GCURI)—an association of seven top-tier universities including Harvard, Duke, Johns Hopkins, and Brown—issued a report arguing that reduced funding for NIH means “slowing the pace of medical advances, risking the future health of Americans, discouraging [the country’s] best and brightest researchers, and threatening America's global leadership in biomedical research.”

Indeed, as the NIH budget has shrunk in real terms, researchers have had a harder time securing grants: according to GCURI’s report, the agency funded 32 percent of proposed research projects in 1999, but only 24 percent in 2007. Researchers who are awarded NIH grants also have to jump through more hoops than they did in the past. In 1999, 29 percent of grant proposals were approved upon first submission; in 2007, only 12 percent of projects were given the same first-time approval. These days, 88 percent of researchers who end up with NIH funding do so only after applying multiple times. According to GCURI, “this trend represents a clog in the system that is causing researchers to abandon promising work, downsize labs, and spend more time searching for other financial support. Meanwhile,” the report continues, “Americans wait longer for cures.”

There’s no reason to think that the quality of grant proposals between 1999 and 2007 has dropped precipitously enough to warrant a stingier NIH. Good scientists are being left high and dry. The agency’s primary research grant—the so-called R01 grant—is generally regarded as the “gold standard” in science: when the government grants an R01 to a project, that research is officially legitimated as important, ground-breaking work. In fact, GCURI claims that “a scientist is not considered established and independent until he or she is awarded an R01, which…enable[s] scientists to hire staff and buy [the] equipment and materials necessary to conduct experiments.” Or, as Dr. Denis Guttridge, Associate Professor at The Ohio State University, puts it: “assistant professors cannot get going in their careers until they get their first R01.” Thus allowing federal grant money for medical research to shrink puts our country at risk of “los[ing] a generation of committed scientists” and the medical breakthroughs that they can provide.

Continue reading

Can the Media Derail Health Care Reform?

By now you’ve probably heard the calls for speedy action on health care reform during the Obama Administration’s first hundred days. Some prominent observers even say that the President-elect should get the ball rolling during “his first days in office.” The possibility of imminent health care reform is certainly exciting, but a word of caution: just because some of us might be ready for health care reform doesn’t mean that the media is ready to cover it properly. And that could have important implications for how reform plays out.

Right now, health care reform is an abstract goal that everyone wants. Excitement and anticipation are high. But as the substantive process of health care reform gets underway, two things will happen: first, ideas will be crafted into policies—concrete plans of action and complex administrative measures—and second, politicians will become involved in the reform process. Policy can get pretty complicated, so the public will rely on the media to help it navigate the ins and outs of the issue. Once politics begins to shape policy discussions—that is, once politicians enter the picture—it’s all the more important to keep the focus on policy, because it’s at this point that policies have a real chance of being implemented. Americans should know their options.

Style Over Substance

Unfortunately, reporters aren’t health care policy experts. In fact, they rarely talk about the issue. In a December report, the Kaiser Family Foundation found that, out of 3,513 health news stories in newspapers, on TV and radio, and online between January 2007 and June 2008, health care policy made up less than one percent of all news stories and just 27.4 percent of health-focused stories. Instead of talking about issues like coverage, prescription drug care, costs, or public programs, the media prefers to report on specific diseases and conditions (cancer, diabetes, obesity, and heart disease) and potential epidemics (contaminated food and water, vaccines, binge drinking). Together, these two topics made up 72.6 percent of health coverage.

This is less than ideal. When Congress begins to talk about health reform in earnest, the important news that will affect all of us will be about policy and institutional changes. The media needs to be good at covering this stuff—yet as the Kaiser report shows, newscasters, reporters, and editors have very little experience (or interest) in discussing such issues. Worse, history shows that when health care reform efforts are actually underway, the media ignores policy in favor of more sensational stories.

Continue reading

Alzheimer’s Disease: The Basics

Earlier this month newspapers reported that Columbo—that is, actor Peter Falk—has Alzheimer’s Disease. Usually, when news breaks that a celebrity is suffering from a serious medical condition, there’s a flurry of coverage discussing the nature of the disease. Hopefully, the pattern will hold in Falk’s sad case—because Alzheimer’s is both a terrifying disease and a greater public health issue than most of us realize.   

Indeed, the incidence of Alzheimer’s disease (AD) is rising. According to the Centers for Disease Control and Prevention, in 2006 Alzheimer’s disease was the sixth-leading cause of death in the U.S., killing 72,914 people. Another startling number: Alzheimer’s as a cause of death has skyrocketed in recent years, increasing by 33 percent between 2000 and 2004.

So What Is It?

A progressive brain disorder, AD literally shrinks the brain, eroding memory and language and undermining basic functions like swallowing, walking, and bladder control. These deficits can lead to other serious problems: an inability to swallow can cause food to be inhaled, which can lead to pneumonia; immobility can lead to painful bedsores prone to infection; and incontinence can also lead to infections.

In other words, Alzheimer’s is a frightening disease that gradually takes over the mind and body. Unfortunately, there is no known cure, and currently no medical test allows us to diagnose the disease with 100 percent certainty—doctors have to examine brain tissue directly, something that in practice happens only at autopsy, to tell for sure that a brain is afflicted with AD.

Further, no one knows for sure what causes Alzheimer’s, though researchers do have some understanding about what happens to the brain during the disease. The culprits are two abnormal structures called plaques and tangles, which together kill nerve cells in the brain. Plaques build up between nerve cells and deposit proteins that impede normal neurological functions; tangles are knots of protein that build up in brain cells and collapse the structures needed to transport vital nutrients across the brain.  

Doctors aren’t entirely sure what causes the growth of plaques and tangles. Genes might play a role, but researchers don’t know just how—or how much—they matter. That’s due in part to the fact that Alzheimer’s, when it’s genetic, is not caused by a single gene, but rather mutations on multiple chromosomes. Sadly, this information is not as useful as it may seem: according to the National Institute of Aging (NIA), less than 10 percent of AD patients have “familial Alzheimer’s”, i.e. a genetically inherited form of the disease. Onset of familial AD is early, before the age of 65.  The other 90+ percent of Alzheimer cases are late-onset (after 65), and according to the NIA, this form of the disease “has no known cause and shows no obvious inheritance pattern.” Researchers have a hunch that genes play some sort of role in late-onset AD, but “only one risk factor gene has been identified so far” and it’s not enough to account for the entire disease.

Continue reading

The Future of Pharma

Over the next few years, drug makers are likely to face many new challenges, including government-approved importation of cheaper drugs, Medicare negotiating for lower prices, stricter regulation of direct-to-consumer advertising, and (hopefully) a more robust FDA under the Obama Administration. With so many changes afoot, Big Pharma will have to evolve or suffer the consequences. Even drug executives see the need for a restructuring of the industry. Earlier this month, the head of pharmaceuticals at Roche told reporters that the “marginally-different-and-market-it-like-hell model [of prescription drugs] is over.” But if that’s true, then what new model will take its place—and will it be any less troubling?

Winds of Change

One of the biggest indicators that change is on the horizon is the fact that spending on prescription drugs isn’t what it used to be. In fact, according to a recent Health Affairs article authored by Murray Aitken of the consulting firm IMS Health, Ernst Berndt of MIT’s Sloan School, and David Cutler of Harvard, sales are beginning to level off. Though “U.S. spending on prescription drugs grew 9.9 percent annually between 1997 and 2007,” since 2003 “growth rates have declined rapidly”—to their slowest since 1974—“and in 2007 spending grew but 1.6 percent” after growing by 8.5 percent in 2006—the first decline in spending growth on record.

A major reason behind the slow-down is that drug makers are simply running out of new drugs to sell. Aitken et al. note that, “according to the FDA, between 1999 and 2001 the average total number of…new product approvals was about thirty-five per year, whereas between 2005 and 2007 this number fell to about twenty.” And as time goes on, newer drugs comprise a smaller share of drug sales: “Products introduced within the prior five years accounted for 34 percent of total drug sales in 1999” but “that share has declined steadily since then, to just 19 percent of total sales in 2007.”

Fewer new drugs mean fewer new patents, which limits drug makers’ ability to keep revenues high through monopolistic pricing. Over time, the value of brand-name drugs on the cusp of losing their patents—and thus becoming vulnerable to competition from cheaper generics—has almost doubled, “from an average of about $9 billion per year between 2002 and 2005 to about $16 billion in 2006-07.” Health Affairs points out that “the list of drugs losing patent protection in recent years has been substantial”: Norvasc (value: $2.6 billion), Lotrel ($1.5 billion), and Flonase ($1.2 billion). Moreover, drugs likely to come off patent protection soon include Cozaar in 2010; Lipitor, Plavix, and Seroquel in 2011; and Diovan, Viagra, and Evista in 2012.

When drug makers lose blockbusters—that is, drugs with sales of $1 billion or more—they take a big hit. A 2004 BusinessWeek article cited a Boston Consulting Group study which estimated that “80% of growth for the 10 biggest drug makers during the last decade came from the eight or so blockbusters a year launched during the 1990s.” Aitken et al. note that “spending on blockbusters increased from about 12 percent of all sales in 1996 to almost half of all sales in 2006, accounting for three-quarters of prescription drug spending growth over the same time period.” Unfortunately for drug companies, blockbusters are on the decline: “in 2007, for the first time, the number of billion-dollar products fell—from fifty-two to forty-eight—and their share of all sales also fell slightly, to 44 percent.” More bad news for pharma: “As more blockbusters go off patent and fewer new ones are developed, the share of sales attributable to blockbuster molecules will likely decline still further.” In other words, drug companies need to find a new cash cow. But where to look?

Toward Specialization

Continue reading

Physician-Assisted Death in the US

Last month, voters in Washington State approved physician-assisted death (PAD) for terminally ill patients by a margin of 58% to 42%, making Washington the second state, after Oregon, to allow the practice. In a recent New England Journal of Medicine article covering this development, Dr. Robert Steinbrook notes that Washington’s “Death with Dignity Act…permits…[adult] state residents…with an illness expected to lead to death within 6 months to request and receive a prescription for a lethal dose of a medication that they may self-administer in order to end their life.”

The law, which will take effect on March 4, 2009, is based closely on the Oregon PAD law, which has been in effect since October 1997. Steinbrook points out that Oregon’s legalization of PAD has had some interesting effects—or rather, non-effects—on the number of patients who have exercised their “right to die.” Between 1998 and 2007, “physicians wrote a total of 541 prescriptions for lethal doses of medications…and 341 people died as a result of taking the medications. Thirteen patients who had received prescriptions were alive at the end of 2007, and the rest of [the 541 people] who received prescriptions ultimately died of their underlying disease.”

These are not huge numbers: 341 people over nine years comes out to about 38 terminally ill people per year seeking to end their lives. In other words, PAD has not turned out to be a slippery slope toward mass suicide. In fact, most Oregonians who sought PAD between ’98 and ’07 belonged to a relatively predictable demographic: they were old (median age of 69), suffering from terminal cancer (81.5%), and were enrolled in hospice programs (86%).
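
For readers who want to check the arithmetic, here is a minimal back-of-the-envelope sketch in Python. It uses only the Oregon figures quoted above (541 prescriptions written, 341 resulting deaths) and follows the text’s framing of 1998 to 2007 as nine years; nothing else is assumed.

```python
# Back-of-the-envelope check on the Oregon PAD figures quoted above.
prescriptions_written = 541
deaths_from_medication = 341

# Share of patients who received a lethal prescription and actually used it.
share_used = deaths_from_medication / prescriptions_written
print(f"Share of prescriptions actually used: {share_used:.0%}")  # roughly 63%

# Averaged over the period (treated as nine years, per the text),
# the annual toll is a few dozen people.
years = 9
print(f"Deaths per year: {deaths_from_medication / years:.0f}")  # about 38
```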

This last point is particularly interesting. Steinbrook suggests that a shift toward hospice care within the medical community may be associated with an increase in PAD because hospice care tends to “address many of the key reasons why patients request assistance in dying — such as loss of autonomy, dignity, and the ability to care for themselves in a home environment.” Certainly a growth in hospice care doesn’t necessarily mean that more patients will seek out PAD. But given what we’ve seen in Oregon—and hospice care’s focus on making patients comfortable with the fact that they are dying—a growth in hospice care could very well put more people in a position to do just that.

Continue reading

WSJ: Don’t Worry About Drug Safety

The Wall Street Journal has some of the best health care reporting of any major newspaper, yet its editorial page is often filled with shrill, misleading nonsense—particularly when it comes to health care. Unfortunately, this week some of the rhetoric of the WSJ’s opinion section seems to have leaked into its reporting: on Tuesday, the paper ran a piece warning that “too much information about drug safety—disseminated through media, online alerts from consumer watchdog groups and even by the Food and Drug Administration itself—might overwhelm patients and raise undue alarm.” Essentially, the article suggests that, when it comes to prescription drugs, the less we know, the better.

The story’s author, Shirley Wang, provides little evidence that Americans are overly concerned about drug safety. Her main support is a Pfizer survey of 300 medical professionals, which “found that 89% of respondents were at least somewhat concerned that patients might stop their medications if potentially negative safety information was released to the public too early.”

I’m not entirely sure why this is news. Of course a drug company is going to release a survey that hints at the dangers of excessive regulation and oversight. And of course doctors are going to be “somewhat concerned” about the science behind drug risks; I’d wager that just as many are “somewhat concerned” about the science behind reputed drug benefits as well. Good doctors will always be concerned about the integrity of data that will affect the behavior and health of their patients. Pfizer’s survey doesn’t tell me anything I don’t already know; nor is it proof that doctors think our health care system actually does release negative safety information too early.

Unfortunately, the rest of Wang’s article is just as speculative. For example, she notes that in 2004 the FDA “required a so-called black-box warning label—the agency's toughest—on antidepressants to caution about the increased risk of suicidal thoughts and behaviors among teenage patients.” Following the re-labeling, “the number of prescriptions for the drugs decreased” and “the rate of teenage suicides went up.” This would be scary except for the fact that it “isn’t clear” whether or not “the higher suicide rate is linked to the lower number of prescriptions.”

Continue reading

Retiree Health Benefits in the Recession

“Companies are concerned about how their balance sheets are going to look after two down quarters last year, the events of Sept. 11 and the big increases we've been seeing in health care costs,” the U.S. Chamber of Commerce’s Kate Sullivan told the New York Times in 2002. Reading the tea leaves, Sullivan pointed out that “employers are looking at any way they can to shave off some of those costs”—and that one of their biggest targets was company-sponsored health benefits for retired employees.

Today, six years later, the article’s headline—“Retiree Health Benefits Dwindle Amid Recession”—should appear in bold type on the front page of the Times. In many ways 2002 was part of the good old days, before another half-dozen years of health care inflation culminating in our current, brutal recession. If businesses felt inclined to spend less on retirees’ health benefits back then, today they are even more likely to cut back.

Indeed, in a recent survey, the Commonwealth Fund found that 53 percent of private employers plan on increasing retirees’ shares of their health care premiums over the next two years. Forty-three percent say they will be increasing cost-sharing for drugs; 19 percent intend to drop retiree benefits for new hires, and 20 percent of companies plan to drop company-sponsored health benefits for active workers or existing Medicare-age retirees. In other words, employers are trying to lighten their load.

Retiree health care is a particularly tempting target because it sits at the intersection of corporate America’s two most expensive benefit categories: retirement and health. In November, the Employee Benefit Research Institute (EBRI) reported that, by 2007, retirement benefits accounted for 47.7 percent of total benefit spending, while health benefits had grown to 42.8 percent. “Other benefits” (unemployment insurance, life insurance, and workers’ compensation) accounted for just 9.5 percent of companies’ benefit expenses.

Continue reading

Why Patients Don’t Use Rating Systems That Compare Health Care Providers

The following tidbit was buried within the Kaiser Daily Health Policy Report a few days ago: “Fewer Patients Using Health Care Provider Quality Ratings Web Sites To Make Decisions.” The headline could just as easily have read: “More Bad News for Consumer-Driven Medicine.”

One of the most persistent dogmas of the consumerist crowd is that patients are eager to comparison shop for health care—and that, if they aren’t doing so today, it’s only because they don’t have the necessary information. Supposedly, if we had more resources like the website Carol.com—which allows providers to list their services in a comparative “marketplace of care”—then consumers would empower themselves with information and make rational choices on the cost and quality of care.

But according to an October survey from Kaiser, people just don’t comparison shop for health care. In fact, only one in seven Americans (14 percent) “say they have seen and used information comparing the quality among different health insurance plans, doctors, or hospitals in the past year.” At the same time, 30 percent of Americans say they came across comparative quality information over the course of this year—which means that less than half of the people who encounter comparative data on health care providers actually use it.
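
Here is a quick sketch of the arithmetic behind that “less than half” claim, assuming (as the Kaiser figures imply) that the 14 percent who used comparative information are a subset of the 30 percent who saw it:

```python
# Rough check on the Kaiser survey figures cited above.
saw_comparative_info = 0.30  # share of Americans who came across comparative quality data
saw_and_used_it = 0.14       # share who both saw the data and used it

# Among people who actually encountered the data, the share who acted on it:
usage_rate_among_viewers = saw_and_used_it / saw_comparative_info
print(f"{usage_rate_among_viewers:.0%} of those who saw comparative data used it")  # ~47%, i.e. less than half
```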

Continue reading
