A Closer Look at Hospital Competition: Hospital Quality Becomes Increasingly Important
News/Features - Health
Wednesday, 19 September 2007 02:57
Two parallel movements are making hospitals more accountable in terms of their processes and outcomes: an orientation toward consumers, and an increasing emphasis on quality by the organizations that pay for health care - particularly the federal government.
"We're in kind of a new age," said Dr. Tom Evans, president and CEO of the Iowa Healthcare Collaborative, an organization formed by the Iowa Hospital Association and the Iowa Medical Society. "Transparency wasn't really even germane until recently. We couldn't even define what good health care was until 15 years ago."
That's changed, and information about hospital performance is now widely available.
"We are in a better position today to have better data - comparable data - because we are all expected to comply with certain rules about how you collect the data ... ," said Kathleen C. Cunningham, vice president of operations for Trinity Regional Health System. Health-care consumers "can compare us a lot better."
But the general public should be cautious approaching those statistics and ratings. This is a developing field, and "some of it [the information] is very good, and some of it is very green," Evans said.
And even when the data is meaningful - when it's been independently validated and scientifically correlated to better health-care outcomes - it should be considered with some caveats.
Collection and reporting might not be uniform, for one thing.
And health-care rating organizations evaluate the information differently. "There are different folks drawing different conclusions," Evans said. "They have their own ‘special sauce.' You'll draw conclusions that don't necessarily exist."
That concern was supported by a report in the September 2007 issue of Archives of Surgery. According to a press release, "A review of six publicly available hospital-comparison Web sites suggests that they display inconsistent results and use inappropriate or incomplete standards to measure quality."
Furthermore, Evans said, we're in a transitional period in health care, moving from a situation with "woefully inadequate data to this mound of data where we don't even know where to begin."
"I think the consumers need to have some further education to help them understand and interpret the data," said Dr. James A. Lehman, vice president of quality for Genesis Health System.
But there's one conclusion that's beyond doubt. This new information age in health care is "all good news," Evans said. Hospitals want to look good to patients, doctors, governments, and insurers, which "should drive the level of care up."
In next week's article, we'll scrutinize Genesis and Trinity on health-care-performance measures. But before we delve into that discussion, we'll look at the environment that has spawned this movement toward quality; the challenges of measuring it; and the ways that it doesn't matter.
A Wealth of Information
Look around the Web, and you'll find a trove of apparently straightforward ratings and statistics.
If you do a Web search for "compare hospitals," you'll run across Quality Check (http://www.qualitycheck.org), a Web site of the Joint Commission (formerly known as the Joint Commission on Accreditation of Healthcare Organizations). There you'll be able to find relatively accessible measures of each hospital's performance in key areas, such as heart-attack care, pneumonia care, and infection prevention. A plus, minus, or check mark designates how each hospital did compared to the nation and its state.
You'll also find the U.S. Department of Health & Human Services' Hospital Compare (http://www.hospitalcompare.hhs.gov), at which you'll see the same areas with more context. The site tells you, for example, that Trinity's and Genesis' hospitals in the Quad Cities area did better than the state and national averages in terms of giving heart-attack patients aspirin upon arrival. The national average was 92 percent, the Illinois average was 93 percent, and the Iowa average was 89 percent. Genesis' Illini campus scored 94 percent, and its Davenport facilities 100 percent. Trinity's Terrace Park scored 97 percent, while its Illinois hospitals were at 96 percent.
It's reassuring that the information is available and that Quad Cities hospitals scored well. But is that measure significant? And why should you care about heart-attack care if you need to get a knee replaced?
Those questions point to some of the issues in assessing quality and distributing statistics about it.
The information on Quality Check is presented simply and is easy to grasp, but it's not clear what's being measured. Hospital Compare, on the other hand, is explicit about the criteria and the comparison, but it's so specific that its relevance is uncertain.
There has to be the proper balance between clarity and context. For the information to be meaningful to consumers, it has to be presented in a way that they can understand but also with the background information related to its importance.
"Communicating to the public is difficult in the sense that the clinical outcomes that you are looking at are statistically valid, they're based on scientific study," said Bill Leaver, president and CEO of Trinity Regional Health System. "Most people, after I say those two things, their eyes glaze over and they don't hear anymore. So I think it's more challenging to demonstrate and educate the public about those clinical outcomes."
This is a relatively new arena for hospital competition. Hospitals need to educate the public about quality because consumers are increasingly going to be participants in the health-care decision-making process.
"Hospitals have to be able to differentiate themselves" to both consumers and doctors, said Dr. Morris Seligman, vice president of medical affairs for Trinity.
"People will not blindly accept someone saying, ‘Well, you must go here,'" Leaver said. "They're going to do their independent evaluation. ...
"You're already seeing the federal government is starting to demand clinical-outcome data, which they're publishing," he continued. "They believe, clearly, that ... people will use that data to drive decisions."
Better Information Equals Better Care
The quality movement in hospital care has been spearheaded by Medicare, which began publishing data in 2005 on the Hospital Compare Web site. Much of the information you'll find at hospital-comparison Web sites is drawn from data collected by the federal Centers for Medicare & Medicaid Services.
"Those are primarily process indicators," Lehman said. For instance, hospitals report the percentage of pneumonia patients assessed and given a flu vaccine. "They're based on good, scientific studies that show that people who get these interventions do better than people who don't," Lehman explained.
Consumers might be surprised about the limited areas Hospital Compare addresses (heart attack, heart failure, pneumonia, surgical-care improvement), how few indicators there are in those areas (22), and that they largely don't address outcomes. That's a reflection of the difficulty of establishing measures.
"For a number of years, hospitals and payers and regulatory agencies ... have really struggled to find measures of both process and outcome that everybody could agree on were valid indicators of quality for organizations ... ," Lehman said. "What would seem to be relatively straightforward as what is your ... hospital-acquired infection rate to a large extent depended on how an organization defined what an infection is, what their surveillance programs are, ... and how vigorously they were looking for them.
"And so an infection rate for one hospital and an infection rate for another hospital weren't strictly comparable because of the differences in not only those issues, but many organizations will claim rightly or wrongly that they're taking care of a different group of patients, and their patients are more complicated ... . Is it really a fair comparison?"
The indicators themselves are a bit crude. Each indicator is a component of good care for each condition, but it's not the sum of good care.
"People are figuring out how to meet these indicators without really improving care," Lehman said. "So there's that concern that's sort of lurking there: Are we really getting at improving the quality of care, or are we just getting into a mode where we know how to meet those indicators, and it's not resulting in improvement?
"Certainly there's more to the care than just these indicators," Lehman continued. "While these indicators are evidence-based, they're not the whole story."
The measures are evolving and being refined. Under its Premier Hospital Quality Incentive Demonstration in five areas (heart attack, heart failure, pneumonia, coronary artery bypass graft, and hip and knee replacement), Medicare "rolled up" related indicators in each area for a composite score, and rewarded hospitals that performed well with financial bonuses.
After two years of the pay-for-performance demonstration project, participating hospitals improved significantly in each area, from almost 7 percentage points with heart attacks to almost 18 percentage points with heart failure.
"Their initial review of this is that this program has been successful," Lehman said, "and that by putting incentives into the payment system for good performance on these indicators, that they feel like they've been able to create improvement in health care throughout the country. So they want to see this expand."
This is expected to be a model for future payment systems, in which health-care providers are paid more money when they adhere to groups of interventions shown to improve health-care outcomes.
"That particular way to look at data is the second evolution for Medicare," Cunningham said. That data isn't made public presently, but "I think it will be public record as time goes on," she added.
And that could correct the situation Lehman described, in which individual measures are valued over comprehensive care.
"We could be meeting three of four measures and look very good," Cunningham said. "But if we never meet that fourth measure, are we really delivering the care to the patient that the patient deserves?"
Put more concretely, the individual indicators tell you if a heart-attack patient got aspirin when he or she arrived, but not how quickly he or she was put in a cath lab. As the indicator systems become more sophisticated, they will become better measures.
"It's not whether they got these episodic things, ... [but] did they get everything they should have gotten based on the medical evidence for that particular diagnosis?" Seligman said.
Flaws in the System
So the good news for consumers is that increasing transparency is likely to lead to better health care.
The bad news is that health-care data is still far from perfect.
"A lot of work has been done to get to the very few indicators right now that everybody is defining the same, collecting in the same way, reporting similarly ... ," Lehman said, and that illustrates the difficulty in crafting indicators.
First you need to find consensus on a measure that's meaningfully correlated to an improved outcome. Then you need to agree on definitions and collection methods. Then the collection and reporting need to be uniform within and across hospitals.
That's harder than it sounds, said the Iowa Healthcare Collaborative's Evans, because hospitals' "primary job is to deliver care," not report data.
Concerns about organizational cultures, definitions, patient populations, and detection and reporting systems prompted the National Coordinating Council for Medication Error Reporting & Prevention to declare in 2002 that "use of medication error rates to compare health-care organizations is of no value."
If a patient is given the wrong dosage of a drug three times, Lehman asked, is that one or three errors? "Hospitals we found are counting those differently," he said.
"You don't know with one entity compared to another entity ... how they're putting their data together, even though ... they are to meet a certain guideline or rule of how their data is put together," Seligman said.
Some things might appear cut-and-dried. For instance, the Health Grades Web site (http://www.healthgrades.org) offers mortality information for heart-attack patients - a star rating for in-hospital mortality, mortality 30 days after hospital stay, and mortality 180 days after hospital stay.
Again, there's a simplicity to the system - as Seligman said, "People understand stars" - and an authority to the numbers behind them, which compare the expected mortality rate to the actual rate.
But "what you may not know is where that data comes from," Seligman said.
You also might not recognize the factors that might skew it one way or another. The mortality numbers are "risk-adjusted," meaning that the expected rate is based on the risk associated with the treated patient population. A healthy adult who has a heart attack has a higher chance of survival than somebody who comes in with a host of medical conditions, and that's reflected in the expectations.
The problem, Seligman said, is that hospital staff might not include enough information on the patient chart to make that distinction, in which case the patient is assumed to be low-risk for tracking purposes.
"The data that would go out to the consumer can be very misleading," Seligman said. "You can make someone look a lot worse than what they really are, or vice versa."
But that's a reporting and collection challenge, and something hospitals have under their control. Systems in which data is publicly available - or where payment is tied to performance - punish hospitals whose statistics are inaccurate because of inadequate or incomplete data collection. That should be a significant incentive to correct reporting deficiencies.
Yet even uniformly collected information isn't ideal. The current monitoring system doesn't give a sense of long-term outcomes. "If they do great, we may not see them again," Lehman said. "And if they die, we may not know that either.
"You may have a number that looks good up-front - they got them out of the hospital - but how did they do down the road? ... What about the people that don't die? How do they do once they leave the hospital?"
When Quality Doesn't Matter
In a perfect world, quality would be a critical factor with health-care consumers. The reality is a little more complicated.
First, there's the issue of comprehension. There's a lot of information out there, and a lot more coming, but do consumers understand it?
"Health care is so complex, and the language is so esoteric," said Bob Travis, vice president of strategic development for Genesis. "How do you comparison shop? How do you know what questions to ask? ... Those are hard for John Q. Public to figure out. They're hard for physicians to figure out, because it's an art as much as it is a science. And there's argument within the disciplines of what's the best way to treat somebody."
"There's always information behind the numbers that unless you've had some actual training in looking at the numbers, the average consumer may not be well-prepared to understand some of the nuances between the numbers ... ," Lehman said.
There's also the issue of a competitive system that's not terribly competitive from a consumer perspective. People aren't necessarily choosing between Genesis and Trinity when they need care at a hospital, and they almost certainly aren't making informed decisions.
Consider a typical situation. A primary-care physician tells a patient that he or she needs a medical procedure. That patient, in most cases, isn't looking at outcome data from each hospital to determine where the procedure will be done. Instead, the primary-care physician will refer the patient to a preferred specialist, and that specialist might have a hospital he or she prefers. If the patient has insurance, the insurance company might further limit options.
"The physicians are the one ultimately bringing in the patients," Seligman said.
And critical care is almost exclusively a function of proximity.
"People don't want to think about their heart 'til they have a heart attack," Travis said. "You're not interested in information until you need it. ... And you don't have much time to do comparison shopping."
So it's incumbent on physicians to base their choices to some degree on the quality of health care provided.
"Quality is a piece of the decision," Lehman said, but only one of several factors. Other components - such as relationships and insurance coverage - might or might not be based on some measure of quality.
"Frequently it's the physician that decides which hospital to use," Travis said. "And so much of our competition is with the physicians, to convince the physicians that we're better than Trinity. And Trinity, of course, tries to convince that physicians that they're better than Genesis."
"I think doctors do have a sense of who takes care of patients and who has the better outcomes," Lehman said, "but I think there's more to those decisions than that one element."
Next week's article will look at Trinity and Genesis head-to-head on quality measures.