When looking to examine Patient Centered Medical Home data (click here for my overview of PCMHs), there’s no better place to start than the 26-month evaluation of the multi-site National Demonstration Project. Begun in 2006, with results published in 2010, this federally funded project included an array of repeated cross-sectional surveys and medical record audits at baseline, 9 months, and 26 months, using patients from 36 family practices that were randomized into two groups: practices that received facilitation to become PCMHs and practices that were self-directed in their PCMH adoption. After all this money and time, the study found “no significant differences between groups” and “no improvements in patient-rated outcomes.” In fact, the only change found was a minor improvement in scores on surveys in which providers rated their own care. Or, as the conclusion notes, “After slightly more than 2 years, implementation of PCMH components, whether by facilitation or practice self-direction, was associated with small improvements in condition-specific quality of care but not patient experience.”
New York state’s Medicaid program, which includes over 5 million enrollees and over 13,000 primary care physicians, markedly expanded payment for PCMHs, beginning in 2010. The summary report from April 2013 is, in many ways, both typical and chilling. After a glowing introduction and overview, including a background section that emphasizes the potential of PCMHs (without discussing the evidence, or lack thereof), the document has some notable findings. Three years into the program, despite an impressive amount of incentive money, only 34% of primary care providers were willing to become PCMHs. The graph on page 9 shows the growth in participation flattening out after one year. By mid-2012, 38% of New York state’s 5 million Medicaid patients were part of the project through their doctor’s participation.
Between 2010 and 2012, New York spent an estimated $398,947,964 in taxpayer dollars (which includes a $250 million lump sum payment to hospitals and training centers) – money designated for the care of New York’s poorest, and often most medically fragile and disabled, patients – to implement this certification bureaucracy (I’ll address the costs of programs like this in Medical Homes Part III: The True Cost). But for all the money and effort spent on New York’s PCMH bureaucracy, starting on page 14 of the report are some very underwhelming results. Most striking is the difference between the generally glowing written summary in favor of PCMHs (e.g., “These analyses show that PCMH practices have higher rates of quality performance, as defined by national standardized measures, than non-PCMH practices for a majority of measures after controlling for differences in enrollee case mix.”), and the actual reported numbers. Only minor differences are noted, with simple adult and pediatric BMI measurements being the rare, and largest, difference favoring PCMHs. In many cases, non-PCMH providers performed as well as PCMH providers in preventive services, and outperformed them by a wide margin in “avoidance of antibiotics therapy in adults with acute bronchitis.”
Conspicuously, data show that patients in PCMHs had higher rates of ED visits as well as higher rates of both overall and preventable admissions to the hospital. Here’s how the shocking results are described in the discussion: “The utilization results, however, while preliminary, do not at this time show changes in the expected, or desired reductions in ER visits or inpatient stays.” It’s unclear why utilization results are deemed preliminary, but the outcome data are not. Nor is it clear why “at this time” is a pertinent modifier for only these poor results. Patient satisfaction results, a core component of evaluating patient-centeredness, are also notably missing, explained away by “as surveys are based on a sample of enrollees, 65% of whom do not respond, there is often not enough data to draw meaningful conclusions.”
In terms of other large PCMH projects, the Safety Net Medical Home Initiative was one of the largest nationwide PCMH implementation programs. Evaluation of it showed that staff resistance and turnover were obstacles. A more in-depth look at 5 safety net clinics in New Orleans showed more pressing problems than PCMH status, including “a need to focus on clinic finances.”
One notable study shows a marked benefit from PCMH implementation in a military center, with a 7% improvement in access to care, a whopping 75% decrease in ED utilization, and increased staff satisfaction scores after two years. Despite this, a larger VA study found more challenges and variability in just getting programs implemented, particularly around issues of open access to care. These conflicting results at clinics that operate within a military culture point to the difficulty of creating a standardized, successful PCMH implementation.
An overview of published PCMH studies generally finds the data tending toward positive results, but plagued by “methodological and measurement issues,” a sentiment echoed by another review of the evidence. Drilling down, a large study of 58,391 patients seen at one of 22 medical groups between 2005 and 2009 found that any cost savings were limited to only the most medically complex patients, with some net increases in utilization and costs among other groups. A study of 27 Minnesota-based medical groups found little to no overall correlation between PCMH status and diabetes care costs. Two Group Health Cooperative reports, from a study that is frequently touted as demonstrating “the potential” of PCMHs, when viewed critically, actually showed spotty improvements in some patient experience scores, no significant differences in outcomes, and no difference in overall costs with PCMHs, even for seniors.
Michigan researchers reported on the complexity and difficulty of just creating tools to measure PCMH assumptions — “13 functional domains with 128 capabilities within those domains.” Despite the fact that the whole goal of PCMH is standardization, they dispiritedly conclude: “a one-size-fits-all approach may not be appropriate.” PCMH standardization, tools and accreditation also do not, apparently, obliterate racial/ethnic differences in care. A study of 1,457 adults receiving care from 89 medical providers within a PCMH-designated practice documented “racial differences in [both] processes and intermediate outcomes of diabetes care…” Another group concludes that after 15 years of NIH-funded projects “primary care transformation is hard work.”
In the face of such underwhelming results, after all the money and effort invested in a bureaucracy that does not contribute to actual patient care, several proponents (including the authors of the New York State summary) suggest that PCMHs just need a little more time to get established. Geisinger Health System touts its proprietary “advanced model” of PCMH. But even they state that, when it comes to better patient outcomes “there is only limited evidence regarding the ability of PCMHs to achieve this goal.” In terms of cost, their analysis of their own product shows “a longer period of … exposure was significantly associated with lower total cost.” However, their calculated return on investment was grimly low, with a large confidence interval range.
Finally, and most importantly, where is the patient experience in all of this? If PCMHs do not clearly improve outcomes and cannot be consistently shown to decrease cost (and may actually increase preventable cost), do they at least make things more patient centered? Unfortunately, the answer appears to be no. As the evaluators of the 2-year National Demonstration Project put it, “highly motivated practices can implement many components of the PCMH in 2 years, but apparently at a cost of diminishing the patient’s experience of care.”
Stay tuned for Medical Homes: Part III, The True Cost, to find out just how much of our health care dollars have been spent on this burgeoning bureaucracy.