Applying Research Evidence to Support Complex Care Program Design: Seven Lessons from the Field

Allison Hamblin and Logan Kelly, Center for Health Care Strategies

Many health care and social service organizations have launched or are considering programs focused on providing care and non-medical support for people with complex health and social needs. Yet, while there is increasing research available on the impact of these programs, it can be challenging to understand how to apply this evidence to real-world environments.

Studying the effectiveness of these programs poses different and sometimes harder challenges than studying the impact of a pure clinical intervention. There may be conflicting studies on some types of complex care interventions, and few or no formal studies on others.

Through the Better Care Playbook, the Center for Health Care Strategies (CHCS) recently brought together a virtual panel of innovators in the field to discuss the challenges in applying evidence when designing and refining complex care programs. The virtual panel was presented in October 2021 at the annual “Putting Care at the Center” conference hosted by the Camden Coalition’s National Center for Complex Health and Social Needs.

The panel discussion featured: Parinda Khatri, PhD, Chief Clinical Officer, Cherokee Health Systems; Damon Francis, MD, Medical Director, Homeless Health Center at Alameda Health System, Chief Clinical Officer, Health Leads; and Michelle Wong, MPH, MPP, Director, Care Management Institute, Kaiser Permanente. The three panelists shared lessons and challenges in how they have translated research findings into their decision making on program design and improvements.

Following are key themes that emerged from their discussion:

1. Many interventions may be adaptable to new populations or settings.

Complex care program stakeholders can use the PICO framework (population, intervention, comparison, outcome) to help identify how each of those elements in the research compares with the relevant program of interest. Another helpful approach is to conduct a literature search and then identify the common elements that are consistent across studies. These approaches aim to clarify what matters most: for example, which intervention components are most critical, and which population is the focus.

However, complex care is more than the sum of these component parts. As Wong described, “in complex care, the secret sauce is in the relationships, patient-centered care planning, and partnership with patients and families.”

While there might not be direct evidence for a particular program idea, there may be evidence supporting a similar intervention for a different population. For instance, the Homeless Health Center at Alameda Health System in Oakland, California, is considering a drop-in primary care program for people with high levels of health-related social needs. It found substantial research support for the drop-in primary care model as applied to people with HIV. Those findings may be salient even though the population and the studied outcome in the HIV programs (viral load suppression) differ from what the Homeless Health Center would be targeting, said Francis.

2. Involve patients/consumers from the start in designing, overseeing, and evaluating programs.

It’s crucial to design the intervention with strong input from consumers of the program, because they bring ideas, based on their lived experience, that program leadership may never have considered. To develop patient-centered processes and systems, health care organizations must meaningfully engage consumers. Federally qualified health centers offer one model of including consumers in organizational governance: consumers must make up a majority of these organizations’ boards of directors. “The consumer is the most underutilized resource in health care, and they need to be engaged at every level, including as citizen scientists,” Khatri said.

Engaging diverse consumers requires being willing to accept disagreement and conflict in designing and evaluating an intervention. “Are you doing this for marketing purposes, or do you really want to learn?” Wong asked. “If you’re looking to learn, you really need to embrace friction.”

3. When designing interventions, consider lessons from both research-based and practice-based evidence.

While the gold standard of evidence is often considered to be a randomized controlled trial, there are fewer examples of such rigorous, peer-reviewed studies in complex care than in other fields. When designing programs, stakeholders can access this still nascent but growing body of research-based evidence published in journals, as well as evidence generated through experience and practice at other sites and in related programs. Understanding the experience of others who have tried a similar approach can provide significant value and insights. Stakeholders can benefit from reading reports, presentations, or case studies, or from learning directly through informal calls or site visits to see a program. Both research-based and practice-based evidence are available on the Playbook.

Francis noted how these different types of evidence can have different uses. “In my experience, research evidence is typically more valuable for selling a project to a supervisor or funder. Practice-based evidence tends to be more valuable for adaptation and implementation. But both are valuable.”

4. Programs should partner with researchers to support continuous learning about what works.

Researchers may be accustomed to more traditional experimental designs, but that may not work in studying complex care interventions. “It’s not as easy as looking at a biomarker and then titrating a medication and examining the impact on morbidity and mortality,” said Wong. “Researchers and practitioners can partner together to advance our learning toolkit to help guide practitioners, funders and policymakers.” With complex care interventions, it often takes longer to see the outcomes. “One thing we can do is have both the practitioners and researchers be clear about the desired outcomes and timeline at the beginning, so that we can create metrics for evaluation along the way and show the value of these programs in a much more precise way,” Wong added.

There also needs to be more detailed description in studies of complex care interventions about resource limitations, such as constraints on funding, physical space, information technology, and staff flexibility, for example due to labor union contracts. Those “practical realities” can be “really, really critical for implementation,” Francis said.

5. Take adequate time to train staff in program implementation and data collection.

There is always pressure to launch a program as quickly as possible, with the thought that staff can just be put out in the field to start offering services. But it’s crucial to train them in what data to gather and how to capture the information. Having the right data is necessary for both understanding the interventions that were implemented as well as the outcomes that were achieved. “We’ve implemented some beautiful interventions and realized that we never really clarified where people will be entering the data,” Khatri said. “Put in the work beforehand so your staff knows exactly what data you want.”

6. Factor in longstanding racial and social inequities when evaluating the impact of an intervention.

When evaluating the measurable impact of a complex care program over one to two years, or even five years, everyone must understand that’s a relatively short timeframe in relation to a very long history of structural inequities. “We have to put the interventions in the context of that history,” Francis said. “Five years of perfect implementation is not the fundamental approach to dealing with that reality. We have to lift up our voices as complex care consumers, practitioners, and researchers in policy conversations, and use our data to drive these larger cultural and political and economic issues that are the major contributors to inequity.”

7. Don’t let perfect be the enemy of good in launching an intervention.

A complex care intervention is, well, complex. It won’t be perfect when you plan it. Don’t delay too long in getting started, because rapid-cycle evaluations and refinements can always be implemented. “If we’re looking for perfect and clean, we’re not in the right field,” Khatri said. “So just move forward, learn, and give yourself and others grace. This is deep and incredibly meaningful work.”

Developing a strong evidence base for complex care interventions is key, because policymakers and funders are more open to supporting these programs in the era of value-based care. “We need to build the evidence base to really be ready to implement programs at scale, with a how-to guide for practitioners,” Wong said. “We need to articulate to funders the core elements of the work and success so that we can make the value case for sustaining the work in the future.”

For the latest in complex care promising practices and emerging evidence, visit the Better Care Playbook, an online resource center coordinated by CHCS and supported by Arnold Ventures, the Commonwealth Fund, the John A. Hartford Foundation, the Milbank Memorial Fund, the Peterson Center on Healthcare, the Robert Wood Johnson Foundation, and the SCAN Foundation.