What’s in a Name: Is Implementation Science Repackaged Process Evaluation?

This article was written by Taren M. Swindle. Dr. Swindle is an Assistant Professor in Family and Preventive Medicine within the College of Medicine at the University of Arkansas for Medical Sciences. Broadly, her research program focuses on understanding and improving health and developmental outcomes for children impacted by poverty. Dr. Swindle has a particular focus on obesity prevention and nutrition promotion for young children in low-income families. Her work to date has centered on the early childcare setting as a key context for obesity prevention and nutrition intervention. She is interested in increasing adoption of evidence-based practices and interventions in community settings such as this through the application of implementation science.

 

In this accompanying commentary, SISN’s Past President, David Pelletier, clarifies and highlights the differences between the NIH definition of implementation science (IS) in US settings and SISN’s broader conceptualization of IS.

“Isn’t implementation science just good process evaluation?” an attendee asks. Nearly every time I (or others) introduce concepts of implementation science to nutrition colleagues, I hear the same question. Indeed, this question was asked at the end of a SISN webinar as recently as December 2019. The consistency and frequency of this question, even as implementation science matures into an established field, demand a clear and documented reply. Most succinctly, the answer is “no.” At greater length: process evaluation is one tool among many that is consistent with a focus on implementation in our intervention work; however, process evaluation does not begin to encompass the aim of implementation science.

Process evaluation “determines whether program activities have been implemented as intended,” according to the United States Centers for Disease Control and Prevention (CDC).1 The CDC also explains that process evaluation answers the “who, what, when, and where” questions of implementation. For example, who was reached? What were the barriers and facilitators to quality implementation? When were intervention sessions delivered? Where did end users engage with the intervention? Process evaluation takes place during the delivery of an intervention and documents whether the program was delivered as designed and reached the intended audience. A quality process evaluation will also provide data on why an intervention did or did not have its intended effects. Process evaluation joins other evaluation tools (e.g., formative evaluation2) in the toolbox for conducting valuable science around interventions.

Implementation science is “the scientific study of methods to promote the systematic uptake of research findings and other evidence-based practices into routine practice, and, hence, to improve the quality and effectiveness of health services.”3 In implementation science, researchers ask questions that focus primarily on implementation outcomes4, such as fidelity, acceptability, feasibility, adoption, and sustainability. Often, implementation scientists want to know which strategies work best to promote these implementation outcomes. To date, implementation science has documented over 70 implementation strategies.5 Implementation studies may compare a single strategy with implementation as usual, test two strategies head to head, or bundle strategies into packages for comparison with single strategies or alternative packages. The selection and targets of implementation strategies are informed by implementation theories, models, and frameworks6 as well as by stakeholder engagement,7 a core value of implementation science.

Effectiveness-implementation hybrid designs8 offer a useful way to understand the distinction between process evaluation and implementation science. Hybrid Type I designs focus primarily on the effectiveness of an intervention; process evaluation is used in these designs to document the implementation process. Hybrid Type II designs give concurrent attention to intervention effectiveness and to testing an implementation strategy, collecting implementation outcome data alongside effectiveness data. Hybrid Type III designs primarily compare the effects of different implementation strategies on implementation outcomes while secondarily collecting health outcome data. Hybrid Type II and III designs move away from pure effectiveness research and toward classical implementation science. The flagship journal for the field, Implementation Science, no longer accepts Hybrid Type I studies for publication, reflecting the field’s emphasis on developing and testing implementation strategies rather than simply documenting the implementation process.

A concrete example will solidify the distinction. A randomized controlled trial that evaluates the effect of a community-based breastfeeding intervention while documenting dosage, fidelity to the intervention, and reach to the targeted population employs process evaluation. However, this study (as described) does not have an implementation science research question. In contrast, a study that compares the effect of supporting the implementers of the breastfeeding intervention with reminders and audit-and-feedback reports (Group A) versus online training and incentives for implementer fidelity (Group B) is an implementation science study. The research question in the latter example focuses on comparing implementation outcomes across the two conditions to inform future implementation and scaling of the intervention. As these examples illustrate, the key distinction is that process evaluation alone documents the implementation process, whereas implementation science rigorously tests structured manipulations of that process rather than simply recording it.

As the intersection of nutrition research and implementation science appropriately grows, the distinction between process evaluation and implementation science will become increasingly important. Nutrition researchers can build on a familiar concept and skill set in process evaluation to ask implementation science questions about which strategies best promote desired implementation outcomes. When the inevitable question comes from the audience, may we all be equipped to describe the distinction between process evaluation and implementation science.

References

  1. Centers for Disease Control and Prevention. Types of Evaluation. 2014. http://www.cdc.gov/std/program/pupestd.htm. Accessed February 5, 2020.
  2. Stetler CB, Legro MW, Wallace CM, et al. The role of formative evaluation in implementation research and the QUERI experience. J Gen Intern Med. 2006;21 Suppl 2:S1-8. doi:10.1111/j.1525-1497.2006.00355.x
  3. Eccles MP, Mittman BS. Welcome to Implementation Science. Implement Sci. 2006;1(1):1. doi:10.1186/1748-5908-1-1
  4. Proctor E, Silmere H, Raghavan R, et al. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health Ment Health Serv Res. 2011;38(2):65-76. doi:10.1007/s10488-010-0319-7
  5. Powell BJ, Waltz TJ, Chinman MJ, et al. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci. 2015;10:21. doi:10.1186/s13012-015-0209-1
  6. Nilsen P. Making sense of implementation theories, models and frameworks. Implement Sci. 2015;10(1):53. doi:10.1186/s13012-015-0242-0
  7. Holt CL, Chambers DA. Opportunities and challenges in conducting community-engaged dissemination/implementation research. Transl Behav Med. 2017;7(3):389-392. doi:10.1007/s13142-017-0520-2
  8. Curran G, Bauer M, Mittman B, Pyne J, Stetler C. Effectiveness-implementation hybrid designs: combining elements of clinical effectiveness and implementation research to enhance public health impact. Med Care. 2012;50(3):217-226. doi:10.1097/MLR.0b013e3182408812

 

We thank Taren Swindle for sharing her expertise in this blog post. To learn more about her work on implementation in nutrition, follow her on Twitter @taren_swindle.

Disclaimer: The views, information or opinions expressed in the article are those of the individual contributors and do not necessarily reflect those of SISN or any other agency, organization, employer or company.

Have an idea or a comment on any of the issues discussed above? We welcome your feedback – you can comment on this post on our LinkedIn feed or write to us at info@implementnutrition.org.