Implementation science: a primer

The aim of a primer is to be an introductory but authoritative read (to give the reader a current understanding of the subject at hand, together with references directing them elsewhere for specific information).1 After reading a primer, you should understand the basic building blocks of the subject and be able to identify any gaps for future reading or activity.

Developing a primer on implementation science, and highlighting opportunities for collaborative academic–clinical partnership organizations such as the JBI Collaboration—all condensed into editorial format—is an imposing task. The field is still relatively new, marked by both interdisciplinarity and a lack of consensus on terminology. As such, this primer can only reflect our interpretation of the current state of the field.

The raison d’être for those involved in implementation science is the acknowledged, persistent failure to reliably implement interventions and practices known to be effective, and the risk that patients will continue to receive unnecessary and/or harmful care.2,3 We have 50 years of health services research showing unintended variations and/or gaps in care; by now, we expect overuse, underuse, and misuse. New evidence is slow to be reliably normalized, and inequities in the implementation of evidence-based practices abound.4

Fortunately, much progress has been made in understanding why this remains a tenacious or wicked problem. In particular, implementation science offers frameworks, models, and theories to understand potentially modifiable barriers to the implementation of evidence-based practice.5 Theories, models, and frameworks provide the architecture and guide implementation decision-making. Nilsen (2015) classified theories, models, and frameworks into three typologies that (1) describe and/or guide the process of translating research into practice; (2) understand and/or explain what influences implementation outcomes; and (3) evaluate implementation, including for sustainability.5

For too long, the implementation of changes into practice was envisioned as a linear process. New evidence would find its way into a guideline, and the guideline recommendations would find their way into practice. But the idea that if we simply tell clinicians (or patients) what to do, it will get done is probably false.6 In primary care in particular, the issue is more often one of “knowing how” rather than “knowing what” needs to be done. It is often an issue of praxis, and implementation science increasingly involves theories, models, and frameworks that recognize the complexity in the interactions between human behaviours and the structures and processes of care, including their resourcing (or lack thereof).7,8

There are multiple overlapping definitions of implementation science. The journal Implementation Science defines it as “… the scientific study of methods to promote systematic uptake of research findings and other evidence-based practices into routine practice, and thereby improve the quality and effectiveness of health services and care.”9 Studying mechanisms of action for the integration of research knowledge, routine practice, health services, and care delivery is a key feature of implementation science, highlighting its interdisciplinary nature.10 It leans heavily on the behavioral and social sciences, including, but not limited to, psychology (cognitive, industrial, health), sociology, engineering (human factors, design thinking), management (leadership, organizational behavior), and other fields. Implementation science is distinct from, but related to, the field of quality improvement.10,11 Those involved solely in the work of attempting to implement evidence-based practice (whether following systematic approaches or otherwise) may be thought of as implementation practitioners or, perhaps, quality improvement leaders. The goal of implementation scientists is to advance methods that make their job easier… or even unnecessary.

Implementation science includes rigorous and large-scale intervention development, evaluation methods, economic studies, and a large body of theory-driven inquiry. In other words, implementation science draws on methods used in other fields, including health services research. However, the focus of implementation science is “downstream”: it concentrates on practices that are already established as the desired thing to do, and on the actors (often a health professional or team rather than just a patient) and systems or organizations capable (or potentially capable) of making those practices routine.10,11 Increasingly, “hybrid” evaluations are utilized; that is, evaluations that simultaneously examine both the outcomes related to the effectiveness of the practice itself and the outcomes related to implementation. Methods such as stepped-wedge trials, hybrid designs, or qualitative methods can integrate theories, models, or frameworks to inform implementation and evaluation.12

The implementation science literature includes a substantial body of evidence on what “could” work. In comparison, there is a relative lack of progress in understanding what sort of implementation interventions work best, in which situation, and why.12 Implementation scientists are also increasingly turning their attention toward trying to address spread, scale, and sustainability, including issues related to fidelity and/or adaptation of interventions across contexts and over time. In each of these areas, implementation scientists are trying to understand how to reverse the inverse care law and address inequities.13 Addressing these areas will require active partnerships between implementation scientists and organizations capable of conducting large-scale implementation interventions (sometimes known as implementation science laboratories).14 Each of these areas represents exciting opportunities for groups partnered with JBI.

The potential applicability and capacity of globally collaborative groups such as JBI in this field are substantial. The JBI conceptual model for evidence-based healthcare (EBHC) is a meta-model that describes the uptake of diverse types of evidence in response to global health needs through research methods that are reliably synthesized, leading to evidence-informed recommendations.15 In this meta-model, recommendations from synthesis are necessarily transferred to enable implementation. The model is an evidence-centric description of how research becomes implementable; however, the methods and mechanisms of action for implementation are not described. The JBI EBHC model illustrates that diverse types of evidence, systematically evaluated and collated, are needed to respond to global health problems. It also demonstrates that cycles of research, synthesis, and transfer produce results and recommendations suitable for implementation. A seven-phase action cycle takes the high-level constructs from the model and operationalizes them through descriptive processes.16 The seven phases are: (1) identification of a practice area for change; (2) engagement of change agents; (3) assessment of context and readiness to change (i.e., situational analysis); (4) review of practice (i.e., baseline audit) against evidence-based audit criteria; (5) implementation of changes to practice; (6) re-assessment of practice using a follow-up audit; and (7) consideration of the sustainability of practice changes. These seven phases are conducted within three higher-order domains of activity: (1) pre-planning, (2) baseline assessment and implementation planning, and (3) impact evaluation and sustainability.16

This integration between models (or models and frameworks) presents unique research opportunities to investigate questions related to fidelity of implementation, impact evaluation of embedding process models in implementation, and deepening our understanding of what influences implementation outcomes. It also provides the opportunity to evaluate stakeholder perspectives, outcomes, and the sustainability of implementation, informed by a multi-modal perspective. The use, role, and impact of integrating the widely available theories, models, and frameworks within JBI implementation methods constitute a rich field of inquiry, yet remain chronically understudied. Major gaps in knowledge include the potential benefits of integrating process models, determinant frameworks, classical theory, implementation theories, and evaluation frameworks.

Theories, models, and frameworks (along with the question being asked) guide the choice of implementation research method. In developing a reporting guideline, Pinnock and colleagues described the types of research linked with implementation.17 The domains of research include exploration, adaptation, preparation, feasibility and piloting, along with implementation and sustainability studies. The overlap with theories, models, and frameworks is clear, and represents an area of significant opportunity to partner or collaborate with the JBI Collaboration and related entities. In the description of Pinnock and colleagues, implementation is not categorized by method or methodology, but by domains that can be investigated for their alignment with the domains in relevant theories, models, and frameworks.17 These domains cover exploratory/investigative research, the contextualization of implementation strategies and resources from one setting or culture into another, and, lastly, practicability.

Quality improvement designs form a substantive body of evidence related to implementation, and appropriately so, as healthcare practice and practitioners are primarily concerned with the delivery of safe, quality care.18 In JBI Evidence Implementation, the majority of published quality improvement studies use an uncontrolled before-and-after design with an evidence-based intervention phase of 6–12 months in between. These papers are primarily concerned with evaluating the extent to which structures and processes of care (within the formal Donabedian use of those terms) comply with best practice evidence. Alongside quality improvement inquiry, there are significant opportunities for academic and clinical partners to conduct parallel studies using ethnography, case studies, qualitative comparative analysis, network analysis, action research, or realist evaluation. These complementary, context-rich methods can be used to study the sociocultural aspects of organizational implementation. In other words, quality improvement activities can be the “laboratory” for implementation science… and implementation science can help inform quality improvement activities by, for example, identifying best practices for interventions or clarifying mechanisms of action.14

JBI has a global reach, established clinical partnerships, and extensive, programmatic approaches to supporting and sustaining implementation, but these have not had sufficient empirical investigation. JBI has the models, methods, and mechanisms of action to challenge and change the status quo. It is time to establish JBI's place, contributing to the science (as well as the practice) of closing the gap. As an unknown author has stated, “although science exerts a powerful influence on health, and health is the explicit or assumed raison d’être of a substantial part of science, overall, the connection between the two cultures is weak.”

Acknowledgments

This work was conceptualized by CSL and NMI and has not been submitted or published elsewhere.

References

1. So, what is a primer? Nat Rev Dis Primers 2020; 6:34.
2. Grol R. Successes and failures in the implementation of evidence-based guidelines for clinical practice. Med Care 2001; 39(8 Suppl 2):II46–II54.
3. Schuster MA, McGlynn EA, Brook RH. How good is the quality of health care in the United States? Milbank Q 1998; 76(4):517–563.
4. Cahill LS, Carey LM, Lannin NA, Turville M, Neilson CL, Lynch EA, et al. Implementation interventions to promote the uptake of evidence-based practices in stroke rehabilitation. Cochrane Database Syst Rev 2020; 10(10):CD012575.
5. Nilsen P. Making sense of implementation theories, models and frameworks. Implement Sci 2015; 10:53.
6. Medlinskiene K, Tomlinson J, Marques I, Richardson S, Stirling K, Petty D. Barriers and facilitators to the uptake of new medicines into clinical practice: a systematic review. BMC Health Serv Res 2021; 21(1):1198.
7. Girlanda F, Fiedler I, Becker T, Barbui C, Koesters M. The evidence-practice gap in specialist mental healthcare: systematic review and meta-analysis of guideline implementation studies. Br J Psychiatry 2017; 210(1):24–30.
8. Stevenson KA, Peyre SE, Noyes KI, Berk BS. Chapter 20: the changing delivery of patient care. In: Wartman SA, editor. The transformation of academic health centers. Academic Press; 2015. p. 203–211.
9. Implementation Science. About [internet]. Springer; 2023 [cited 2023 Oct 20]. Available from: https://implementationscience.biomedcentral.com/about.
10. Bauer MS, Damschroder L, Hagedorn H, Smith J, Kilbourne AM. An introduction to implementation science for the non-specialist. BMC Psychol 2015; 3(1):32.
11. Goodrich DE, Miake-Lye I, Braganza MZ, Wawrin N, Kilbourne AM. The QUERI roadmap for implementation and quality improvement. Department of Veterans Affairs (US); 2020.
12. Catalano RF, Kellogg E. Fostering healthy mental, emotional, and behavioral development in children and youth: a national agenda. J Adolesc Health 2020; 66(3):265–267.
13. McConnachie A, Ellis DA, Wilson P, McQueenie R, Williamson AE. Quantifying unmet need in General Practice: a retrospective cohort study of administrative data. BMJ Open 2023; 13(9):e068720.
14. Flynn R, Brooks SP, Thomson D, Zimmermann GL, Johnson D, Wasylak T. An implementation science laboratory as one approach to whole system improvement: a Canadian healthcare perspective. Int J Environ Res Public Health 2021; 18(23).
15. Jordan Z, Lockwood C, Munn Z, Aromataris E. The updated Joanna Briggs Institute Model of Evidence-Based Healthcare. Int J Evid Based Healthc 2019; 17(1):58–71.
16. Porritt K, McArthur A, Lockwood C, Munn Z, editors. JBI Manual for Evidence Implementation [internet]. JBI; 2020 [cited 2023 Oct 15]. Available from: https://implementationmanual.jbi.global.
17. Pinnock H, Barwick M, Carpenter CR, Eldridge S, Grandes G, Griffiths CJ, et al. Standards for Reporting Implementation Studies (StaRI) Statement. BMJ 2017; 356:i6795.
18. Woods-Hill CZ, Wolfe H, Malone S, Steffen KM, Agulnik A, Flaherty BF, et al. Implementation science research in pediatric critical care medicine. Pediatr Crit Care Med 2023; 24(11):943–951.
