The workshop was conceived by a number of groups undertaking complex intervention research that had identified the need to draw on collective expertise in developing process evaluation. Workshop participants, who were predominantly public health researchers and policy makers, were strongly in favour of the development of guidance to assist them in their research. There was also consensus that funders and reviewers of grant applications would benefit from guidance to assist peer review.

An integrated, multiple-component evaluation was conducted in order to describe implementation and to help explain why the intervention and its components were, or were not, successful. Process evaluations examine what the intervention comprises and how it is delivered to target participants;154 they are designed to evaluate fidelity and to provide explanatory evidence around trial outcomes. Fidelity refers to the extent to which interventions 'are implemented as intended by the program developers'.171 At its simplest level, fidelity is measured in terms of adherence to the intervention (i.e. content, coverage, frequency and duration).154
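As a purely illustrative sketch (the session-record fields, their names, and the delivered-versus-planned scoring are assumptions made for illustration, not drawn from the cited guidance), adherence along these four dimensions might be summarised as simple ratios:

```python
from dataclasses import dataclass

# Hypothetical delivery record; field names are illustrative, not from the guidance.
@dataclass
class SessionRecord:
    components_delivered: int   # content: intervention components actually delivered
    components_planned: int
    participants_reached: int   # coverage: participants who received the session
    participants_targeted: int
    sessions_delivered: int     # frequency: sessions run over the period
    sessions_planned: int
    minutes_delivered: float    # duration: contact time provided
    minutes_planned: float

def adherence_scores(r: SessionRecord) -> dict:
    """Return one 0-1 adherence ratio per fidelity dimension (capped at 1)."""
    ratio = lambda delivered, planned: min(delivered / planned, 1.0) if planned else 0.0
    return {
        "content":   ratio(r.components_delivered, r.components_planned),
        "coverage":  ratio(r.participants_reached, r.participants_targeted),
        "frequency": ratio(r.sessions_delivered, r.sessions_planned),
        "duration":  ratio(r.minutes_delivered, r.minutes_planned),
    }

# Example: 4 of 5 planned components delivered to 18 of 25 targeted participants.
print(adherence_scores(SessionRecord(4, 5, 18, 25, 10, 12, 45.0, 60.0)))
```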
Studies were eligible for inclusion if they self-identified as using a systems and/or complexity perspective at any stage of the evaluative process, including during the design, data collection, analysis, or interpretation phases. Studies covering topics not included in the aforementioned list were considered if they concerned population health; decisions in these instances were made through discussion between three reviewers. Studies were only included if they reported empirical findings of a process evaluation; protocols and discussion pieces describing evaluations without presenting results were excluded. Process evaluations conducted alongside outcome evaluations were eligible for inclusion, although our analysis focused solely on the process evaluation component. Evaluations employing mixed methods (wherein qualitative data were integrated into the assessment of the intervention alongside other methods) were included, as long as there was a substantive component that generated and analyzed qualitative data.
Methods
The choice of framework and methods should be guided by the key questions that need to be answered to understand the implementation process, as well as by the skills and preferred methodological approaches of the researchers. You may identify multiple research questions arising from the logic model and have only limited resources to answer them. Note also that inexperienced researchers sometimes design an evaluation around methods, or around data that will be easy to collect, rather than around the main research questions for the study. If you do this, you may collect a lot of data but fail to answer the questions you need answered. It is important to prioritise the main research questions for a process evaluation early on.
- Proponents of explicitly using complexity theory within qualitative designs argue doing so “has potential to capture and understand complex dynamics that might otherwise be unexplored” [27 p. 3].
- Similarly, findings from the ASHA program in India emphasized the need for culturally appropriate training materials, delivered using interactive and innovative methods.
- For example, the ways in which an intervention is delivered may have an effect on participant or community response, and the acceptability of the intervention.
- Process evaluations of public health interventions may benefit from an explicit adoption of a complex systems perspective.
- By bringing an explicitly relational focus to the evaluation design and placing the wider context in the foreground of the analysis [24], a complex systems approach to a process evaluation may help to make sense of intervention mechanisms within a real-world context.
Theorise what your intervention is, how you think it will operate and what impact it should have on the problem it is trying to solve. In process evaluation, the underlying theory of how the intervention works should form the basis for the evaluation. The MRC process evaluation guidance [footnote 2] recommends using a theory-based approach, in which the underpinning theory of an intervention provides a structure for the process evaluation design, data collection and analysis. Process evaluations typically examine aspects related to delivery and implementation processes such as fidelity (that is, was it delivered as planned?), dose and reach. Process evaluations can be independent studies or conducted simultaneously with outcome evaluations such as randomised controlled trials.
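As a purely illustrative sketch (the delivery-log structure and the minutes-based measure of dose are assumptions, not part of the MRC guidance), dose and reach might be summarised from routine delivery records along these lines:

```python
from collections import defaultdict

# Hypothetical delivery log: (participant_id, minutes_of_contact) per contact.
# The size of the eligible population is assumed known from the sampling frame.
delivery_log = [("p01", 30), ("p01", 45), ("p02", 30), ("p04", 60)]
eligible_population = 10

minutes_per_participant = defaultdict(int)
for participant, minutes in delivery_log:
    minutes_per_participant[participant] += minutes

dose = sum(minutes_per_participant.values())                 # total contact time delivered
reach = len(minutes_per_participant) / eligible_population   # share of eligible people reached

print(f"dose = {dose} minutes, reach = {reach:.0%}")
```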
Therefore, evaluations are based on these Context-Mechanism-Outcome configurations to explain 'what works for whom, in what circumstances, in what respects, and how?' In a review by Rogers et al., the authors highlight that this approach of basing an evaluation on a logic, or causal, model is not new and has been recommended by evaluators since the 1960s. They refer to it as 'program theory evaluation', describe its historical development and the current variations in theory and practice, and discuss the problems it poses. Mixed-methods designs have been used increasingly in implementation research to develop an understanding of, and to overcome, barriers to implementation. One literature review examined 22 studies of mental health services research published between 2005 and 2009 that applied mixed methods in their implementation research, reporting how these methods were used and the reasons for their use. These documents provide a practical guide to carrying out a process evaluation on complex interventions.
This necessitates formative research and the establishment of trusting relationships to shape mutual commitment to action between researchers and local communities. Many research projects experience delays in the formative and implementation phases. In Phase 1, evaluators would begin to make sense of, and document, the "local rules" that govern both the intervention and the system, including the rules that govern how different system elements interact and relate to each other and how the intervention operates and relates to different parts of the system. In undertaking Phase 1, evaluators would draw on the concepts most closely aligned with systems thinking (the left-hand side of Fig 2 and the first half of Table 2) and use these to structure the initial data collection and analysis.
Interview summaries were constructed first (interviews could include up to three respondents). Research nurses were asked to submit a copy of the 3-day diary for all participants for whom one was recorded, i.e. participants who were incontinent at baseline or whose catheter was removed before discharge; participants catheterised throughout their stay were not eligible to complete the diary. Each diary was assessed using a filtering system, with data input for an individual diary terminated at any stage if it failed to achieve that stage's key quality indicator.

If evaluation is dismissed by the executive director as an annoyance forced upon the organization from the outside, staff will feel the same way. The executive director must also understand the level of evaluation that is appropriate.
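A minimal sketch of the stage-gated diary filtering described above, assuming hypothetical stage names, quality indicators, and diary structure (none of these are taken from the study itself):

```python
# Hypothetical quality gates; processing of a diary stops at the first failed stage.
def has_all_days(diary):            # stage 1: all three diary days returned
    return len(diary.get("days", [])) == 3

def entries_legible(diary):         # stage 2: every day has at least one usable entry
    return all(day.get("entries") for day in diary["days"])

def times_recorded(diary):          # stage 3: entries carry the timing data needed for analysis
    return all("time" in entry for day in diary["days"] for entry in day["entries"])

QUALITY_STAGES = [("completeness", has_all_days),
                  ("legibility", entries_legible),
                  ("timing", times_recorded)]

def filter_diary(diary):
    """Return (passed, stage_reached); data input stops at the first failing stage."""
    for stage_name, indicator in QUALITY_STAGES:
        if not indicator(diary):
            return False, stage_name
    return True, "all stages passed"

example_diary = {"days": [{"entries": [{"time": "08:00"}]},
                          {"entries": [{"time": "13:30"}]},
                          {"entries": []}]}
print(filter_diary(example_diary))   # fails at the legibility stage
```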
The concepts are presented as lying along a continuum, with systems thinking at the far left-hand side and complexity science at the far right-hand side. Moving along the spectrum, from systems thinking to complexity science, represents a movement from the static to the dynamic. The key systems thinking concepts, on the left-hand side of the figure, are the structure of a system, its elements, and the relationships between them. Moving toward the middle of the figure, concepts from complexity science are introduced, including the attributes and dimensions of an intervention and then a system undergoing change. The far right-hand side of the figure includes concepts used within the complexity sciences to computationally model complex systems, in order to simulate and predict behavior and outcomes and to understand an evolving system.
Evaluation can let you know the areas in which your program needs to improve to make a greater impact on your participants. Data collection should be planned and co-ordinated so that the research process is efficient. For example, it may be possible to collect both qualitative and quantitative data from participants during the same research visit or encounter, or to make use of routinely collected data, reducing data collection costs. Upon completion of your program, or of the intermediate steps along the way, your evaluation efforts will be designed to examine long-term outcomes and impacts and to summarize the overall performance of your program.
However, the process evaluation also highlighted that the projects had resulted in increased research capacity among the government and research institutions that participated in the study. In addition, new government policies for salt were being integrated into the new Food and Nutrition Security Strategy in Fiji, while in Samoa government proposals for the taxation of packaged foods high in sugar and salt were being considered. The seven process evaluations, all either completed or nearly completed, were from low- and middle-income countries (LMICs): Fiji and Samoa, South Africa, Kenya, Peru, India, Sri Lanka, Tanzania, and Indigenous communities in Canada (Table 1).