
Innovation & Research Centre - Research, Knowledge Translation and Implementation Guide: Evaluation

This guide outlines the steps of evidence-based practice and looks at how to find, evaluate and implement research to ensure best-quality care is provided.

The Metro North Evaluation Framework

This framework provides a set of steps to guide the development and delivery of robust evaluations of initiatives within Metro North Health.

Step 1: Define the context and purpose of the evaluation

Describe Model of Care or Practice Change

Define the need, context, objectives, intended outcomes and resources utilised to develop and implement the model of care or practice change. This will help to focus the evaluation.

Determine primary users and purpose(s) for the evaluation – who and why?

  • What is the purpose of this evaluation?
  • Who needs the evaluation results?
  • Consider how the evaluation can provide relevant information in a timely manner, and what decisions the Accountable Officer will be able to make based on the evaluation results

 

Step 3: Conduct the evaluation

Collect and analyse data          

Baseline or comparative data are ideally collected before the model of care/practice change commences. Processes need to be in place for the recording and storage of data, in compliance with legislative and ethical requirements.

Data analysis is dependent on the type of data collected. Most evaluations are best suited to mixed methods (i.e. both quantitative and qualitative data).

Quantitative data analysis can range from simple descriptive statistics (frequency, mean, median, mode) and cost-effectiveness analysis through to complex regression models requiring statistical support.

Qualitative data analysis requires the evaluator to closely examine the data to identify common themes. Seeking expertise in the design and analysis of qualitative data is highly recommended. 
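As a minimal illustration of the descriptive statistics mentioned above, the sketch below uses Python's standard `statistics` module to summarise a set of patient wait times. The data values are invented for illustration only; real evaluations would draw on the data sources identified in the evaluation design.

```python
import statistics

# Hypothetical baseline data: patient wait times in minutes (illustrative only)
wait_times = [12, 15, 15, 18, 20, 22, 15, 30, 25, 18]

mean_wait = statistics.mean(wait_times)      # arithmetic average
median_wait = statistics.median(wait_times)  # middle value when sorted
mode_wait = statistics.mode(wait_times)      # most frequent value

print(f"n={len(wait_times)}, mean={mean_wait:.1f}, "
      f"median={median_wait}, mode={mode_wait}")
```

Summaries like these are often enough to compare post-implementation data against a baseline; anything beyond this (e.g. regression modelling) is where the statistical support mentioned above becomes important.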

Interpret findings (make judgements and recommendations as required)

Interpreting the findings answers the key evaluation questions and contributes to the purpose of the evaluation. It may also include recommendations.

For example: The model of care/practice change was found to be highly effective in achieving intended outcomes…, provided value for money when compared to alternatives, and resulted in a positive experience for staff and patients…. It is recommended that this model of care/practice change be continued with the following improvements….

Steps 4 & 5: Review and Report on the evaluation

Review and report on the evaluation findings

  • Determine if the evaluation met its purpose and the needs of the primary users
  • Examine the accuracy of findings and address any limitations
  • Report the findings. Suggested headings for the Evaluation report include:
    • Executive Summary
    • Background
    • Purpose
    • Evaluation Questions
    • Evaluation Design
    • Key Findings
    • Evaluation Quality Assessment

Step 2: Design the evaluation

The design of an evaluation is best planned at the same time as the model of care/practice change planning. This allows for the collection of baseline data.

Develop evaluation questions and identify evaluation domains:

Key evaluation questions should be developed by considering the purpose of the evaluation and which dimensions/aspects of the model are to be assessed. (Evaluation domains are indicated after the relevant questions below.)

Core Evaluation Questions (considered central to all evaluation in Metro North Health):

  • To what extent did the model of care/practice change reach its intended audience? (Reach)
  • How effective was the model of care/practice change in achieving its intended outcomes? (Effectiveness)
  • What was the impact on the patient experience? (Patient experience)
  • Was the model of care/practice change implemented as intended? If not, why not? (Implementation fidelity)
  • How sustainable is this model of care/practice change? (Sustainability)

Additional exemplar evaluation questions (evaluation domains in parentheses):

  • How did the model of care/practice change address issues of access and equity? (Access and equity)
  • How acceptable was the model of care/practice change to key stakeholders? (Acceptability)
  • Was the uptake or adoption of the model of care as estimated? (Uptake/Adoption)
  • To what extent does the programme address the identified problem? (Appropriateness/Relevance)
  • What are the costs of the service? Has the service been cost-effective? (Cost-effectiveness)
  • Has your intervention achieved its objectives? (Efficacy)
  • Is the intervention a practical solution and likely to be used in the setting? (Feasibility)
  • What overall impact did the intervention have on the service? (Impact)
  • Did the intervention raise safety concerns or increase risks for the consumer? (Safety)
  • Are the stakeholders and consumers satisfied with the service or intervention? (Quality)

Select appropriate measures/indicators: how will the domains identified above be measured?

  • Select specific and measurable indicator(s) for each domain
  • Determine the baseline/comparative data or agreed standards to compare against
  • Identify the data source and data collection method for each indicator/measure
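The three bullets above amount to a measurement plan mapping each domain to an indicator, a comparator, and a data source. The sketch below shows one way such a plan could be recorded; the domain, indicator and data-source values are hypothetical examples, not prescribed by the framework.

```python
from dataclasses import dataclass

@dataclass
class Indicator:
    """One row of a hypothetical evaluation measurement plan."""
    domain: str       # evaluation domain being measured
    measure: str      # specific, measurable indicator
    baseline: str     # baseline/comparative data or agreed standard
    data_source: str  # where the data comes from
    method: str       # how the data will be collected

# Illustrative entries only -- real indicators are agreed with stakeholders
plan = [
    Indicator("Effectiveness", "Median wait time (minutes)",
              "Pre-implementation median", "Patient administration system",
              "Routine data extract"),
    Indicator("Safety", "Adverse events per 1,000 episodes",
              "Agreed benchmark", "Incident reporting system",
              "Quarterly audit"),
]

for row in plan:
    print(f"{row.domain}: {row.measure} (vs {row.baseline}) "
          f"from {row.data_source} via {row.method}")
```

Laying the plan out this way makes it easy to check, before data collection starts, that every domain has at least one indicator and that each indicator has a comparator and a data source.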


© The State of Queensland (Queensland Health) 1996-2018