Evaluating patient education concepts

What are the impacts?

The evaluation challenge
National and international guidelines state that education and self-management support for people living with diabetes are necessary for high-quality care. However, studies show that most educational programs do not effectively target self-management behaviour (National Board of Health 2009).

Evaluating complex interventions such as patient education programs is not easy. Complex interventions typically contain a number of interacting components, so an evaluation must make it possible to assess whether potential effects are actually a result of the education program itself.

Likewise, evaluation studies should be able to assess whether a potential lack of effect is caused by implementation failure or by wrong assumptions about mechanisms of effect. Therefore, a good theoretical understanding of mechanisms of effect is necessary to identify and strengthen weak links in the causal chain.

Further, one must conduct a thorough process evaluation to identify implementation problems. Finally, a range of measures and methods should be utilised to identify intended and unintended effects and consequences.

Evaluating Steno’s own programs
Overall, we aim to evaluate patient education programs developed in Steno’s Patient Education Research Group. This includes evaluating the effects of the dialogue-based patient education tool NExt Education (NEED) and of the consultation program “Empowerment, Motivation and Medical Adherence” (EMMA). A feasibility study of NEED conducted in 2012 showed that the tools to a large extent create dialogue and participation.

Further, a realist evaluation was conducted at the Diabetes School at Odense University Hospital in Denmark in 2011-2012 (Schwennesen et al. 2013). Realist evaluation is a theory-driven method for evaluating complex programs. It acknowledges that intervention programs do not necessarily work for everyone, since people are different and are embedded in different contexts.

Design and method
Evaluating complex patient education programs implemented in the Danish health care system requires frameworks that can encompass variation in settings and in how the concepts are implemented and used.

In previous projects we have found realist evaluation to be a highly appropriate overall evaluation framework. We use both qualitative and quantitative research methods to collect and analyse data, including participant observations, qualitative interviews, surveys, clinical data and “beep measurements”, a method described by Skinner et al. (2008) for assessing how much the educator talks and how much the patients participate.
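As a rough illustration of how such observations might be summarised (the coding categories, the sampled moments and the Python approach below are invented for illustration and are not the protocol from Skinner et al. 2008), one can compute the share of sampled moments in which the educator, the patients, or nobody is speaking:

```python
# Minimal sketch, not the Skinner et al. protocol: at every "beep" an observer
# codes who is speaking; we then summarise educator vs. patient talk time.

from collections import Counter

# Hypothetical coded observations from one education session,
# one code per beep: "educator", "patient", or "silence".
beeps = ["educator", "educator", "patient", "educator", "silence",
         "patient", "patient", "educator", "educator", "patient"]

counts = Counter(beeps)
total = len(beeps)

for speaker in ("educator", "patient", "silence"):
    share = counts.get(speaker, 0) / total
    print(f"{speaker}: {share:.0%} of sampled moments")
```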
We will also conduct a health economic evaluation assessing the cost-effectiveness of the patient education programs.
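Such evaluations typically compare incremental costs to incremental effects. The sketch below shows a generic incremental cost-effectiveness ratio (ICER) calculation; all figures and names are invented for illustration and do not come from the project:

```python
# Sketch of an incremental cost-effectiveness ratio (ICER); the figures are
# invented for illustration only.

def icer(cost_intervention: float, cost_usual_care: float,
         effect_intervention: float, effect_usual_care: float) -> float:
    """Extra cost per extra unit of effect (e.g. per QALY gained)."""
    return (cost_intervention - cost_usual_care) / (effect_intervention - effect_usual_care)

# Example: the education program costs 2,400 DKK more per patient and yields
# 0.03 additional quality-adjusted life years (QALYs).
print(f"ICER: {icer(14400, 12000, 0.78, 0.75):,.0f} DKK per QALY gained")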

We will evaluate NEED using a non-randomised controlled trial with before- and after-intervention measurements and a control group. The primary outcome measures will be self-management behaviours, quality of life, patient activation and empowerment, and HbA1c.
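One common way to analyse a non-randomised design with before/after measurements and a comparison group is a difference-in-differences comparison. The sketch below uses invented HbA1c values and is only one possible analysis, not necessarily the project’s analysis plan:

```python
# Difference-in-differences sketch for a non-randomised before/after design
# with a control group; the HbA1c values are invented for illustration.

from statistics import mean

# Mean HbA1c (%) before and after, in the education group and a comparison group.
education_before = [8.4, 7.9, 8.8, 8.1]
education_after  = [7.8, 7.5, 8.2, 7.9]
control_before   = [8.3, 8.0, 8.6, 8.2]
control_after    = [8.2, 8.0, 8.5, 8.1]

change_education = mean(education_after) - mean(education_before)
change_control   = mean(control_after) - mean(control_before)

# The difference-in-differences estimate nets out background trends shared by
# both groups (it does not remove confounding from non-random allocation).
did = change_education - change_control
print(f"Change, education group: {change_education:+.2f} %-points")
print(f"Change, control group:   {change_control:+.2f} %-points")
print(f"Difference-in-differences: {did:+.2f} %-points")
```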

Target group
This research project targets people with diabetes, as well as health care professionals, educators and policy makers working in the area of patient education.

Expected outcome

The process evaluations will show to what extent the programs are implemented and how they work in practice and are used in different settings. They will also show to what degree the programs influence participation and dialogue in the contact between people with diabetes and educators.

Furthermore, we will report the effects on clinical outcome measures (e.g. HbA1c), self-management behaviours (e.g. medication adherence, physical activity, and dietary habits), and psychological outcomes such as quality of life and meaningfulness. Finally, the cost-effectiveness of the programs will be important to the planning of future patient education.

The results will provide new insights into a wide range of short- and long-term effects of participatory and dialogue-based approaches to patient education, and they will guide further development of patient education, including implementation and evaluation.

Collaborators include the following Danish municipalities:

Aalborg
Aarhus
Albertslund
Gladsaxe
København (Copenhagen)
Lemvig
Morsø
Roskilde
Skive
Sønderborg
Vejle
Viborg

Other collaborators in Denmark include:
The Hospital Unit in Horsens
Odense University Hospital
Diabetes specialist clinics (to be confirmed)

Inspirational articles

Campbell M, Fitzpatrick R, Haines A, Kinmonth AL, Sandercock P, Spiegelhalter D, Tyrer P. Framework for design and evaluation of complex interventions to improve health. BMJ 2000;321:694-696.

Craig P, Dieppe P, Macintyre S, Michie S, Nazareth I, Petticrew M. Developing and evaluating complex interventions: the new Medical Research Council guidance. BMJ 2008;337:979-983.

National Board of Health. Patient Education – A Health Technology Assessment. Copenhagen 2009.

Marchal B, van Belle S, van Olmen J, Hoerée T, Kegels G. Is realist evaluation keeping its promise? A review of published empirical studies in the field of health systems research. Evaluation 2012;18(2):192-212.
 
Skinner TC, Carey ME, Cradock S, Dallosso HM, Daly H, Davies MJ, Doherty Y, Heller S, Khunti K, Oliver L. “Educator talk” and patient change: some insights from the DESMOND (Diabetes Education and Self Management for Ongoing and Newly Diagnosed) randomized controlled trial. Diabetic Medicine 2008;25:1117-1120.

Schwennesen N, Folker AP, Stenov V, Duun R, Willaing I. Mellem idealer og realiteter – En virkningsevaluering af Diabetesskolen på Endokrinologisk afdeling M, Odense Universitetshospital [Between ideals and realities: a realist evaluation of the Diabetes School at the Department of Endocrinology M, Odense University Hospital]. Patient Education Research, Steno Health Promotion Center, Steno Diabetes Center, January 2013.

Last updated 14-02-2017

Responsible

Regitze Pals
Research assistant

Annemarie Varming
Researcher

Kasper Olesen
Researcher

Diabetes Management Research