SAFETYLIT WEEKLY UPDATE

Journal Article

Citation

Brug J, Tak NI, Te Velde SJ. Health Promot. Int. 2011; 26(2): 244-254.

Affiliation

Department of Epidemiology and Biostatistics and the EMGO Institute for Health and Care Research, VU University Medical Center, Van der Boechorststraat 7, 1081 BT Amsterdam, the Netherlands.

Copyright

(Copyright © 2011, Oxford University Press)

DOI

10.1093/heapro/daq058

PMID

20739324

Abstract

Nationwide health promotion campaigns are an important part of government-funded health promotion efforts. Valid evaluation is important, but difficult because gold standard research designs are not applicable and the allocation of budget and time for evaluation is often very tight. In the Netherlands, Health Promotion Institutes (HPIs) are responsible for these campaigns. We conducted an exploratory study among the HPIs to gain better insight into goals, practices, conditions and perceived barriers regarding evaluation of these campaigns. Data were obtained through personal interviews with representatives of HPIs who had direct management responsibility for the evaluation of their campaigns. The HPIs typically made use of a pre-test-post-test design with single measurements before and after the campaign, without a control group. In campaign preparations, HPIs used qualitative research to pre-test and pilot-test some campaign materials, but true formative evaluation was rare. Besides accountability to their sponsors, peers and the population at large, the most important reason to evaluate was to learn for future campaigns. In terms of the RE-AIM framework, evaluation was mostly restricted to Reach and Effects; hardly any evaluation of Adoption, Implementation or Maintenance was reported. Budget and time constraints were reported as the main barriers to more extensive formative and effect evaluation. Evaluation of nationwide campaigns is standard procedure, but the applied research designs are weak, owing to a lack of time, budget and research methodology expertise. In addition to extra budget and longer-term planning, input from external experts on evaluation research designs is needed to improve evaluation.


Language: en
