SAFETYLIT WEEKLY UPDATE


Journal Article

Citation

Kearney S, Leung L, Joyce A, Ollis D, Green C. Health Promot. J. Austr. 2016; 27(3): 230-235.

Copyright

(Copyright © 2016, Australian Health Promotion Association, Publisher CSIRO Publishing)

DOI

10.1071/HE16046

PMID

27719735

Abstract

ISSUE ADDRESSED: Our Watch led a complex 12-month evaluation of a whole-school approach to Respectful Relationships Education (RRE) implemented in 19 schools. RRE is an emerging field aimed at preventing gender-based violence. This paper illustrates how, from an implementation science perspective, the evaluation was a critical element in the change process at both a school and a policy level.

METHODS: Using several conceptual approaches from systems science, the evaluation sought to examine how the multiple systems layers - student, teacher, school, community and government - interacted and influenced each other. A distinguishing feature of the evaluation was the use of 'feedback loops'; that is, evaluation data was provided to participants as it became available. Evaluation tools included a combination of standardised surveys (with pre- and post-intervention data provided to schools via individualised reports), reflection tools, regular reflection interviews and summative focus groups.

RESULTS: Data was shared during implementation with project staff, department staff and schools to support continuous improvement at these multiple systems levels. In complex settings, implementation can vary according to context, and the impact of evaluation processes, tools and findings differed across the schools. Interviews and focus groups conducted at the end of the project illustrated which of these methods were instrumental in motivating change and engaging stakeholders at both a school and a departmental level, and why.

CONCLUSION: The evaluation methods were a critical component of the pilot's approach, helping to shape implementation through data feedback loops and reflective practice for ongoing, responsive and continuous improvement. Future health promotion research on complex interventions needs to examine how the evaluation itself influences implementation.

SO WHAT?: The pilot has demonstrated that the evaluation, including feedback loops to inform project activity, was an asset to implementation. This has implications for other health promotion activities, where evaluation tools could be utilised to enhance, rather than simply measure, an intervention. The findings are relevant to a range of health promotion research activities because they demonstrate the importance of meta-evaluation techniques that seek to understand how the evaluation itself was influencing implementation and outcomes.


Language: en
