SAFETYLIT WEEKLY UPDATE

We compile citations and summaries of about 400 new articles every week.

Journal Article

Citation

Ergai A, Cohen T, Sharp J, Wiegmann D, Gramopadhye A, Shappell S. Safety Sci. 2016; 82: 393-398.

Copyright

(Copyright © 2016, Elsevier Publishing)

DOI

10.1016/j.ssci.2015.09.028

PMID

unavailable

Abstract

The Human Factors Analysis and Classification System (HFACS) is a framework for classifying and analyzing human factors associated with accidents and incidents. The purpose of the present study was to examine the inter- and intra-rater reliability of the HFACS data classification process.

Methods
A total of 125 safety professionals from a variety of industries were recruited from a series of two-day HFACS training workshops. Participants classified 95 real-world causal factors (five causal factors for each of the 19 HFACS categories) extracted from a variety of industrial accidents. Inter-rater reliability of the HFACS coding process was evaluated by comparing performance across participants immediately following training, and intra-rater reliability was evaluated by having the same participants repeat the coding process following a two-week delay.

Results
Krippendorff's Alpha was used to evaluate the reliability of the coding process across the various HFACS levels and categories. Results revealed the HFACS taxonomy to be reliable in terms of inter- and intra-rater reliability, with the latter producing slightly higher Alpha values.
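For readers unfamiliar with the statistic, Krippendorff's Alpha measures agreement as 1 − D_o/D_e, the ratio of observed to expected disagreement computed from a coincidence matrix, and handles multiple raters and missing ratings. The following is a minimal sketch of the nominal-data form (the variant appropriate for categorical HFACS codes); it is illustrative only and not the authors' analysis code, and published analyses would normally use a vetted implementation.

```python
from collections import Counter
from itertools import permutations

def krippendorff_alpha_nominal(units):
    """Krippendorff's Alpha for nominal data.

    `units` is a list of lists: each inner list holds the codes that the
    raters assigned to one item (missing ratings are simply omitted).
    """
    # Build the coincidence counts: each ordered pair of codes from
    # different raters within a unit contributes 1/(m - 1).
    o = Counter()
    for values in units:
        m = len(values)
        if m < 2:
            continue  # units coded by fewer than two raters carry no pairs
        for c, k in permutations(values, 2):
            o[(c, k)] += 1.0 / (m - 1)

    # Marginal totals per category and the grand total.
    n_c = Counter()
    for (c, _k), w in o.items():
        n_c[c] += w
    n = sum(n_c.values())

    # Observed and expected disagreement (off-diagonal mass).
    d_o = sum(w for (c, k), w in o.items() if c != k)
    d_e = sum(n_c[c] * n_c[k] for c, k in permutations(n_c, 2)) / (n - 1)
    return 1.0 - d_o / d_e
```

With perfect agreement the function returns 1.0; systematic disagreement drives it toward negative values, and values near 0 indicate agreement no better than chance.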

Conclusion
Results support the inter- and intra-rater reliability of the HFACS framework but also reveal additional opportunities for improving HFACS training and implementation.
