SAFETYLIT WEEKLY UPDATE

Journal Article

Citation

Gardner AJ, Levi CR, Iverson GL. Sports Med. Open 2017; 3(1): 26.

Affiliation

Home Base, A Red Sox Foundation and Massachusetts General Hospital Program, Boston, MA, USA.

Copyright

(Copyright © 2017, Holtzbrinck Springer Nature Publishing Group)

DOI

10.1186/s40798-017-0093-0

PMID

28710723

Abstract

BACKGROUND: Several professional contact and collision sports have recently introduced sideline video review to help club medical staff identify and manage concussions. Reviewing video footage on the sideline is therefore increasingly relied upon to improve the identification of possible injury. However, a standardized method for reviewing such video footage in rugby league has not yet been published. The aim of this study was to evaluate whether independent raters reliably agreed on injury characterization when using a standardized observational instrument to review video footage of National Rugby League (NRL) concussions.

METHODS: Video footage of 25 concussions was randomly selected from a pool of 80 medically diagnosed concussions from the 2013-2014 NRL seasons. Four raters (two naïve and two expert) independently viewed the footage of these 25 concussions and completed the Observational Review and Analysis of Concussion (ORAC) Form for the purpose of this inter-rater reliability study. Inter-rater reliability was calculated using Cohen's kappa (κ) and intra-class correlation (ICC) statistics. The two naïve raters and the two expert raters were compared with one another separately.
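
As a rough illustration of the reliability analysis described above (not the authors' code; the component and rating vectors below are invented for demonstration), the following Python sketch computes Cohen's κ for one categorical ORAC component scored by two raters across 25 cases:

# Illustrative sketch only: Cohen's kappa for a single categorical component
# (e.g., a concussion sign scored present/absent) rated independently by two
# raters across 25 cases. The ratings are fabricated, not study data.
from sklearn.metrics import cohen_kappa_score

rater_a = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 1, 0, 0, 1, 1, 0, 1, 0, 1, 1, 0, 0, 1]
rater_b = [1, 0, 1, 0, 0, 1, 0, 0, 1, 1, 0, 1, 1, 0, 1, 1, 1, 0, 1, 0, 1, 1, 0, 0, 1]

kappa = cohen_kappa_score(rater_a, rater_b)
print(f"Cohen's kappa = {kappa:.2f}")
# Continuous components would instead be compared with an intra-class
# correlation coefficient (ICC), per the abstract.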

RESULTS: A considerable number of components had almost perfect agreement (κ or ICC ≥ 0.90): 9 of 22 (41%) components for naïve raters and 21 of 22 (95%) for expert raters. For the concussion signs, however, most of the agreement was moderate (κ 0.60-0.79); both the naïve and the expert raters had moderate agreement on 4 of 6 (67%) concussion signs. The most difficult concussion sign on which to achieve agreement was blank or vacant stare, which had weak agreement (κ 0.40-0.59) for both naïve and expert raters.
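
For reference, the agreement bands quoted above can be expressed as a simple lookup. Only the ≥0.90, 0.60-0.79, and 0.40-0.59 bands come from the results; the remaining cut-offs are assumed from a common κ interpretation scale and are not stated in the abstract:

def agreement_band(kappa: float) -> str:
    # Bands quoted in the results: >=0.90 almost perfect, 0.60-0.79 moderate,
    # 0.40-0.59 weak. The other bands are assumed, not taken from the abstract.
    if kappa >= 0.90:
        return "almost perfect"
    if kappa >= 0.80:
        return "strong"        # assumed band
    if kappa >= 0.60:
        return "moderate"
    if kappa >= 0.40:
        return "weak"
    return "minimal or none"   # assumed band

print(agreement_band(0.92))  # -> almost perfect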

CONCLUSIONS: There appears to be value in expert raters, but less value for naïve raters, using the new ORAC Form. The form has high inter-rater agreement for most data elements and can be used by expert raters evaluating video footage of possible concussion in the NRL.


Language: en

Keywords

Concussion; Injury management; Video analysis
