SAFETYLIT WEEKLY UPDATE

We compile citations and summaries of about 400 new articles every week.

Journal Article

Citation

Volpe RJ, Matta M, Briesch AM, Owens JS. J. Sch. Psychol. 2023; 101: e101251.

Copyright

(Copyright © 2023, Society for the Study of School Psychology; published by Elsevier)

DOI

10.1016/j.jsp.2023.101251

PMID

37951664

Abstract

Owing to their promise as a feasible tool for evaluating the effects of school-based interventions, Direct Behavior Ratings (DBR) have received considerable research attention over the past two decades. Although DBR methodology has shown much promise, favorable psychometric characteristics have been demonstrated only for tools measuring a small number of constructs. Likewise, although a variety of DBR methods have been proposed, most extant studies have focused on single-item methods. The present study examined the dependability of four methods of formative behavioral assessment (i.e., single-item and multi-item ratings administered either daily [DBR] or weekly [formative behavior rating measures; FBRM]) across eight psychological constructs (i.e., interpersonal skills, academic engagement, organizational skills, disruptive behavior, oppositional behavior, interpersonal conflict, anxious/depressed, and social withdrawal). School-based professionals (N = 91; i.e., teachers, paraprofessionals, and intervention specialists) were each assigned to one of the four assessment conditions and rated one student across all eight constructs. Dependability estimates varied substantially across methods and constructs (range = 0.75-0.96); nevertheless, the findings support the use of the broad set of formative assessment tools evaluated.


Language: en

Keywords

Behavioral assessment; Direct behavior rating; Formative assessment; Generalizability theory; Teacher ratings
