SAFETYLIT WEEKLY UPDATE

Journal Article

Citation

Luk JW, Pruitt LD, Smolenski DJ, Tucker J, Workman DE, Belsher BE. J. Clin. Psychol. (Hoboken) 2021; ePub(ePub): ePub.

Copyright

(Copyright © 2021, John Wiley and Sons)

DOI

10.1002/jclp.23202

PMID

unavailable

Abstract

Advances in artificial intelligence and machine learning have fueled growing interest in applying predictive analytics to identify patients at high risk of suicide. Such applications require the aggregation of large-scale, sensitive patient data to inform complex and potentially stigmatizing health care decisions. This paper describes why suicide prediction is uniquely difficult by comparing it with nonmedical predictions (weather and traffic forecasting) and medical predictions (cancer and human immunodeficiency virus risk), and then presents the associated clinical and ethical challenges within a risk-benefit conceptual framework. Because the misidentification of suicide risk may carry unintended negative consequences, clinicians and policymakers need to carefully weigh the risks and benefits of using suicide predictive analytics across health care populations. Practical recommendations are provided to strengthen the protection of patient rights and enhance the clinical utility of suicide predictive analytics tools.

Language: en

Keywords

suicide; informed consent; machine learning; ethics; artificial intelligence; big data
