SAFETYLIT WEEKLY UPDATE

We compile citations and summaries of about 400 new articles every week.

Journal Article

Citation

Kim J, Nirjhar EH, Lee H, Chaspari T, Lee C, Ham Y, Winslow JF, Ahn CR. Sci. Rep. 2023; 13(1): e5940.

Copyright

(Copyright © 2023, Nature Publishing Group)

DOI

10.1038/s41598-023-33132-z

PMID

37046023

Abstract

Biosignals from wearable sensors have shown great potential for capturing the environmental distress that pedestrians experience from negative stimuli (e.g., abandoned houses, poorly maintained sidewalks, graffiti, and so forth). This physiological monitoring approach in an ambulatory setting can mitigate the subjectivity and reliability concerns of traditional self-reported surveys and field audits. However, to date, most prior work has been conducted in controlled settings, and there has been little investigation into utilizing biosignals captured in real-life settings. This research examines the usability of biosignals (electrodermal activity, gait patterns, and heart rate) acquired in real-life settings to capture the environmental distress experienced by pedestrians. We collected and analyzed geocoded biosignals and self-reported stimuli information in real-life settings. Data were analyzed using spatial methods with statistical and machine learning models.

Results show that the machine learning algorithm predicted location-based collective distress of pedestrians with 80% accuracy, showing statistical associations between biosignals and the self-reported stimuli. This method is expected to advance our ability to sense and react to not only built environmental issues but also urban dynamics and emergent events, which together will open valuable new opportunities to integrate human biological and physiological data streams into future built environments and/or walkability assessment applications.
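The abstract describes geocoding biosignal samples, aggregating them spatially, and classifying location-based collective distress. The sketch below is a minimal, hypothetical illustration of that pipeline on synthetic data — the grid size, feature names, thresholds, and data values are all assumptions for illustration, not the authors' actual method or results (the paper reports a machine learning model, which a simple threshold rule stands in for here).

```python
import random
from collections import defaultdict

random.seed(0)

def grid_cell(lat, lon, size=0.001):
    # Snap a geocoded sample to a coarse spatial grid cell (~100 m; assumed size).
    return (round(lat / size), round(lon / size))

# Hypothetical synthetic records: (lat, lon, EDA peak count, heart rate, self-reported distress).
records = []
for i in range(200):
    stressed = i % 2 == 0
    lat = 30.62 + (0.002 if stressed else 0.0) + random.uniform(0, 0.0005)
    lon = -96.33 + random.uniform(0, 0.0005)
    eda = random.gauss(8 if stressed else 3, 1)   # assume more EDA peaks under distress
    hr = random.gauss(95 if stressed else 75, 5)  # assume elevated heart rate under distress
    records.append((lat, lon, eda, hr, stressed))

# Aggregate biosignals per grid cell ("collective" distress of pedestrians at a location).
cells = defaultdict(list)
for lat, lon, eda, hr, label in records:
    cells[grid_cell(lat, lon)].append((eda, hr, label))

# Threshold rule standing in for the paper's ML classifier (thresholds are invented).
correct = total = 0
for samples in cells.values():
    mean_eda = sum(s[0] for s in samples) / len(samples)
    mean_hr = sum(s[1] for s in samples) / len(samples)
    majority_label = sum(s[2] for s in samples) > len(samples) / 2
    predicted = mean_eda > 5.5 and mean_hr > 85
    correct += predicted == majority_label
    total += 1

accuracy = correct / total
print(f"location-based accuracy on synthetic data: {accuracy:.2f}")
```

The key design point mirrored here is that prediction targets are locations (grid cells), not individual samples: per-person biosignals are noisy, so aggregating many pedestrians' signals at the same place is what makes "collective distress" estimable.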


Language: en
