Journal Article

Citation

Wang Z, Yan X, Jiang W, Sun M. Sensors (Basel) 2018; 18(12): 4241.

Affiliation

Division of Intelligence and Computing, Tianjin University, Tianjin 300072, China. sunmeijun@tju.edu.cn.

Copyright

(Copyright © 2018, MDPI: Multidisciplinary Digital Publishing Institute)

DOI

10.3390/s18124241

PMID

30513936

Abstract

Movie highlights are composed of video segments that induce a steady increase in the audience's excitement. Automatic movie highlight extraction plays an important role in content analysis, ranking, indexing, and trailer production. To address this challenging problem, previous work proposed a direct mapping from low-level features to high-level perceptual categories. However, that work treated only intense scenes, such as fighting, shooting, and explosions, as highlights. Many hidden highlights are ignored because their low-level feature values are too low. Driven by cognitive psychology analysis, combined top-down and bottom-up processing is used to derive the proposed two-way excitement model. Under the criteria of global sensitivity and local abnormality, middle-level features are extracted in excitement modeling to bridge the gap between the feature space and the high-level perceptual space. To validate the proposed approach, a group of well-known movies covering several typical genres is employed. Quantitative assessment using the determined excitement levels indicates that the proposed method produces promising results in movie highlight extraction, even when the response in the low-level audio-visual feature space is low.

Keywords: Violence, Violent media, Violent movies


Language: en

Keywords

affective computing; excitement modeling; movie exciting degree; movie highlights’ extraction
