
TAMFN: Time-Aware Attention Multimodal Fusion Network for Depression Detection.

Authors :
Zhou, Li
Liu, Zhenyu
Shangguan, Zixuan
Yuan, Xiaoyan
Li, Yutong
Hu, Bin
Source :
IEEE Transactions on Neural Systems & Rehabilitation Engineering; 2023, Vol. 31, p669-679, 11p
Publication Year :
2023

Abstract

In recent years, with the widespread popularity of the Internet, social media has become an indispensable part of people's lives and an essential tool for interaction and communication. Because data are convenient to acquire from social media, mental health research based on it has received considerable attention. Early detection of psychological disorders from social media can help prevent further deterioration in at-risk people. In this paper, depression detection is performed based on non-verbal (acoustic and visual) behaviors in vlogs. We propose a time-aware attention-based multimodal fusion depression detection network (TAMFN) to fully mine and fuse the multimodal features. The TAMFN model is constructed from a temporal convolutional network with global information (GTCN), an intermodal feature extraction (IFE) module, and a time-aware attention multimodal fusion (TAMF) module. The GTCN captures more temporal behavior information by combining local and global temporal information. The IFE module extracts early interaction information between modalities to enrich the feature representation. The TAMF module guides multimodal feature fusion by mining the temporal importance of the different modalities. Our experiments are carried out on the D-Vlog dataset, and the comparative results show that the proposed TAMFN outperforms all benchmark models, indicating its effectiveness. [ABSTRACT FROM AUTHOR]
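As a rough illustration of the fusion idea described above (this is a toy sketch under simplified assumptions, not the authors' TAMFN implementation), time-aware attention fusion can be read as: score each modality's features at every timestep, turn the scores into per-timestep attention weights via a softmax, and take the weighted combination of the modality features. The scoring function, shapes, and two-modality setup here are illustrative choices; the GTCN feature extractor and IFE module are omitted entirely.

```python
# Toy sketch of time-aware attention fusion over two modality sequences.
# NOT the paper's TAMFN: scoring and shapes are simplified assumptions.
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def time_aware_fusion(acoustic, visual, score):
    """Fuse two aligned sequences (lists of feature vectors), weighting the
    modalities at each timestep by attention derived from `score`."""
    fused = []
    for a_t, v_t in zip(acoustic, visual):
        # One scalar importance score per modality at this timestep,
        # normalized across modalities.
        w_a, w_v = softmax([score(a_t), score(v_t)])
        # Convex combination of the two modality vectors.
        fused.append([w_a * a + w_v * v for a, v in zip(a_t, v_t)])
    return fused

# Usage: 2 timesteps, 3-dim features; score = mean activation (an assumption).
acoustic = [[1.0, 0.0, 1.0], [0.5, 0.5, 0.5]]
visual   = [[0.0, 1.0, 0.0], [1.0, 1.0, 1.0]]
mean = lambda v: sum(v) / len(v)
fused = time_aware_fusion(acoustic, visual, mean)
```

The point of the weighting is that a modality's contribution can vary over time: at timesteps where (by the chosen score) the acoustic features look more informative, they dominate the fused vector, and vice versa.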

Details

Language :
English
ISSN :
1534-4320
Volume :
31
Database :
Complementary Index
Journal :
IEEE Transactions on Neural Systems & Rehabilitation Engineering
Publication Type :
Academic Journal
Accession number :
161715037
Full Text :
https://doi.org/10.1109/TNSRE.2022.3224135