Doubled coupling for image emotion distribution learning.
- Author
- Wu, Huiyan; Huang, Yonggang; Nan, Guoshun
- Subjects
- *AFFECTIVE forecasting (Psychology), *EMOTIONS, *COUPLINGS (Gearing), *SOCIAL network analysis, *GRAPH connectivity, *IMAGE analysis
- Abstract
- Image emotion prediction is important to a wide range of applications, such as social network analysis, advertising, and human–computer interaction. Recently, image emotion distribution learning (IEDL) has attracted increasing attention as it holds the potential to tackle the challenging emotion ambiguity problem in image emotion prediction. Existing efforts focus mainly on emotion distribution learning under the assumption that samples are independent and identically distributed. However, we observe that connections between objects within an image (e.g., a butterfly and a flower) and connections between different images (e.g., images taken in the same place) commonly exist in real-world datasets. Such coupling information has proved highly helpful for many tasks and is also crucial for image emotion analysis. These observations motivate us to exploit the above two coupling relations for better IEDL. With this in mind, we propose DoubledIEDL, a novel IEDL approach that consists of two sub-modules for object and image coupling learning, respectively. Specifically, our approach relies on a unified framework equipped with densely connected graph convolutional networks (DCGCN) for both types of coupling learning. Learning in the proposed framework proceeds in two stages: a static stage and a dynamic stage. In the first stage, a static graph is constructed and shallow coupling information is extracted with DCGCN. In the second stage, deep coupling information is further mined by applying DCGCN to dynamically updated graphs in an iterative manner. The sub-modules for object and image coupling learning share this framework but differ in their static graph construction strategies. Extensive experiments on two public benchmarks, FlickrLDL and TwitterLDL, demonstrate the effectiveness of the proposed DoubledIEDL, which yields significant improvements over previous state-of-the-art models. On FlickrLDL, DoubledIEDL achieves 0.8596 in Cosine similarity and 0.4356 in Kullback–Leibler divergence (K–L). On TwitterLDL, it achieves 0.8717 in Cosine similarity and 0.4705 in K–L. A minimal illustrative sketch of the two-stage framework is given after the record below.
- Published
- 2023
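The abstract describes a two-stage coupling-learning framework: a static graph processed by a densely connected GCN, followed by dynamically rebuilt graphs processed iteratively. The sketch below is a minimal, hypothetical PyTorch rendering of that idea, not the authors' implementation; the names `build_graph`, `DCGCN`, and `CouplingModule`, the cosine-similarity threshold graph construction, and the layer counts are all illustrative assumptions.

```python
# Minimal sketch (assumed, not the authors' code) of two-stage coupling learning
# with a densely connected GCN. Node features may be object or image embeddings.
import torch
import torch.nn as nn
import torch.nn.functional as F

def build_graph(x, threshold=0.5):
    """Row-normalized adjacency from pairwise cosine similarity (assumed construction)."""
    sim = F.cosine_similarity(x.unsqueeze(1), x.unsqueeze(0), dim=-1)  # (N, N)
    adj = (sim > threshold).float()
    deg = adj.sum(-1, keepdim=True).clamp(min=1.0)
    return adj / deg

class DCGCN(nn.Module):
    """Densely connected GCN: each layer sees the input plus all previous layers' outputs."""
    def __init__(self, in_dim, hidden_dim, num_layers=3):
        super().__init__()
        self.layers = nn.ModuleList(
            [nn.Linear(in_dim + i * hidden_dim, hidden_dim) for i in range(num_layers)]
        )

    def forward(self, x, adj):
        feats = [x]
        for layer in self.layers:
            h = torch.cat(feats, dim=-1)          # dense connectivity across layers
            feats.append(F.relu(layer(adj @ h)))  # graph propagation + transform
        return torch.cat(feats[1:], dim=-1)       # concatenation of all layer outputs

class CouplingModule(nn.Module):
    """Shared framework: a static stage, then an iterative dynamic-graph stage."""
    def __init__(self, in_dim, hidden_dim, num_emotions=8, dynamic_steps=2):
        super().__init__()
        self.static_gcn = DCGCN(in_dim, hidden_dim)            # 3 layers by default
        self.dynamic_gcn = DCGCN(3 * hidden_dim, hidden_dim)   # input = static output size
        self.head = nn.Linear(3 * hidden_dim, num_emotions)
        self.dynamic_steps = dynamic_steps

    def forward(self, x, static_adj):
        # Stage 1: shallow coupling on the pre-built static graph.
        h = self.static_gcn(x, static_adj)
        # Stage 2: deep coupling on graphs rebuilt from the updated features.
        for _ in range(self.dynamic_steps):
            adj = build_graph(h)
            h = self.dynamic_gcn(h, adj)
        return F.softmax(self.head(h), dim=-1)  # per-node emotion distribution

if __name__ == "__main__":
    x = torch.randn(16, 512)           # e.g., 16 object or image embeddings
    adj0 = build_graph(x)              # static graph (co-occurrence or metadata also plausible)
    model = CouplingModule(in_dim=512, hidden_dim=128, num_emotions=8)
    dist = model(x, adj0)              # (16, 8) predicted emotion distributions
    print(dist.shape, dist.sum(-1))    # each row sums to 1
```

In this sketch, the object-coupling and image-coupling sub-modules would differ only in how the initial `static_adj` is built (e.g., object co-occurrence within an image versus metadata shared across images), mirroring the abstract's statement that the two sub-modules share the framework but use different static graph construction strategies.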