151. Learning Music Emotions via Quantum Convolutional Neural Network
- Author
- Peng Zhang, Yang Liu, Yuexian Hou, Sheng-hua Zhong, Yan Liu, Gong Chen, and Jiannong Cao
- Subjects
Computational model, Emotion classification, Deep learning, Speech recognition, Artificial intelligence, Psychology, Convolutional neural network, Quantum
- Abstract
Music can convey and evoke powerful emotions, but recognizing music emotions accurately with computational models is very challenging. The difficulty of the problem can increase sharply when the music segments deliver multiple, complex emotions. This paper proposes a novel quantum convolutional neural network (QCNN) to learn music emotions. Inheriting the strong abstraction ability of deep learning, QCNN automatically extracts the music features that benefit emotion classification. The main contribution of this paper is that we utilize the measurement postulate to simulate human emotion awareness in music appreciation. Statistical experiments on the standard dataset show that QCNN outperforms classical algorithms as well as the state of the art in the task of music emotion classification. Moreover, we provide a demonstration experiment to explain the good performance of the proposed technique from the perspectives of physics and psychology.
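The abstract does not give the architectural details, but the core idea it names, using the quantum measurement postulate to map learned features to emotion probabilities, can be illustrated with a minimal sketch. The sketch below is a hypothetical illustration, not the paper's QCNN: a feature vector is normalized into an amplitude state, each emotion class is assigned a projector, and the Born rule gives the class probabilities. All names (to_state, measurement_probs) and the random projector construction are assumptions for the example.

```python
# Hypothetical sketch (not the paper's implementation): applying the quantum
# measurement postulate to a learned feature vector. The feature vector is
# normalized into a unit-norm state |psi>, each emotion class k gets a
# projector P_k, and the Born rule p_k = <psi|P_k|psi> yields a probability
# distribution over emotions.
import numpy as np

rng = np.random.default_rng(0)

def to_state(features: np.ndarray) -> np.ndarray:
    """Normalize a real feature vector into a unit-norm 'quantum state'."""
    norm = np.linalg.norm(features)
    return features / norm if norm > 0 else features

def measurement_probs(state: np.ndarray, projectors: list) -> np.ndarray:
    """Born-rule probabilities p_k = <psi|P_k|psi> for each projector P_k."""
    probs = np.array([state @ P @ state for P in projectors])
    return probs / probs.sum()  # projectors below resolve the identity, so this is a safeguard

# Toy example: an 8-dim feature vector standing in for a convolutional
# front end's output, and 4 emotion classes, each "measured" by a rank-2
# orthogonal projector built from a random orthonormal basis.
features = rng.standard_normal(8)
state = to_state(features)

basis = np.linalg.qr(rng.standard_normal((8, 8)))[0]
projectors = [basis[:, 2 * k:2 * k + 2] @ basis[:, 2 * k:2 * k + 2].T for k in range(4)]

print("emotion probabilities:", measurement_probs(state, projectors))
```

In an actual QCNN the projectors (or the map from features to amplitudes) would be learned jointly with the convolutional layers; here they are fixed at random purely to show how the measurement step turns a feature vector into a normalized distribution over emotion labels.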
- Published
- 2017