
Multi-Hot Compact Network Embedding

Authors :
Li, Chaozhuo
Wang, Senzhang
Yu, Philip S.
Li, Zhoujun
Source :
Published in CIKM 2019
Publication Year :
2019

Abstract

Network embedding, a promising approach to network representation learning, is capable of supporting various downstream network mining and analysis tasks, and has attracted growing research interest recently. Traditional approaches assign each node an independent continuous vector, which causes substantial memory overhead for large networks. In this paper we propose a novel multi-hot compact embedding strategy to effectively reduce memory cost by learning partially shared embeddings. The insight is that a node embedding vector is composed of several basis vectors, which significantly reduces the number of continuous vectors while maintaining similar representation ability. Specifically, we propose an MCNE model to learn compact embeddings from pre-learned node features. A novel component named the compressor is integrated into MCNE to tackle the challenge that standard back-propagation cannot propagate through discrete samples. We further propose an end-to-end model, MCNE$_t$, to learn compact embeddings directly from the input network. Empirically, we evaluate the proposed models on three real-world network datasets, and the results demonstrate that our proposals save about 90% of the memory cost of network embeddings without significant performance decline.
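For intuition, the sketch below illustrates the general idea the abstract describes: instead of storing an independent d-dimensional vector per node, each node embedding is composed as a sum of basis vectors selected from a few shared codebooks, so only the small discrete selection codes need to be stored per node. The class and parameter names are hypothetical, and the use of a Gumbel-softmax straight-through estimator to let gradients flow through the discrete selection is an assumption standing in for the paper's compressor component, not the authors' actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiHotCompactEmbedding(nn.Module):
    """Compose each node embedding from M basis vectors, one picked from
    each of M shared codebooks of size K, rather than storing an
    independent d-dimensional vector per node (illustrative sketch)."""

    def __init__(self, num_nodes, dim, num_codebooks=8, codebook_size=64, tau=1.0):
        super().__init__()
        # Shared basis vectors: M codebooks, each holding K vectors of size d.
        self.basis = nn.Parameter(torch.randn(num_codebooks, codebook_size, dim) * 0.01)
        # Per-node selection logits over each codebook. After training, only
        # the argmax indices need to be kept: roughly M * log2(K) bits per node.
        self.logits = nn.Parameter(torch.zeros(num_nodes, num_codebooks, codebook_size))
        self.tau = tau

    def forward(self, node_ids, hard=True):
        logits = self.logits[node_ids]  # (batch, M, K)
        # Assumed stand-in for the paper's compressor: a Gumbel-softmax
        # straight-through relaxation that allows back-propagation through
        # the discrete basis-vector selection.
        one_hot = F.gumbel_softmax(logits, tau=self.tau, hard=hard, dim=-1)
        # Sum the selected basis vector from every codebook.
        return torch.einsum("bmk,mkd->bd", one_hot, self.basis)


# Usage: embeddings for a mini-batch of nodes, trainable end to end.
emb = MultiHotCompactEmbedding(num_nodes=10_000, dim=128)
vectors = emb(torch.tensor([0, 17, 42]))  # shape (3, 128)
```

Under this scheme the continuous parameters are the M * K shared basis vectors rather than one vector per node, which is where the memory savings reported in the abstract would come from; the exact codebook sizes and training objective are the paper's own design choices.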

Details

Database :
arXiv
Journal :
Published in CIKM 2019
Publication Type :
Report
Accession number :
edsarx.1903.03213
Document Type :
Working Paper