Multi-scale object retrieval via learning on graph from multimodal data
- Source :
- Neurocomputing. Sep 2016, Vol. 207, p684-692. 9p.
- Publication Year :
- 2016
Abstract
- Object retrieval has attracted much research attention in recent years. A central challenge in object retrieval is how to estimate the relevance among objects. In this paper, we focus on view-based object retrieval and propose a multi-scale object retrieval algorithm via learning on graph from multimodal data. In our work, shape features are extracted from each view of the objects. The relevance among objects is formulated in a hypergraph structure, where the distance between different views in the feature space is employed to generate the connections in the hypergraph. To achieve better representation performance, we propose a multi-scale hypergraph structure to model object correlations. Learning on the graph is conducted to estimate the optimal relevance among the objects, which is then used for object retrieval. To evaluate the performance of the proposed method, we conduct experiments on the National Taiwan University dataset and the ETH dataset. Experimental results and comparisons with state-of-the-art methods demonstrate the effectiveness of the proposed method.
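The record contains no code, but the abstract's multi-scale hypergraph construction can be sketched in a few lines. The sketch below is a hedged illustration, not the authors' implementation: each object spawns one hyperedge per scale `k`, connecting the object to its `k` nearest neighbours in feature space, so larger scales capture coarser correlations. The function name, the Euclidean distance choice, and the toy 2-D "shape" features are all assumptions for illustration.

```python
import math

def euclidean(a, b):
    # Plain Euclidean distance between two feature vectors.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def build_multiscale_hypergraph(features, scales=(1, 2)):
    """Hedged sketch: multi-scale hyperedges from pairwise view distances.

    For every object i and every scale k, emit one hyperedge containing
    i and its k nearest neighbours. Returns a list of frozensets of
    object indices (the hypergraph's edge set).
    """
    n = len(features)
    dist = [[euclidean(features[i], features[j]) for j in range(n)]
            for i in range(n)]
    edges = []
    for i in range(n):
        # Sort all objects by distance to i; i itself comes first (distance 0).
        order = sorted(range(n), key=lambda j: dist[i][j])
        for k in scales:
            edges.append(frozenset(order[:k + 1]))  # centroid + k neighbours
    return edges

# Toy example: four objects described by 2-D feature vectors,
# forming two well-separated pairs.
feats = [(0.0, 0.0), (0.1, 0.0), (5.0, 5.0), (5.1, 5.0)]
H = build_multiscale_hypergraph(feats)
```

In a full pipeline, the resulting incidence structure would feed a hypergraph learning step (e.g. a regularized label-propagation objective) to estimate object relevance; that step is omitted here.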
Details
- Language :
- English
- ISSN :
- 0925-2312
- Volume :
- 207
- Database :
- Academic Search Index
- Journal :
- Neurocomputing
- Publication Type :
- Academic Journal
- Accession number :
- 117373578
- Full Text :
- https://doi.org/10.1016/j.neucom.2016.05.053