
Automatic image annotation using tag-related random search over visual neighbors

Authors :
Mingqing Hu
Jianmin Wang
Jiaguang Sun
Guiguang Ding
Zijia Lin
Source :
CIKM
Publication Year :
2012
Publisher :
ACM, 2012.

Abstract

In this paper, we propose a novel image auto-annotation model that uses tag-related random search over range-constrained visual neighbors of the to-be-annotated image. The proposed model, termed TagSearcher, is motivated by two observations: the annotation performance of many previous visual-neighbor-based models is sensitive to the number of visual neighbors used, and the probability of selecting a visual neighbor should be tag-dependent, i.e., each candidate tag can have its own trustworthy subset of visual neighbors for score prediction. TagSearcher therefore uses a constrained range of visual neighbors rather than a single fixed number. By performing a novel tag-related random search over a graphical model built from the range-constrained visual neighbors, TagSearcher identifies the trustworthy subset for each candidate tag and exploits both visual similarities and tag correlations for score prediction. With the range constraint on visual neighbors and the tag-related random search process, TagSearcher not only achieves satisfactory annotation performance but also reduces sensitivity to the neighbor-quantity setting. Experiments on the benchmark Corel5k dataset demonstrate its rationality and effectiveness.
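To make the idea concrete, the following is a minimal, hypothetical sketch (not the authors' code) of per-tag score prediction via a random walk over a range-constrained set of visual neighbors: the walk restarts at neighbors in proportion to visual similarity and otherwise moves toward neighbors whose tags correlate with the current neighbor's tags, so each candidate tag accumulates score from its own trustworthy part of the neighborhood. All names and parameters (e.g. `visual_similarity`, `max_range`, `restart_prob`) are illustrative assumptions.

```python
# Hypothetical illustration of tag-dependent random search over visual neighbors.
# Not the TagSearcher algorithm itself; parameters and helpers are assumptions.
import random
from collections import defaultdict

def tag_scores(query_feat, neighbors, tag_cooccur, visual_similarity,
               max_range=50, restart_prob=0.3, n_steps=2000, seed=0):
    """neighbors: list of (feature, tag_set) pairs ranked by visual similarity.
    tag_cooccur: dict mapping (tag_a, tag_b) -> correlation weight in [0, 1]."""
    rng = random.Random(seed)
    pool = neighbors[:max_range]                      # range-constrained neighbor set
    sims = [visual_similarity(query_feat, f) for f, _ in pool]
    total = sum(sims) or 1.0
    restart = [s / total for s in sims]               # restart distribution over neighbors

    scores = defaultdict(float)
    current = rng.choices(range(len(pool)), weights=restart)[0]
    for _ in range(n_steps):
        _, tags = pool[current]
        for t in tags:                                # credit the visited neighbor's tags
            scores[t] += 1.0
        if rng.random() < restart_prob:
            current = rng.choices(range(len(pool)), weights=restart)[0]
        else:
            # move to a neighbor whose tags correlate with the current neighbor's tags
            weights = []
            for _, other_tags in pool:
                w = sum(tag_cooccur.get((a, b), 0.0)
                        for a in tags for b in other_tags)
                weights.append(w + 1e-9)              # smoothing keeps every move possible
            current = rng.choices(range(len(pool)), weights=weights)[0]

    norm = sum(scores.values()) or 1.0
    return {t: s / norm for t, s in scores.items()}   # normalized per-tag scores
```

Under these assumptions, tags supported by mutually correlated, visually similar neighbors receive higher scores, while neighbors irrelevant to a given tag contribute little, which is the intuition behind letting each tag rely on its own subset of neighbors.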

Details

Database :
OpenAIRE
Journal :
Proceedings of the 21st ACM international conference on Information and knowledge management
Accession number :
edsair.doi...........676659adeecdf8639fb0b9740d5a118b