1. An Approach Based on Multiple Representations and Multiple Queries for Invariant Image Retrieval.
- Author
- Abbadeni, Noureddine
- Abstract
In this paper, we present a multiple representations and multiple queries approach to tackle the problem of invariance in the framework of content-based image retrieval (CBIR), especially in the case of texture. Rather than considering invariance at the representation level, this approach considers it at the query level. We use two models to represent texture visual content: the autoregressive model and a perceptual model based on a set of perceptual features. The perceptual model is used with two viewpoints: the original-images viewpoint and the autocovariance-function viewpoint. After a brief presentation and discussion of these representation models/viewpoints, which are not invariant with respect to geometric and photometric transformations, we present the invariant texture retrieval algorithm, which is based on the multiple models/viewpoints and multiple queries approach and consists of two levels of results fusion (merging): 1. the first level merges the results returned by the different models/viewpoints (representations) for the same query into one results list using a linear results fusion model; 2. the second level merges the fused lists of the different queries into a single list using a round-robin fusion scheme. Experiments show promising results.
- Published
- 2007
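The two-level fusion scheme described in the abstract can be sketched roughly as follows. This is a minimal illustration only: the function names, the score convention (higher is better), and the weights are assumptions, not taken from the paper.

```python
# Hypothetical sketch of a two-level results fusion, assuming ranked lists of
# (image_id, score) pairs where a higher score means a better match.

def linear_fusion(result_lists, weights):
    """Level 1: merge the lists returned by different models/viewpoints for
    the SAME query into one list via a weighted (linear) combination of scores."""
    fused = {}
    for weight, results in zip(weights, result_lists):
        for image_id, score in results:
            fused[image_id] = fused.get(image_id, 0.0) + weight * score
    return sorted(fused.items(), key=lambda kv: kv[1], reverse=True)

def round_robin_fusion(fused_lists):
    """Level 2: merge the fused lists of DIFFERENT queries into a single list
    by taking items in turn from each list, skipping duplicates."""
    final, seen = [], set()
    for rank in range(max(len(lst) for lst in fused_lists)):
        for lst in fused_lists:
            if rank < len(lst):
                image_id, score = lst[rank]
                if image_id not in seen:
                    seen.add(image_id)
                    final.append((image_id, score))
    return final
```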