Bit Reduction for Locality-Sensitive Hashing

Authors :
Liu, Huawen
Zhou, Wenhua
Zhang, Hong
Li, Gang
Zhang, Shichao
Li, Xuelong
Source :
IEEE Transactions on Neural Networks and Learning Systems; September 2024, Vol. 35 Issue: 9 p12470-12481, 12p
Publication Year :
2024

Abstract

Locality-sensitive hashing (LSH) has gained ever-increasing popularity in similarity search for large-scale data. It achieves competitive search performance when the number of generated hash bits is large, but long codes in turn bring storage and efficiency drawbacks that hinder its wide application. The first purpose of this work is to introduce a novel hash bit reduction schema for hashing techniques to derive shorter binary codes, a problem that has not yet received sufficient attention. To show how the reduction schema works, the second purpose is to present an effective bit reduction method for LSH under this schema. Specifically, after the hash bits are generated by LSH, they are placed into a bit pool as candidates. Mutual information and data labels are then exploited to measure the correlation and structural properties between the hash bits, respectively. Eventually, highly correlated and redundant hash bits can be identified and removed accordingly, without greatly deteriorating performance. The advantages of our reduction method are that it not only reduces the number of hash bits effectively but also boosts the retrieval performance of LSH, making it more appealing and practical in real-world applications. Comprehensive experiments were conducted on three public real-world datasets. The experimental results, compared with representative bit selection methods and state-of-the-art hashing algorithms, demonstrate that the proposed method has encouraging and competitive performance.
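The pipeline described in the abstract (generate LSH bits, pool them as candidates, then prune bits that are highly correlated under mutual information) can be sketched as follows. This is a minimal illustration, not the authors' exact algorithm: the sign-random-projection hash, the greedy keep/drop rule, and the `mi_threshold` value are all assumptions for demonstration, and the label-based structural criterion from the paper is omitted.

```python
# Illustrative sketch (assumptions, not the published method): build LSH bits
# via random hyperplanes, then greedily drop any bit whose mutual information
# with an already-kept bit exceeds a threshold.
import numpy as np

def lsh_bits(X, n_bits, seed=0):
    """Sign-random-projection LSH: one binary bit per random hyperplane."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_bits))
    return (X @ W > 0).astype(np.uint8)  # shape (n_samples, n_bits)

def mutual_info(a, b):
    """Mutual information (in nats) between two binary bit columns."""
    mi = 0.0
    for va in (0, 1):
        for vb in (0, 1):
            p_ab = np.mean((a == va) & (b == vb))
            p_a, p_b = np.mean(a == va), np.mean(b == vb)
            if p_ab > 0:
                mi += p_ab * np.log(p_ab / (p_a * p_b))
    return mi

def reduce_bits(B, mi_threshold=0.5):
    """Greedy pruning: keep a candidate bit from the pool only if its MI
    with every previously kept bit stays below the threshold."""
    kept = []
    for j in range(B.shape[1]):
        if all(mutual_info(B[:, k], B[:, j]) < mi_threshold for k in kept):
            kept.append(j)
    return kept

X = np.random.default_rng(1).standard_normal((500, 16))
B = lsh_bits(X, 32)           # candidate bit pool
kept = reduce_bits(B)         # indices of retained hash bits
print(len(kept), "of", B.shape[1], "bits kept")
```

Near-duplicate bits (MI close to ln 2 for binary variables) are pruned, while roughly independent projections survive; the threshold trades code length against preserved information.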

Details

Language :
English
ISSN :
2162-237X and 2162-2388
Volume :
35
Issue :
9
Database :
Supplemental Index
Journal :
IEEE Transactions on Neural Networks and Learning Systems
Publication Type :
Periodical
Accession number :
ejs67330638
Full Text :
https://doi.org/10.1109/TNNLS.2023.3263195