Cross-Modal Hashing Retrieval Based on Density Clustering


Bibliographic Details
Main Authors: Xiaojun Qi, Xianhua Zeng, Hongmei Tang
Format: Article
Language:English
Published: IEEE 2025-01-01
Series:IEEE Access
Online Access:https://ieeexplore.ieee.org/document/9026921/
Description
Summary:Cross-modal hashing retrieval methods have attracted much attention for their effectiveness and efficiency. However, most existing hashing methods struggle to precisely learn the latent correlations between different modalities from binary codes with minimal loss. In addition, solving for binary codes across modalities is an NP-hard problem. To overcome these challenges, we propose a novel adaptive fast cross-modal hashing retrieval method, inspired by the DBSCAN clustering algorithm, named Cross-modal Hashing Retrieval Based on Density Clustering (DCCH). DCCH exploits the global density correlation between different modalities to select representative instances that precisely stand in for the entire dataset. Furthermore, DCCH excludes the adverse effects of noise points and uses a discrete optimization process to obtain the hash functions. Extensive experiments show that DCCH outperforms other state-of-the-art cross-modal methods on three benchmark bimodal datasets, i.e., Wiki, MIRFlickr, and NUS-WIDE. The experimental results thus demonstrate that DCCH is both effective and efficient.
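The abstract's central idea, using density clustering to pick representative instances while discarding noise points, can be illustrated with a minimal sketch. This is not the authors' DCCH implementation (the paper's discrete hash-code optimization is omitted); it is a plain-Python toy DBSCAN followed by a hypothetical per-cluster representative selection (the point nearest each cluster mean), with noise points (label -1) excluded, as the abstract describes:

```python
import math

def dbscan(points, eps, min_pts):
    """Minimal DBSCAN: returns one label per point, -1 meaning noise."""
    n = len(points)
    labels = [None] * n

    def neighbors(i):
        # Brute-force eps-neighborhood query (includes the point itself).
        return [j for j in range(n) if math.dist(points[i], points[j]) <= eps]

    cluster = -1
    for i in range(n):
        if labels[i] is not None:
            continue
        nbrs = neighbors(i)
        if len(nbrs) < min_pts:
            labels[i] = -1  # provisionally noise; may become a border point
            continue
        cluster += 1
        labels[i] = cluster
        queue = list(nbrs)
        while queue:  # expand the cluster from core points
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster  # border point: absorbed, not expanded
            if labels[j] is not None:
                continue
            labels[j] = cluster
            jn = neighbors(j)
            if len(jn) >= min_pts:  # j is a core point: keep expanding
                queue.extend(jn)
    return labels

def representatives(points, labels):
    """One representative per cluster: the member nearest the cluster mean.
    Noise points (label -1) are excluded entirely."""
    clusters = {}
    for p, l in zip(points, labels):
        if l != -1:
            clusters.setdefault(l, []).append(p)
    reps = {}
    for l, pts in clusters.items():
        mean = tuple(sum(c) / len(pts) for c in zip(*pts))
        reps[l] = min(pts, key=lambda p: math.dist(p, mean))
    return reps
```

For example, with two tight 2-D clusters plus one outlier, `dbscan(pts, eps=1.5, min_pts=3)` labels the outlier -1, and `representatives` returns one stand-in point per cluster, so downstream hash learning would see a small, noise-free training set rather than all the data.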
ISSN:2169-3536