Deep Semantic-Aware Proxy Hashing for Multi-Label Cross-Modal Retrieval

Authors
Yadong Huo,Qibing Qin,Jiangyan Dai,Lei Wang,Wenfeng Zhang,Lei Huang,Chengduan Wang
Source
Journal: IEEE Transactions on Circuits and Systems for Video Technology [Institute of Electrical and Electronics Engineers]
Volume/Issue: 34 (1): 576-589 | Cited by: 66
Identifier
DOI:10.1109/tcsvt.2023.3285266
Abstract

Deep hashing has attracted broad interest in cross-modal retrieval because of its low storage cost and efficient retrieval. To capture the semantic information of raw samples and alleviate the semantic gap, supervised cross-modal hashing methods have been proposed that utilize label information to map raw samples from different modalities into a unified common space. Despite this progress, existing deep cross-modal hashing methods suffer from two problems: 1) in multi-label cross-modal retrieval, proxy-based methods ignore data-to-data relations and fail to deeply explore the combinations of different categories, which can cause samples without common categories to be embedded near one another; 2) for feature representation, image feature extractors built from stacked convolutional layers cannot fully capture global image information, which leads to sub-optimal binary hash codes. In this paper, by extending the proxy-based mechanism to multi-label cross-modal retrieval, we propose a novel Deep Semantic-aware Proxy Hashing (DSPH) framework, which embeds multi-modal multi-label data into a uniform discrete space and captures fine-grained semantic relations between raw samples. Specifically, by jointly learning multi-modal multi-label proxy terms and multi-modal irrelevant terms, the semantic-aware proxy loss is designed to capture multi-label correlations and preserve the correct fine-grained similarity ranking among samples, alleviating inter-modal semantic gaps. In addition, for feature representation, two transformer encoders are adopted as backbone networks for images and text, respectively, where the image transformer encoder captures global information of the input image by modeling long-range visual dependencies.
We have conducted extensive experiments on three benchmark multi-label datasets, and the results show that our DSPH framework achieves better performance than state-of-the-art cross-modal hashing methods. The implementation of our DSPH framework is available at https://github.com/QinLab-WFU/DSPH .
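To make the proxy-based idea concrete, the sketch below shows a generic multi-label proxy loss of the kind the abstract describes: each class owns a learnable proxy code, samples are pulled toward the proxies of their labels and pushed away (beyond a margin in cosine similarity) from proxies of absent labels. This is a simplified illustration under assumed conventions, not the authors' DSPH implementation; the function name, margin value, and use of cosine similarity are assumptions for the sketch. See the linked repository for the actual code.

```python
# Illustrative proxy-based multi-label hashing loss (NOT the DSPH code).
# Assumes cosine similarity between sample codes and per-class proxies.
import numpy as np

def proxy_hash_loss(codes, proxies, labels, margin=0.5):
    """codes:   (n, d) real-valued hash outputs before binarization
    proxies: (c, d) learnable per-class proxy codes
    labels:  (n, c) multi-hot label matrix
    Pulls each sample toward the proxies of its present labels and
    pushes its similarity to absent-label proxies below `margin`."""
    codes = codes / np.linalg.norm(codes, axis=1, keepdims=True)
    proxies = proxies / np.linalg.norm(proxies, axis=1, keepdims=True)
    sim = codes @ proxies.T                             # (n, c) cosine sims
    pos = (1.0 - sim) * labels                          # attract relevant proxies
    neg = np.maximum(sim - margin, 0.0) * (1 - labels)  # repel irrelevant ones
    return (pos.sum() + neg.sum()) / codes.shape[0]
```

A sample whose code coincides with its only label's proxy (and is orthogonal to the others) incurs zero loss, while a code aligned with an irrelevant proxy is penalized by both terms; in a full pipeline this loss would be minimized jointly over codes and proxies for both the image and text branches.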