A Web Knowledge-Driven Multimodal Retrieval Method in Computational Social Systems: Unsupervised and Robust Graph Convolutional Hashing

Keywords: Computer Science; Convolutional Neural Network; Hash Function; Artificial Intelligence; WordNet; Graph; Information Retrieval; Cluster Analysis; Machine Learning; Theoretical Computer Science; Programming Language
Authors
Youxiang Duan, Ning Chen, Ali Kashif Bashir, Mohammad Dahman Alshehri, Lei Liu, Peiying Zhang, Keping Yu
Source
Journal: IEEE Transactions on Computational Social Systems [Institute of Electrical and Electronics Engineers]
Volume/Issue: 11 (3): 3146-3156; Cited by: 13
Identifier
DOI: 10.1109/tcss.2022.3216621
Abstract

Multimodal retrieval has received widespread attention because it can provide massive related-data support for the development of computational social systems (CSSs). However, existing works still face the following challenges: 1) they rely on a tedious manual annotation process when extended to CSSs, which not only introduces subjective errors but also consumes abundant time and labor; 2) they train only on strongly aligned data and ignore adjacency information, which yields poor robustness and leaves the semantic heterogeneity gap difficult to bridge effectively; and 3) they map features into real-valued forms, which leads to high storage costs and low retrieval efficiency. To address these issues in turn, we design a web-knowledge-driven multimodal retrieval framework called unsupervised and robust graph convolutional hashing (URGCH). The specific implementations are as follows: first, a "secondary semantic self-fusion" approach is proposed, which extracts semantic-rich features through pretrained neural networks and constructs a joint semantic matrix through semantic fusion, eliminating the manual annotation process; second, an "adaptive computing" approach is designed that constructs enhanced semantic graph features through knowledge infusion from neighborhoods and uses graph convolutional networks (GCNs) for knowledge-fusion encoding, which enables URGCH to fit the semantic modality gap sufficiently while obtaining robust features; third, combined with hash learning, the multimodal data are mapped into binary codes, which reduces storage requirements and improves retrieval efficiency. Finally, we conduct extensive experiments on web datasets.
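The three components above can be sketched roughly as follows. This is a minimal illustration under assumed details, not the paper's actual implementation: the equal-weight cosine-similarity fusion rule, the single tanh-activated GCN layer, and all function names are our own assumptions.

```python
import numpy as np

def joint_semantic_matrix(img_feats, txt_feats, alpha=0.5):
    """Step 1 (sketch): fuse per-modality cosine-similarity matrices
    into one joint semantic matrix, replacing manual labels."""
    def cos_sim(X):
        Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
        return Xn @ Xn.T
    return alpha * cos_sim(img_feats) + (1 - alpha) * cos_sim(txt_feats)

def gcn_encode(A, H, W):
    """Step 2 (sketch): one graph-convolution pass; the symmetrically
    normalized adjacency injects neighborhood knowledge into features."""
    A_hat = A + np.eye(A.shape[0])                          # self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return np.tanh(d_inv_sqrt @ A_hat @ d_inv_sqrt @ H @ W)

def binarize(real_codes):
    """Step 3 (sketch): quantize hash-learning output to {-1, +1}."""
    return np.where(real_codes >= 0, 1, -1)

def hamming_rank(query_code, db_codes):
    """Retrieval: rank database items by Hamming distance to the query."""
    dists = (db_codes != query_code).sum(axis=1)
    return np.argsort(dists, kind="stable")
```

Binary codes make retrieval a cheap XOR-and-popcount operation in practice; the NumPy comparison above stands in for that bitwise form.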
The results show that URGCH outperforms the baselines by about 1%-3.7% in mean average precision (MAP), displays superior performance in all respects, and can meaningfully provide multimodal data retrieval services to CSSs.
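Since results are reported in mean average precision, a minimal reference implementation of the metric may be useful; the function names and input format (a relevant-ID set plus a ranked list per query) are our own choice, not the paper's evaluation code.

```python
def average_precision(relevant, ranked_ids):
    """AP for one query: mean of precision@k taken at each rank k
    where a relevant item is retrieved."""
    hits, precisions = 0, []
    for k, item in enumerate(ranked_ids, start=1):
        if item in relevant:
            hits += 1
            precisions.append(hits / k)
    return sum(precisions) / max(len(relevant), 1)

def mean_average_precision(queries):
    """MAP: average AP over (relevant_set, ranked_list) query pairs."""
    return sum(average_precision(r, l) for r, l in queries) / len(queries)
```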