Randomized algorithms for large-scale dictionary learning

Keywords
Computer science, K-SVD, Algorithm, Singular value decomposition, Kernel (algebra), Artificial intelligence, Matrix (mathematics), Sparse approximation, Mathematics, Combinatorics, Composite material, Materials science
Authors
Gang Wu, Jiali Yang
Source
Journal: Neural Networks [Elsevier BV]
Volume/Issue: 179: 106628-106628
Identifier
DOI: 10.1016/j.neunet.2024.106628
Abstract

Dictionary learning is an important sparse representation technique that has been widely used in machine learning and artificial intelligence. However, for the massive data sets of the big data era, classical dictionary learning algorithms are computationally expensive and can even be infeasible. To overcome this difficulty, we propose new dictionary learning methods based on randomized algorithms. The contributions of this work are as follows. First, we observe that the dictionary matrix is often numerically low-rank. Based on this property, we apply the randomized singular value decomposition (RSVD) to the dictionary matrix and propose a randomized algorithm for linear dictionary learning. Compared with the classical K-SVD algorithm, one advantage is that all the elements of the dictionary matrix can be updated simultaneously. Second, to the best of our knowledge, there are few theoretical results on why the matrix computation problems arising in dictionary learning can be solved inexactly. To fill this gap, we justify this randomized algorithm with inexact solving from a matrix perturbation analysis point of view. Third, based on the numerically low-rank property and a Nyström approximation of the kernel matrix, we propose a randomized kernel dictionary learning algorithm and bound the distance between the exact solution and the computed solution, which shows the effectiveness of the proposed randomized kernel dictionary learning algorithm. Fourth, we propose an efficient scheme for the testing stage in kernel dictionary learning. With this strategy, there is no need to form or store kernel matrices explicitly in either the training or the testing stage. Comprehensive numerical experiments are performed on several real-world data sets. The numerical results demonstrate the soundness of our strategies and show that the proposed algorithms are much more efficient than some state-of-the-art dictionary learning algorithms. The MATLAB codes of the proposed algorithms are publicly available at https://github.com/Jiali-yang/RALDL_RAKDL.
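
The two randomized building blocks named in the abstract, the RSVD of a numerically low-rank matrix and the Nyström approximation of a kernel matrix, are standard and easy to illustrate. Below is a minimal NumPy sketch of both; it is not the authors' MATLAB implementation (see the GitHub link above), and the function names, the target rank r, the oversampling, and the landmark count m are illustrative assumptions, as is the choice of an RBF kernel.

```python
# A minimal sketch of the two randomized primitives mentioned in the abstract.
# Not the authors' code: ranks, landmark counts, and the RBF kernel are assumptions.
import numpy as np

def rsvd(A, r, oversample=10, n_iter=2, seed=0):
    """Rank-r randomized SVD of A via a Gaussian sketch plus power iterations."""
    rng = np.random.default_rng(seed)
    k = r + oversample
    Omega = rng.standard_normal((A.shape[1], k))   # random test matrix
    Y = A @ Omega                                  # sample the range of A
    for _ in range(n_iter):                        # power iterations sharpen the range estimate
        Y = A @ (A.T @ Y)
    Q, _ = np.linalg.qr(Y)                         # orthonormal basis of the sampled range
    B = Q.T @ A                                    # small k-by-n projected matrix
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    return (Q @ Ub)[:, :r], s[:r], Vt[:r, :]

def nystrom_rbf(X, m, gamma=1.0, seed=0):
    """Nystrom approximation K ~= C @ W_pinv @ C.T of an RBF kernel matrix, m landmarks."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(X.shape[0], size=m, replace=False)      # landmark points
    sq = lambda A, B: ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    C = np.exp(-gamma * sq(X, X[idx]))                       # n-by-m kernel block
    W = C[idx]                                                # m-by-m landmark block
    return C, np.linalg.pinv(W)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Synthetic numerically low-rank "dictionary": 256 x 512 with effective rank 20.
    D = rng.standard_normal((256, 20)) @ rng.standard_normal((20, 512))
    U, s, Vt = rsvd(D, r=20)
    print("RSVD relative error:", np.linalg.norm(D - (U * s) @ Vt) / np.linalg.norm(D))

    X = rng.standard_normal((500, 10))
    C, W_pinv = nystrom_rbf(X, m=100, gamma=0.5)
    K = np.exp(-0.5 * ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
    print("Nystrom relative error:", np.linalg.norm(K - C @ W_pinv @ C.T) / np.linalg.norm(K))
```

In both routines, the cost is driven by the target rank or the number of landmarks rather than the full matrix dimension, which is why such randomized approximations are attractive for the large-scale setting the abstract targets.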