Non-negative Matrix Factorization
Divergence
Matrix factorization
Mathematics
Kullback-Leibler divergence
Pattern recognition
Matrix (mathematics)
Transformation
Cluster analysis
Feature
Nonlinear system
Artificial intelligence
Algorithm
Applied mathematics
Computer science
Philosophy
Physics
Gene
Feature vector
Composite material
Quantum mechanics
Chemistry
Materials science
Biochemistry
Linguistics
Authors
Lirui Hu, Ning Wu, Xiao Li
Identifier
DOI:10.1016/j.patcog.2022.108906
Abstract
• A new non-negative matrix factorization decomposition model is proposed.
• A new NMF method, called Feature Nonlinear Transformation Non-negative Matrix Factorization with Kullback-Leibler Divergence (FNTNMF-KLD), is proposed.
• New iterative update rules for the basis matrix and the feature matrix are rigorously derived.
• A proof of algorithm convergence is provided.
• Higher accuracy is achieved in object recognition and clustering.

This paper introduces Feature Nonlinear Transformation Non-negative Matrix Factorization with Kullback-Leibler Divergence (FNTNMF-KLD) for extracting the nonlinear features of a matrix in standard NMF. The method applies a nonlinear transformation to the feature matrix to construct an NMF model whose objective function is the Kullback-Leibler divergence, and it uses the Taylor series expansion and Newton's root-finding iteration to derive the iterative update rules for the basis matrix and the feature matrix. Experimental results show that the proposed method obtains the nonlinear features of the data matrix more efficiently, and in object recognition and clustering tasks it achieves better accuracy than several typical NMF methods.
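For context, the baseline that FNTNMF-KLD extends is standard NMF optimized under the Kullback-Leibler divergence with the classical Lee-Seung multiplicative updates. The sketch below shows only that well-known baseline; the paper's nonlinear feature transformation and its Newton/Taylor-based update rules are not reproduced here, and the function and parameter names are illustrative.

```python
import numpy as np

def nmf_kl(V, r, n_iter=200, eps=1e-10, seed=0):
    """Baseline NMF under the Kullback-Leibler divergence, using the
    Lee-Seung multiplicative updates. V (m x n, non-negative) is
    factorized as W @ H with W (m x r) the basis matrix and H (r x n)
    the feature matrix. This is NOT the paper's FNTNMF-KLD method."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r)) + eps   # basis matrix, kept strictly positive
    H = rng.random((r, n)) + eps   # feature (coefficient) matrix
    ones = np.ones_like(V)
    for _ in range(n_iter):
        WH = W @ H + eps
        # H_{a mu} <- H_{a mu} * [W^T (V/WH)]_{a mu} / [W^T 1]_{a mu}
        H *= (W.T @ (V / WH)) / (W.T @ ones + eps)
        WH = W @ H + eps
        # W_{i a} <- W_{i a} * [(V/WH) H^T]_{i a} / [1 H^T]_{i a}
        W *= ((V / WH) @ H.T) / (ones @ H.T + eps)
    return W, H

def kl_div(V, W, H, eps=1e-10):
    """Generalized KL divergence D(V || WH), the objective minimized above."""
    WH = W @ H + eps
    return float(np.sum(V * np.log((V + eps) / WH) - V + WH))
```

These multiplicative updates monotonically decrease the KL objective, which is why running more iterations from the same initialization never increases the reconstruction divergence.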