Discriminant
Computer science
Complementarity (molecular biology)
Feature selection
k-nearest neighbors algorithm
Artificial intelligence
Pairwise comparison
Pattern recognition (psychology)
Feature (linguistics)
Machine learning
Data mining
Linguistics
Genetics
Biology
Philosophy
Authors
Xuemeng Jiang, Jun Wang, Jinmao Wei, Jianhua Ruan, Gang Yu
Identifier
DOI:10.1007/978-3-319-97304-3_59
Abstract
Feature selection is crucial for dimension reduction. Dozens of approaches employ the area under the ROC curve (AUC) to evaluate features and have proven attractive for finding discriminative targets. However, these approaches generally handle feature complementarity, i.e., the ability of features to jointly discriminate classes, improperly. A recent approach addressed this issue by evaluating feature complementarity as the difference between the neighbors of each instance along different feature dimensions. This local-learning based approach introduces a distinctive way to determine how complementarily discriminative one feature is given another. Nevertheless, neighbor information is usually sensitive to noise. Furthermore, evaluating only the one-sided information of nearest misses neglects the impact of nearest hits on feature complementarity. In this paper, we propose to integrate all-sided local-learning based complementarity into an AUC-based approach, dubbed ANNC, which evaluates pairwise features by scrutinizing their comprehensive misclassification information in terms of both k-nearest misses and k-nearest hits. This strategy helps capture complementary features that collaborate with each other to achieve strong recognition performance. Extensive experiments on openly available benchmarks demonstrate the effectiveness of the new approach under various metrics.
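The abstract does not specify the ANNC scoring formula, so the following Python sketch is only an illustration of the general idea it describes: combining per-feature AUC with a local-learning (nearest-hit / nearest-miss) margin computed in a two-feature subspace as a rough proxy for pairwise complementarity. The function names, the Relief-style margin, and the use of scikit-learn are assumptions made for illustration; this is not the authors' ANNC algorithm.

# Illustrative sketch, NOT the ANNC method from the paper.
import numpy as np
from sklearn.metrics import roc_auc_score

def feature_auc(X, y):
    # Per-feature AUC, folded so that 0.5 (uninformative) maps to 0 and 1.
    return np.array([abs(roc_auc_score(y, X[:, j]) - 0.5) * 2
                     for j in range(X.shape[1])])

def local_margin(X_sub, y, k=5):
    # Relief-style margin in a feature subspace: mean distance to the
    # k nearest misses minus mean distance to the k nearest hits.
    margins = []
    for i in range(len(y)):
        d = np.linalg.norm(X_sub - X_sub[i], axis=1)
        d[i] = np.inf                              # exclude the instance itself
        same, diff = d[y == y[i]], d[y != y[i]]
        k_eff = min(k, len(same) - 1, len(diff))
        if k_eff < 1:
            continue
        hits = np.sort(same)[:k_eff].mean()        # k nearest hits
        misses = np.sort(diff)[:k_eff].mean()      # k nearest misses
        margins.append(misses - hits)              # larger = better separation
    return float(np.mean(margins))

# Toy usage: rank feature pairs by joint local margin plus individual AUCs.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(int)            # separable only jointly
auc = feature_auc(X, y)
pair_scores = {(a, b): local_margin(X[:, [a, b]], y) + auc[a] + auc[b]
               for a in range(4) for b in range(a + 1, 4)}
print(max(pair_scores, key=pair_scores.get))       # typically (0, 1)

In this toy setup the pair (0, 1) is complementary: each feature alone is only weakly discriminative, but together they separate the classes, so a joint nearest-hit/nearest-miss margin rewards the pair in a way that per-feature scores alone would not.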