An Adaptive Deep Metric Learning Loss Function for Class-Imbalance Learning via Intraclass Diversity and Interclass Distillation

Authors
Jie Du, Xiaoci Zhang, Peng Liu, Chi‐Man Vong, Tianfu Wang
Source
Journal: IEEE Transactions on Neural Networks and Learning Systems [Institute of Electrical and Electronics Engineers]
Pages: 1-15; Cited by: 1
Identifier
DOI: 10.1109/tnnls.2023.3286484
Abstract

Deep metric learning (DML) has been widely applied in various tasks (e.g., medical diagnosis and face recognition) due to its effective extraction of discriminant features by reducing data overlapping. However, in practice, these tasks also easily suffer from two class-imbalance learning (CIL) problems: data scarcity and data density, both of which cause misclassification. Existing DML losses rarely consider these two issues, while CIL losses cannot reduce data overlapping and data density. In fact, it is a great challenge for a single loss function to mitigate the impact of these three issues simultaneously, which is the objective of our proposed intraclass diversity and interclass distillation (IDID) loss with adaptive weight in this article. IDID-loss generates diverse features within classes regardless of the class sample size (to alleviate the issues of data scarcity and data density) and simultaneously preserves the semantic correlations between classes using a learnable similarity when pushing different classes away from each other (to reduce overlapping). In summary, our IDID-loss provides three advantages: 1) it can simultaneously mitigate all three issues, while DML and CIL losses cannot; 2) it generates more diverse and discriminant feature representations with higher generalization ability, compared with DML losses; and 3) it provides a larger improvement on the classes suffering from data scarcity and density with a smaller sacrifice of easy-class accuracy, compared with CIL losses. Experimental results on seven public real-world datasets show that our IDID-loss achieves the best performance in terms of G-mean, F1-score, and accuracy when compared with both state-of-the-art (SOTA) DML and CIL losses. In addition, it avoids the time-consuming fine-tuning of the loss-function hyperparameters.
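To make the structure of such a loss concrete, the sketch below is a minimal, generic illustration of the two ingredients the abstract names: an intraclass-diversity term that rewards spread within each class, and an interclass term that pushes class centroids apart while scaling the push by a class-similarity matrix so semantically related classes are separated less aggressively. This is an assumption-laden toy, not the authors' published IDID formulation: the function name `idid_style_loss`, the centroid-based distance terms, the hinge margin, and the fixed (rather than learnable) similarity matrix are all illustrative choices.

```python
import numpy as np

def idid_style_loss(features, labels, class_sim, margin=1.0, alpha=0.5):
    """Toy sketch of a diversity + distillation loss (illustrative only).

    features  : (n, d) array of embeddings
    labels    : (n,) integer class labels
    class_sim : (k, k) class-similarity matrix (learnable in the paper;
                fixed here for illustration)
    """
    classes = np.unique(labels)
    centroids = {c: features[labels == c].mean(axis=0) for c in classes}

    # Intraclass diversity: negative mean distance of samples to their
    # class centroid, so minimizing the loss encourages in-class spread.
    intra = 0.0
    for c in classes:
        pts = features[labels == c]
        intra -= np.linalg.norm(pts - centroids[c], axis=1).mean()

    # Interclass term: hinge on centroid distances, down-weighted for
    # similar classes to preserve semantic correlations between them.
    inter = 0.0
    for i, ci in enumerate(classes):
        for cj in classes[i + 1:]:
            d = np.linalg.norm(centroids[ci] - centroids[cj])
            w = 1.0 - class_sim[ci, cj]  # similar classes pushed less
            inter += w * max(0.0, margin - d)

    return alpha * intra + (1 - alpha) * inter
```

With two well-separated classes the hinge term vanishes and only the diversity term contributes, so the loss decreases as in-class spread grows; in the paper the trade-off weight is adaptive rather than the fixed `alpha` used here.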