Computer science
Meta-learning (computer science)
Artificial intelligence
Inference
Artificial neural network
Calibration
Generalization
Machine learning
Adaptation (eye)
Task (project management)
Deep learning
Path (computing)
Mathematics
Economics
Programming language
Management
Mathematical analysis
Physics
Optics
Statistics
Authors
Peng Yang, Shaogang Ren, Yang Zhao, Ping Li
Identifier
DOI:10.1109/wacv51458.2022.00048
Abstract
Although few-shot meta-learning has been extensively studied in the machine learning community, fast adaptation to new tasks remains a challenge in the few-shot learning scenario. Neuroscience research reveals that the capability of evolving the neural network formulation is essential for task adaptation, which has been broadly studied in recent meta-learning research. In this paper, we present a novel forward-backward meta-learning framework (FBM) to facilitate model generalization in few-shot learning from a new perspective, i.e., neuron calibration. In particular, FBM models the neurons in a deep neural network-based model as calibrated units under a general formulation, where neuron calibration can endow neural network-based models with fast adaptation capability by influencing both their forward inference path and backward propagation path. The proposed calibration scheme is lightweight and applicable to various feed-forward neural network architectures. Extensive empirical experiments on challenging few-shot learning benchmarks validate that our approach, trained with neuron calibration, achieves promising performance, demonstrating that neuron calibration plays a vital role in improving few-shot learning performance.
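The abstract states that calibrating neurons influences both the forward inference path and the backward propagation path. A minimal sketch of how that can happen (an illustration only, not the authors' FBM code): if each neuron's activation is rescaled by a per-neuron calibration parameter `gamma` (a name assumed here for illustration), then by the chain rule the same `gamma` also scales the gradient flowing back to that neuron's weights.

```python
# Hypothetical single-neuron sketch: calibration as a per-neuron scale `gamma`
# and shift `beta` applied to the activation. These parameter names are
# assumptions for illustration, not taken from the paper.

def forward(x, w, gamma, beta):
    """Calibrated neuron (scalar case): y = gamma * relu(w * x) + beta."""
    pre = w * x
    act = max(pre, 0.0)          # ReLU activation
    return gamma * act + beta    # calibration reshapes the forward path

def grad_w(x, w, gamma):
    """dy/dw of the calibrated neuron: gamma also scales the backward path."""
    relu_grad = 1.0 if w * x > 0 else 0.0
    return gamma * relu_grad * x

# Doubling gamma doubles both the output and the weight gradient,
# so one calibration parameter steers inference and adaptation alike.
y_plain = forward(1.5, 0.8, 1.0, 0.0)   # uncalibrated neuron
y_calib = forward(1.5, 0.8, 2.0, 0.0)   # calibrated neuron
g_plain = grad_w(1.5, 0.8, 1.0)
g_calib = grad_w(1.5, 0.8, 2.0)
```

Because the calibration parameters enter the gradient of every weight they gate, meta-learning them is one lightweight way to speed up adaptation without changing the backbone architecture, which is consistent with the abstract's claim that the scheme applies to various feed-forward networks.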