Keywords
Softmax function
Machine learning
Computer science
Artificial intelligence
Multi-task learning
Inference
Prior probability
Task (project management)
Leverage (statistics)
Gumbel distribution
Benchmark (surveying)
Probabilistic logic
Bayesian inference
Bayesian probability
Deep learning
Mathematics
Statistics
Management
Extreme value theory
Geodesy
Economics
Geography
Title
Variational Multi-Task Learning with Gumbel-Softmax Priors
Authors
Jiayi Shen, Xiantong Zhen, Marcel Worring, Ling Shao
Source
Venue: arXiv (Cornell University)
Date: 2021-11
Citations: 9
Identifier
DOI:10.48550/arxiv.2111.05323
Abstract
Multi-task learning aims to explore task relatedness to improve individual tasks, which is of particular significance in the challenging scenario where only limited data is available for each task. To tackle this challenge, we propose variational multi-task learning (VMTL), a general probabilistic inference framework for learning multiple related tasks. We cast multi-task learning as a variational Bayesian inference problem, in which task relatedness is explored in a unified manner by specifying priors. To incorporate shared knowledge into each task, we design the prior of a task to be a learnable mixture of the variational posteriors of other related tasks, which is learned by the Gumbel-Softmax technique. In contrast to previous methods, our VMTL can exploit task relatedness for both representations and classifiers in a principled way by jointly inferring their posteriors. This enables individual tasks to fully leverage inductive biases provided by related tasks, thereby improving the overall performance of all tasks. Experimental results demonstrate that the proposed VMTL is able to effectively tackle a variety of challenging multi-task learning settings with limited training data for both classification and regression. Our method consistently surpasses previous methods, including strong Bayesian approaches, and achieves state-of-the-art performance on five benchmark datasets.
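To make the core mechanism concrete, below is a minimal PyTorch sketch, not the authors' released code, of how a task's prior can be formed as a Gumbel-Softmax-weighted mixture of the other tasks' variational posteriors. The class and function names, the diagonal-Gaussian posteriors, and the moment-matching of the mixture to a single Gaussian (to keep the KL term in closed form) are all illustrative assumptions.

```python
# Sketch of the VMTL idea from the abstract: the prior for task t is a
# learnable mixture of the variational posteriors of the other tasks,
# with mixture weights relaxed via Gumbel-Softmax. Shapes, names, and
# the moment-matching step are assumptions, not the paper's exact method.
import torch
import torch.nn.functional as F

class MixturePrior(torch.nn.Module):
    def __init__(self, num_tasks: int, tau: float = 1.0):
        super().__init__()
        # One learnable logit per "other" task; Gumbel-Softmax over these
        # yields differentiable, approximately one-hot mixture weights.
        self.logits = torch.nn.Parameter(torch.zeros(num_tasks - 1))
        self.tau = tau

    def forward(self, other_means, other_logvars):
        # other_means / other_logvars: [T-1, D] stacked parameters of the
        # diagonal-Gaussian variational posteriors of the other tasks.
        w = F.gumbel_softmax(self.logits, tau=self.tau, hard=False)  # [T-1]
        # Moment-match the weighted mixture to a single diagonal Gaussian
        # (an assumption made here so KL(q || p) stays in closed form).
        mean = (w[:, None] * other_means).sum(0)
        second_moment = (w[:, None] * (other_logvars.exp() + other_means**2)).sum(0)
        var = second_moment - mean**2
        return mean, var.clamp_min(1e-8).log()

def kl_diag_gaussians(mu_q, logvar_q, mu_p, logvar_p):
    # KL(q || p) between diagonal Gaussians, summed over dimensions.
    return 0.5 * (
        logvar_p - logvar_q
        + (logvar_q.exp() + (mu_q - mu_p) ** 2) / logvar_p.exp()
        - 1.0
    ).sum()

# Usage: the prior for task 0 is built from the posteriors of tasks 1..T-1,
# and the resulting KL term would enter that task's variational objective.
T, D = 4, 16
post_mu, post_logvar = torch.randn(T, D), torch.zeros(T, D)
prior = MixturePrior(num_tasks=T)
mu_p, logvar_p = prior(post_mu[1:], post_logvar[1:])
kl = kl_diag_gaussians(post_mu[0], post_logvar[0], mu_p, logvar_p)
```

In this reading, each task regularizes its posterior toward knowledge shared by related tasks, and the Gumbel-Softmax relaxation lets the gradient decide which tasks are related rather than fixing the mixture weights by hand.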