Computer science
Leverage (statistics)
Inference
Gaussian process
Machine learning
Bayesian inference
Artificial intelligence
Multi-task learning
Domain (mathematical analysis)
Inductive bias
Bayesian probability
Curse of dimensionality
Gaussian distribution
Mathematics
Task (project management)
Mathematical analysis
Physics
Management
Quantum mechanics
Economics
Authors
Haitao Liu, Kai Wu, Yew-Soon Ong, Chao Bian, Xiaomo Jiang, Xiaofang Wang
Identifier
DOI: 10.1109/tsmc.2023.3281973
Abstract
Multitask Gaussian process (MTGP) is a well-known nonparametric Bayesian model for learning correlated tasks effectively by transferring knowledge across tasks. However, current MTGPs are usually limited to multitask scenarios defined over the same input domain, leaving no room for the heterogeneous case, i.e., where the features of the input domains vary across tasks. To this end, this article presents a novel heterogeneous stochastic variational linear model of coregionalization (HSVLMC) for simultaneously learning tasks with varied input domains. In particular, we develop a stochastic variational framework with Bayesian calibration that: 1) infers posterior domain mappings to account for the dimensionality reduction induced by domain mappings, thereby achieving effective input alignment; and 2) employs a residual modeling strategy to leverage the inductive bias brought by prior domain mappings for better model inference. Finally, the superiority of the proposed model over existing heterogeneous LMC models has been extensively verified on diverse heterogeneous multitask cases and a practical multifidelity steam turbine exhaust case.
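The abstract builds on the linear model of coregionalization (LMC), in which each task's latent function is a weighted sum of shared latent Gaussian processes, and handles heterogeneous inputs by first mapping each task's input domain into a shared latent space. The sketch below is a minimal, illustrative NumPy implementation of only that covariance structure: the names `rbf_kernel`, `lmc_cov`, the linear projections `W`, and the coregionalization weights `A` are hypothetical assumptions for exposition, and it does not reproduce the paper's stochastic variational inference, Bayesian calibration of posterior mappings, or residual modeling.

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel evaluated on the shared latent space.
    sq = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2.0 * X1 @ X2.T
    return variance * np.exp(-0.5 * sq / lengthscale**2)

def lmc_cov(Z1, t1, Z2, t2, A, lengthscales):
    # Standard LMC cross-covariance between tasks t1 and t2 at aligned
    # latent inputs Z1, Z2:
    #   cov(f_{t1}(z), f_{t2}(z')) = sum_q A[t1, q] * A[t2, q] * k_q(z, z')
    K = np.zeros((Z1.shape[0], Z2.shape[0]))
    for q in range(A.shape[1]):
        K += A[t1, q] * A[t2, q] * rbf_kernel(Z1, Z2, lengthscales[q])
    return K

rng = np.random.default_rng(0)
# Hypothetical linear domain mappings g_t projecting heterogeneous inputs
# (task 0 is 5-D, task 1 is 3-D) into a shared 2-D latent space.
W = {0: rng.normal(size=(5, 2)), 1: rng.normal(size=(3, 2))}
g = lambda X, t: X @ W[t]

X0, X1 = rng.normal(size=(10, 5)), rng.normal(size=(8, 3))
A = rng.normal(size=(2, 2))  # coregionalization weights: T tasks x Q latent GPs
K_cross = lmc_cov(g(X0, 0), 0, g(X1, 1), 1, A, lengthscales=[1.0, 2.0])
print(K_cross.shape)  # (10, 8) cross-task covariance block
```

In the paper's setting the mappings g_t are not fixed projections as above; the HSVLMC framework instead places a prior over them and infers posterior mappings variationally, which is what allows the model to calibrate the information loss from dimensionality reduction.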