Partial least squares regression
Multivariate statistics
Artificial intelligence
Computer science
Invariant (physics)
Regression
Pattern recognition (psychology)
Domain (mathematical analysis)
Mathematics
Statistics
Machine learning
Mathematical analysis
Mathematical physics
Authors
Bianca Mikulasek,Valeria Fonseca Díaz,David Gabauer,Christoph Herwig,Ramin Nikzad‐Langerodi
Abstract
This paper introduces the multiple domain‐invariant partial least squares (mdi‐PLS) method, which generalizes the recently introduced domain‐invariant partial least squares (di‐PLS) method. In contrast to di‐PLS, which only allows transferring knowledge from a single source to a single target domain, the proposed approach enables the incorporation of data from an arbitrary number of domains. Additionally, mdi‐PLS offers a high level of flexibility by accepting both labeled (supervised) and unlabeled (unsupervised) data to cope with dataset shifts. We demonstrate the application of the mdi‐PLS method on one simulated and one real‐world dataset. Our results show that mdi‐PLS clearly outperforms both PLS and di‐PLS when data from multiple related domains are available for training multivariate calibration models, underscoring the benefit of mdi‐PLS.
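To make the baseline setting concrete, the sketch below fits an ordinary PLS calibration model with scikit-learn's PLSRegression and applies it to shifted target-domain data. This only illustrates the multivariate calibration problem and the effect of a dataset shift; mdi‐PLS itself is the paper's contribution and is not part of scikit-learn, and the simulated data, baseline offset, and number of latent variables are illustrative assumptions.

```python
# Minimal sketch of a standard PLS calibration under an assumed domain shift.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)

# Simulated source-domain spectra (100 samples x 50 wavelengths) and a single
# response variable (e.g., an analyte concentration) -- purely illustrative.
X_source = rng.normal(size=(100, 50))
y_source = X_source[:, :5].sum(axis=1) + 0.1 * rng.normal(size=100)

# Ordinary PLS ignores domain structure: it is fit on the source domain only.
pls = PLSRegression(n_components=5)
pls.fit(X_source, y_source)

# Hypothetical target-domain spectra with a simple additive baseline offset.
# Predictions of the plain PLS model degrade under such shifts, which is the
# problem di-PLS / mdi-PLS address by learning latent variables that are
# invariant across (multiple) domains.
X_target = X_source + 0.5
y_pred = pls.predict(X_target)
print("Prediction error under shift:", np.mean((y_pred.ravel() - y_source) ** 2))
```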