Cluster analysis
Gibbs sampling
Mathematics
Bayes' theorem
Data mining
Correlation clustering
Computer science
Clustering high-dimensional data
Bayesian probability
Machine learning
Artificial intelligence
Authors
Tommaso Rigon, Amy H. Herring, David B. Dunson
Source
Journal: Biometrika
[Oxford University Press]
Date: 2023-01-19
Volume/Issue: 110 (3): 559-578
Citations: 6
Identifier
DOI:10.1093/biomet/asad004
Abstract
Loss-based clustering methods, such as k-means clustering and its variants, are standard tools for finding groups in data. However, the lack of quantification of uncertainty in the estimated clusters is a disadvantage. Model-based clustering based on mixture models provides an alternative approach, but such methods face computational problems and are highly sensitive to the choice of kernel. In this article we propose a generalized Bayes framework that bridges between these paradigms through the use of Gibbs posteriors. In conducting Bayesian updating, the loglikelihood is replaced by a loss function for clustering, leading to a rich family of clustering methods. The Gibbs posterior represents a coherent updating of Bayesian beliefs without needing to specify a likelihood for the data, and can be used for characterizing uncertainty in clustering. We consider losses based on Bregman divergence and pairwise similarities, and develop efficient deterministic algorithms for point estimation along with sampling algorithms for uncertainty quantification. Several existing clustering algorithms, including k-means, can be interpreted as generalized Bayes estimators in our framework, and thus we provide a method of uncertainty quantification for these approaches, allowing, for example, calculation of the probability that a data point is well clustered.
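The core idea in the abstract can be illustrated with a toy sketch (not the paper's implementation): replacing the log-likelihood with a k-means-style squared-error loss yields a Gibbs posterior over cluster labels with full conditionals proportional to exp(-λ · loss), and sampling from it gives co-clustering probabilities that quantify uncertainty. The data, the fixed K, and the loss scale λ = 2 below are all illustrative assumptions, not choices from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two well-separated 1-d groups (illustrative only).
X = np.concatenate([rng.normal(-3, 0.5, 30), rng.normal(3, 0.5, 30)])

K = 2      # number of clusters, assumed known here
lam = 2.0  # loss scale lambda; a hypothetical choice
n = len(X)
labels = rng.integers(0, K, n)

def centers(X, labels, K):
    # Cluster means; fall back to a random data point for an empty cluster.
    return np.array([X[labels == k].mean() if np.any(labels == k)
                     else rng.choice(X) for k in range(K)])

samples = []
for it in range(200):
    mu = centers(X, labels, K)
    for i in range(n):
        # Gibbs-posterior full conditional: weight each label by
        # exp(-lambda * squared-error loss), shifted for stability.
        loss = (X[i] - mu) ** 2
        w = np.exp(-lam * (loss - loss.min()))
        labels[i] = rng.choice(K, p=w / w.sum())
    if it >= 100:  # keep post-burn-in samples
        samples.append(labels.copy())

samples = np.asarray(samples)
# Posterior probability that points 0 and 1 (same true group) co-cluster:
p_same = np.mean(samples[:, 0] == samples[:, 1])
```

With well-separated groups, `p_same` should be close to 1, while the analogous probability for points drawn from different groups should be near 0; smaller λ flattens the Gibbs posterior and inflates the reported uncertainty.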