Quantized Training of Gradient Boosting Decision Trees

Computer science, speedup, computation, boosting (machine learning), gradient boosting, alternating decision tree, artificial intelligence, decision tree, algorithm, training, machine learning, theoretical computer science, parallel computing, decision tree learning, random forest, incremental decision tree
Authors
Shuang Yu, Guolin Ke, Zhuoming Chen, Shuxin Zheng, Tie-Yan Liu
Source
Journal: Cornell University - arXiv
Identifier
DOI: 10.48550/arxiv.2207.09682
Abstract

Recent years have witnessed significant success of Gradient Boosting Decision Trees (GBDT) in a wide range of machine learning applications. Generally, the consensus about GBDT training algorithms is that gradients and statistics are computed with high-precision floating-point numbers. In this paper, we investigate an essentially important question which has been largely ignored by the previous literature: how many bits are needed to represent gradients in GBDT training? To answer this question, we propose to quantize all the high-precision gradients, in a very simple yet effective way, within the GBDT training algorithm. Surprisingly, both our theoretical analysis and empirical studies show that the gradient precision needed to preserve performance can be quite low, e.g., 2 or 3 bits. With low-precision gradients, most arithmetic operations in GBDT training can be replaced by integer operations of 8, 16, or 32 bits. Promisingly, these findings may pave the way for much more efficient GBDT training in several respects: (1) speeding up the computation of gradient statistics in histograms; (2) compressing the cost of communicating high-precision statistical information during distributed training; (3) inspiring the use and development of hardware architectures that support low-precision computation well. Benchmarked on CPUs, GPUs, and distributed clusters, we observe up to a 2$\times$ speedup of our simple quantization strategy compared with SOTA GBDT systems on extensive datasets, demonstrating the effectiveness and potential of low-precision GBDT training. The code will be released to the official repository of LightGBM.
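The scheme the abstract describes hinges on two ideas: map each float gradient to a few-bit integer code in an unbiased way, then accumulate those codes into per-bin histograms using integer arithmetic only. Below is a minimal NumPy sketch of that idea under stated assumptions: it uses stochastic rounding with a single shared scale per gradient vector, and the function names (quantize_gradients, histogram_sums), bin layout, and bit widths are illustrative, not the paper's exact algorithm or the LightGBM API.

import numpy as np

def quantize_gradients(grad, bits=3, rng=None):
    """Quantize float gradients to signed integer codes in
    [-(2**(bits-1) - 1), 2**(bits-1) - 1] via stochastic rounding."""
    rng = np.random.default_rng() if rng is None else rng
    max_code = 2 ** (bits - 1) - 1            # e.g. max_code = 3 for 3 bits
    scale = np.max(np.abs(grad)) / max_code   # one shared scale (assumed nonzero)
    scaled = grad / scale                     # now in [-max_code, max_code]
    low = np.floor(scaled)
    # Round up with probability equal to the fractional part, so that
    # E[code] == scaled: the quantized gradient is unbiased.
    codes = low + (rng.random(grad.shape) < (scaled - low))
    return codes.astype(np.int8), scale

def histogram_sums(bin_idx, codes, n_bins):
    """Accumulate quantized gradient codes per feature bin using only
    integer adds; int32 accumulators suffice at these bit widths."""
    hist = np.zeros(n_bins, dtype=np.int32)
    np.add.at(hist, bin_idx, codes.astype(np.int32))  # unbuffered scatter-add
    return hist

# Usage: quantize once per boosting iteration, build integer histograms,
# and de-quantize only the per-bin sums when evaluating split gains.
rng = np.random.default_rng(0)
grad = rng.normal(size=10_000).astype(np.float32)   # stand-in gradients
bin_idx = rng.integers(0, 64, size=grad.shape)      # precomputed feature bins
codes, scale = quantize_gradients(grad, bits=3, rng=rng)
hist = histogram_sums(bin_idx, codes, n_bins=64)
approx_bin_sums = hist * scale                      # float sums, recovered once per bin

Because stochastic rounding is unbiased, the per-bin gradient sums, and hence the split gains computed from them, are preserved in expectation even at 2 or 3 bits; the speedup the abstract reports comes from replacing the float accumulations in histogram construction with narrow integer adds, which also shrinks the statistics exchanged in distributed training.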