Computer science
Queueing theory
Mathematical optimization
Gradient descent
Inference
Regret
Convergence (economics)
Stochastic gradient descent algorithm
Verifiable secret sharing
Scalability
Statistical inference
Queueing
Stochastic optimization
Divergence (linguistics)
Layered queueing network
Rate of convergence
Stochastic process
Limit (mathematics)
Markov decision process
Stochastic approximation
Optimization problem
Dynamical systems theory
Optimal control
Stochastic control
Central limit theorem
Collocation (remote sensing)
Gradient method
Applied mathematics
Markov process
Coupling (piping)
Estimator
Uncertainty quantification
Authors
X. Z. Li, Jiadong Liang, Xinyun Chen, Zhihua Zhang
Source
Journal: Operations Research (Institute for Operations Research and the Management Sciences)
Date: 2026-01-28
Identifiers
DOI: 10.1287/opre.2025.1662
Abstract
Stream SGD: Fast Learning and Valid Inference from Dependent Data
Many online optimization problems in operations research rely on data generated by Markovian systems whose dynamics depend on the decision parameters, creating both statistical dependence and biased gradient information. This paper, “Convergence and Inference of Stream Stochastic Gradient Descent, with Applications to Queueing Systems and Inventory Control,” develops a unified theory for stream stochastic gradient descent (SGD), a sample-efficient method that uses just one observation per iteration. Using Poisson-equation techniques, the authors quantify and control gradient bias and dependence, proving an optimal [Formula: see text] convergence rate and a state-of-the-art O(log T) regret bound. Beyond optimization performance, the paper introduces an online inference framework for uncertainty quantification and establishes a functional central limit theorem that underpins valid asymptotic inference. A new Wasserstein-type divergence yields verifiable conditions via coupling arguments tailored to operations research models. Applications to queueing and inventory problems demonstrate how the theory translates into practical, scalable algorithms.
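To make the abstract's description concrete, below is a minimal, hypothetical Python sketch of stream-style SGD that consumes exactly one observation of a Markovian data stream per iteration and keeps a running (Polyak-Ruppert) average of the iterates, as one would for inference. The newsvendor cost, AR(1) demand process, step sizes, and all function names are illustrative assumptions, not the paper's model or code; in particular, the demand chain here does not depend on the decision variable, whereas the paper's setting allows parameter-dependent dynamics and biased gradients.

# Hypothetical sketch of stream SGD on Markovian data (not the paper's algorithm).
# Costs, demand dynamics, and step sizes below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

h, p = 1.0, 4.0          # assumed holding and backlog costs per unit
rho, sigma = 0.8, 1.0    # assumed AR(1) demand parameters

def next_demand(d_prev):
    # One transition of a Markovian (AR(1)-type) demand process, truncated at 0.
    return max(0.0, 5.0 + rho * (d_prev - 5.0) + sigma * rng.normal())

def pathwise_gradient(theta, d):
    # Gradient of the per-period cost h*(theta-d)^+ + p*(d-theta)^+
    # with respect to the base-stock level theta, from a single observation.
    return h if theta > d else -p

T = 20_000
theta = 0.0              # initial base-stock level
d = 5.0                  # initial demand state
avg = 0.0                # running average of the iterates

for t in range(1, T + 1):
    d = next_demand(d)                 # observe ONE new transition of the data chain
    g = pathwise_gradient(theta, d)    # single-sample stochastic gradient
    theta -= (1.0 / t**0.6) * g        # Robbins-Monro step size a_t = t^(-0.6)
    theta = max(theta, 0.0)            # project onto the feasible set [0, inf)
    avg += (theta - avg) / t           # Polyak-Ruppert averaging

print(f"last iterate: {theta:.3f}, averaged iterate: {avg:.3f}")
# With these assumed costs, theta should approach the p/(p+h) = 0.8 quantile
# of the stationary demand distribution.

The averaged iterate is the quantity whose fluctuations a functional central limit theorem of the kind described in the abstract would characterize; the sketch only illustrates the one-observation-per-iteration mechanics, not the inference procedure itself.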