Keywords
Computer science
Bottleneck
Big data
Benchmark (surveying)
Convergence (economics)
Variance reduction
Variance (accounting)
Algorithm
Distributed algorithm
Artificial intelligence
Data mining
Distributed computing
Business
Embedded system
Economics
Accounting
Geography
Economic growth
Geodesy
Authors
Changsheng Wu, Huihui Wang
Identifier
DOI:10.1109/dasc/picom/cbdcom/cy55231.2022.9927777
Abstract
With the development of mobile Internet technology, various applications generate huge amounts of data in Cyber-Physical-Social systems. The exponential growth of data poses great difficulties for big data classification, especially in time efficiency. The Alternating Direction Method of Multipliers (ADMM) is widely used for distributed machine learning tasks. However, it usually suffers from a slow convergence speed, and thus communication remains a significant bottleneck of distributed algorithms. To this end, in this paper, we focus on subproblem optimization in distributed algorithms and propose a novel Distributed Accelerated Stochastic Variance Reduced Gradient algorithm (DAcSVRG+) for big data classification. Specifically, we study the alternating direction method of multipliers under a distributed learning framework and transform the global classification problem into several small subproblems that can be solved in parallel. For the subproblem optimization, we adopt a variance reduction algorithm with a Nesterov acceleration strategy, the accelerated stochastic variance reduced gradient algorithm, to solve the subproblems and thus further improve time efficiency. The experimental results on four public benchmark datasets show that our proposed distributed algorithm converges faster and achieves competitive accuracy compared with other distributed classification methods with variance reduction.
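The abstract describes solving each ADMM subproblem with a stochastic variance reduced gradient method combined with Nesterov-style momentum. The following is a minimal illustrative sketch of that general technique (SVRG inner loop plus a momentum look-ahead step) on a single subproblem; it is an assumed reading of the abstract, not the authors' DAcSVRG+ implementation, and all names, step sizes, and the least-squares example are hypothetical.

```python
import numpy as np

def accelerated_svrg(grad_i, w0, n, lr=0.02, momentum=0.5, epochs=50, rng=None):
    """Sketch: SVRG with a Nesterov-style momentum step for one subproblem.

    grad_i(w, i) must return the gradient of the i-th sample's loss at w.
    Hypothetical illustration of "variance reduction + Nesterov acceleration";
    hyperparameters are placeholders, not values from the paper.
    """
    rng = rng or np.random.default_rng(0)
    w = w0.copy()
    v = np.zeros_like(w0)  # momentum buffer
    for _ in range(epochs):
        # Snapshot point and its full gradient (the SVRG control variate).
        w_snap = w.copy()
        full_grad = np.mean([grad_i(w_snap, i) for i in range(n)], axis=0)
        for _ in range(n):  # one inner pass per epoch
            i = rng.integers(n)
            look = w + momentum * v  # Nesterov look-ahead point
            # Variance-reduced stochastic gradient at the look-ahead point.
            g = grad_i(look, i) - grad_i(w_snap, i) + full_grad
            v = momentum * v - lr * g
            w = w + v
    return w

# Hypothetical usage on a toy least-squares subproblem.
rng = np.random.default_rng(1)
n, d = 20, 3
A = rng.normal(size=(n, d))
w_true = np.array([1.0, -2.0, 0.5])
b = A @ w_true
grad_i = lambda w, i: A[i] * (A[i] @ w - b[i])
w_est = accelerated_svrg(grad_i, np.zeros(d), n)
```

In a full ADMM setup, each worker would run a loop like this on its local data shard, with the ADMM penalty term folded into `grad_i`, before communicating its local iterate for the global consensus update.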