Keywords
Gradient descent; Stochastic gradient descent algorithm; Algorithm; Computer science; Function (biology); Gradient method; Binary number; Mathematical optimization; Mathematics; Artificial intelligence; Artificial neural network; Evolutionary biology; Biology; Arithmetic
Authors
Xin Wang, Liting Yan, Qizhi Zhang
Identifier
DOI:10.1109/iccnea53019.2021.00014
Abstract
The gradient descent algorithm is an optimization algorithm widely used to fit the parameters of machine learning models. Through continuous iteration, it computes the gradient of the objective function, gradually approaches the function's optimal solution, and finally obtains the minimum loss and the associated parameters. Gradient descent is frequently used to train logistic regression, a common binary classification method. Through experiments, this paper compares and analyzes the differences between batch gradient descent and its derivative algorithms, the stochastic gradient descent and mini-batch gradient descent algorithms, in terms of the number of iterations and the value of the loss function, and offers suggestions on how to choose the best algorithm for binary logistic regression tasks in machine learning.
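The comparison the abstract describes can be sketched in code. The following is an illustrative implementation, not the paper's actual code: the toy dataset, learning rate, epoch count, and batch sizes are all assumptions. The three variants differ only in how many examples are used per gradient step — the full set (batch), one example (stochastic), or a small subset (mini-batch).

```python
# Hypothetical sketch (not the paper's code): batch, stochastic, and
# mini-batch gradient descent for logistic regression, differing only
# in the batch size used for each parameter update.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def log_loss(w, X, y):
    # Mean cross-entropy loss of logistic regression.
    p = sigmoid(X @ w)
    eps = 1e-12  # guard against log(0)
    return -np.mean(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))

def gradient(w, X, y):
    # Gradient of the mean cross-entropy loss with respect to w.
    return X.T @ (sigmoid(X @ w) - y) / len(y)

def train(X, y, batch_size, lr=0.5, epochs=200, seed=0):
    # batch_size == n      -> batch gradient descent
    # batch_size == 1      -> stochastic gradient descent
    # 1 < batch_size < n   -> mini-batch gradient descent
    rng = np.random.default_rng(seed)
    n = len(y)
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        idx = rng.permutation(n)          # reshuffle each epoch
        for start in range(0, n, batch_size):
            b = idx[start:start + batch_size]
            w -= lr * gradient(w, X[b], y[b])
    return w

# Toy linearly separable data: label is 1 when x1 + x2 > 0 (assumed setup).
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 2))
X = np.hstack([X, np.ones((200, 1))])     # append a bias column
y = (X[:, 0] + X[:, 1] > 0).astype(float)

for name, bs in [("batch", 200), ("mini-batch", 32), ("stochastic", 1)]:
    w = train(X, y, batch_size=bs)
    print(f"{name:>10}: final loss = {log_loss(w, X, y):.4f}")
```

In practice the trade-off the paper studies appears here directly: batch descent makes smooth but expensive updates, stochastic descent makes cheap but noisy ones, and mini-batch descent sits between the two, which is why the batch size is the main knob to tune when choosing among them.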