Machine Learning: Gradient Descent
2023-08-29 · 飞猪的浪漫
Summary:
Batch gradient descent: uses all examples in each iteration;
Stochastic gradient descent: uses 1 example in each iteration;
Mini-batch gradient descent: uses b examples in each iteration.
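The three variants above differ only in how many examples feed each update step. A minimal sketch, using linear regression as an illustrative problem (all function names, data, and hyperparameters here are hypothetical, not from the original post): setting the batch size to the full dataset gives batch gradient descent, 1 gives stochastic gradient descent, and an intermediate value b gives mini-batch gradient descent.

```python
import numpy as np

def make_data(n=200, seed=0):
    # Synthetic data for y = 3x + 0.5 with small Gaussian noise.
    rng = np.random.default_rng(seed)
    x = rng.uniform(-1, 1, n)
    y = 3.0 * x + 0.5 + rng.normal(0, 0.05, n)
    return x, y

def gradient(w, b, x, y):
    # Gradient of (1/2) the mean squared error over the given examples.
    err = w * x + b - y
    return err @ x / len(x), err.mean()

def gd(x, y, batch_size, lr=0.1, epochs=300, seed=1):
    # batch_size == len(x): batch GD
    # batch_size == 1      : stochastic GD
    # batch_size == b      : mini-batch GD
    rng = np.random.default_rng(seed)
    w, b = 0.0, 0.0
    n = len(x)
    for _ in range(epochs):
        idx = rng.permutation(n)          # shuffle once per epoch
        for start in range(0, n, batch_size):
            j = idx[start:start + batch_size]
            gw, gb = gradient(w, b, x[j], y[j])
            w -= lr * gw
            b -= lr * gb
    return w, b

x, y = make_data()
for bs in (len(x), 1, 32):  # batch, stochastic, mini-batch
    w, b = gd(x, y, bs)
    print(f"batch_size={bs}: w={w:.3f}, b={b:.3f}")
```

All three recover parameters close to (w, b) = (3.0, 0.5); the trade-off is that smaller batches take noisier steps but perform many more updates per pass over the data.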