TensorFlow: Using the Adam Optimizer
2019-10-23
LoveSkye
Constructor signature:

```python
def __init__(self, learning_rate=0.001, beta1=0.9, beta2=0.999, epsilon=1e-8,
             use_locking=False, name="Adam"):
```
Args:
- learning_rate: A Tensor or a floating point value. The learning rate.
- beta1: A float value or a constant float tensor. The exponential decay rate for the 1st moment estimates.
- beta2: A float value or a constant float tensor. The exponential decay rate for the 2nd moment estimates.
- epsilon: A small constant for numerical stability; it keeps the denominator of the update from being zero.
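As a quick illustration, here is a minimal TensorFlow 1.x sketch that wires the optimizer into a training loop. The toy linear model, placeholder shapes, and feed values are made up for the example; only `tf.train.AdamOptimizer` and its hyperparameters come from the signature above:

```python
import tensorflow as tf  # TensorFlow 1.x API

# Toy linear model; the placeholders, variables, and loss here are
# illustrative, purely to give minimize() something to optimize.
x = tf.placeholder(tf.float32, shape=[None, 1])
y = tf.placeholder(tf.float32, shape=[None, 1])
w = tf.Variable(tf.zeros([1, 1]))
b = tf.Variable(tf.zeros([1]))
loss = tf.reduce_mean(tf.square(tf.matmul(x, w) + b - y))

# Construct the optimizer with the default hyperparameters from the
# signature above, then build the training op.
optimizer = tf.train.AdamOptimizer(learning_rate=0.001, beta1=0.9,
                                   beta2=0.999, epsilon=1e-8)
train_op = optimizer.minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(100):
        _, cur_loss = sess.run([train_op, loss],
                               feed_dict={x: [[1.0]], y: [[2.0]]})
```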
Screenshot from the paper: [figure: algorithm details (Algorithm 1 of the Adam paper)]
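In place of the screenshot, the per-step update rules from Algorithm 1 of the Adam paper (Kingma & Ba) are, for gradient $g_t$, parameters $\theta_t$, and step size $\alpha$:

$$
\begin{aligned}
m_t &= \beta_1 m_{t-1} + (1 - \beta_1)\, g_t \\
v_t &= \beta_2 v_{t-1} + (1 - \beta_2)\, g_t^2 \\
\hat{m}_t &= \frac{m_t}{1 - \beta_1^t}, \qquad \hat{v}_t = \frac{v_t}{1 - \beta_2^t} \\
\theta_t &= \theta_{t-1} - \frac{\alpha\, \hat{m}_t}{\sqrt{\hat{v}_t} + \epsilon}
\end{aligned}
$$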
For further study, see the article 你真的懂Adam吗? ("Do you really understand Adam?").
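To connect the equations above to code, here is a minimal NumPy sketch of a single Adam step. The `adam_step` helper and the toy usage are hypothetical illustrations, not TensorFlow's internal implementation:

```python
import numpy as np

def adam_step(theta, grad, m, v, t,
              lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam parameter update, mirroring the equations above.

    Hypothetical helper for illustration only.
    """
    m = beta1 * m + (1 - beta1) * grad        # biased 1st moment estimate
    v = beta2 * v + (1 - beta2) * grad ** 2   # biased 2nd moment estimate
    m_hat = m / (1 - beta1 ** t)              # bias-corrected 1st moment
    v_hat = v / (1 - beta2 ** t)              # bias-corrected 2nd moment
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Toy usage: minimize f(theta) = theta^2, whose gradient is 2 * theta.
theta, m, v = 1.0, 0.0, 0.0
for t in range(1, 101):
    theta, m, v = adam_step(theta, 2 * theta, m, v, t)
print(theta)  # moves toward the minimum at 0
```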