
TensorFlow Gradient Clipping

2018-07-28
The snippet below combines exponential learning-rate decay with element-wise gradient clipping in TensorFlow 1.x. The key idea is to split optimizer.minimize() into compute_gradients() and apply_gradients() so the gradients can be clipped in between (config and loss_val are assumed to be defined elsewhere in the training script):

import tensorflow as tf

global_step = tf.Variable(0, trainable=False)
# Decay the learning rate in discrete steps (staircase=True) as global_step grows.
learning_rate = tf.train.exponential_decay(config.base_learning_rate, global_step,
                                           decay_steps=config.decay_steps,
                                           decay_rate=config.decay_rate,
                                           staircase=True)
# Equivalent to optimizer.minimize(loss_val, global_step), except the update is
# split into compute/apply so the gradients can be clipped in between.
optimizer = tf.train.AdamOptimizer(learning_rate)
gradients = optimizer.compute_gradients(loss_val)
# Clip each gradient element-wise to [-5, 5]; skip variables without gradients.
capped_gradients = [(tf.clip_by_value(grad, -5., 5.), var)
                    for grad, var in gradients if grad is not None]
train_op = optimizer.apply_gradients(capped_gradients, global_step)
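For reference, a common alternative (a sketch, not from the original post) is to clip by the global norm of all gradients with tf.clip_by_global_norm; the clip_norm=5.0 threshold here is an assumption chosen to match the ±5 bound above:

# Drop variables with no gradient, then separate gradients from variables.
grads_and_vars = [(g, v) for g, v in optimizer.compute_gradients(loss_val)
                  if g is not None]
grads, variables = zip(*grads_and_vars)
# Rescale all gradients jointly so their global L2 norm is at most 5.
clipped_grads, _ = tf.clip_by_global_norm(grads, clip_norm=5.0)
train_op = optimizer.apply_gradients(zip(clipped_grads, variables), global_step)

Element-wise clip_by_value can distort the gradient direction because each element is capped independently, while global-norm clipping only shrinks the overall magnitude, which is why it is often preferred when training RNNs.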