Dropout in Neural Networks

2019-01-08  哎吆喂轩

Understanding Dropout

Dropout is a regularization technique: at every training step, each unit is temporarily "dropped" (its output set to zero) with some probability, which prevents units from co-adapting and reduces overfitting.

The Main Idea of Dropout

During training, each unit is kept only with probability keep_prob; at test time all units are kept, so the network behaves like an averaged ensemble of the many thinned sub-networks seen during training.

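To make this concrete, here is a minimal NumPy sketch of "inverted" dropout (an illustration written for this article, not the TensorFlow implementation): during training each unit survives with probability keep_prob and the survivors are scaled by 1/keep_prob, so no extra rescaling is needed at test time.

import numpy as np

def dropout_forward(x, keep_prob=0.5, training=True):
    # At test time every unit is kept and nothing is rescaled.
    if not training:
        return x
    # Random binary mask: each unit survives with probability keep_prob,
    # and survivors are scaled by 1/keep_prob (inverted dropout).
    mask = (np.random.rand(*x.shape) < keep_prob) / keep_prob
    return x * mask

x = np.ones((2, 4))
print(dropout_forward(x, training=True))   # roughly half the entries become 0, the rest 2.0
print(dropout_forward(x, training=False))  # unchanged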
Code Implementation

import tensorflow as tf
from tensorflow.contrib.layers import dropout, fully_connected
[......]
# Placeholder that switches dropout on (training) and off (testing)
is_training = tf.placeholder(tf.bool, shape=(), name='is_training')

keep_prob = 0.5
X_drop = dropout(X, keep_prob, is_training=is_training)

hidden1 = fully_connected(X_drop, n_hidden1, scope="hidden1")
hidden1_drop = dropout(hidden1, keep_prob, is_training=is_training)
hidden2 = fully_connected(hidden1_drop, n_hidden2, scope="hidden2")
hidden2_drop = dropout(hidden2, keep_prob, is_training=is_training)

logits = fully_connected(hidden2_drop, n_outputs, activation_fn=None, scope="outputs")
  • Here, keep_prob is the fraction of units that are kept; 1 - keep_prob is the dropout rate.
  • The is_training placeholder is fed True during training and must be fed False at test time, so that dropout is disabled when the network is evaluated (see the usage sketch below).
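The snippet below is a hedged usage sketch (TensorFlow 1.x style) of how is_training would be fed in each phase; training_op, accuracy, the placeholders X and y, the data variables, and the shuffle_batches helper are assumptions standing in for code not shown above.

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for epoch in range(n_epochs):
        for X_batch, y_batch in shuffle_batches(X_train, y_train, batch_size):
            # Training phase: dropout active, units dropped with probability 1 - keep_prob
            sess.run(training_op,
                     feed_dict={X: X_batch, y: y_batch, is_training: True})
        # Test phase: dropout disabled, the full network is evaluated
        acc_test = sess.run(accuracy,
                            feed_dict={X: X_test, y: y_test, is_training: False})
        print(epoch, "Test accuracy:", acc_test)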