GD: Feature Scaling & Mean Normalization

2017-12-25  jianshuqwerty

We can speed up gradient descent by having each of our input values in roughly the same range. This is because θ descends quickly on small ranges and slowly on large ranges, and so it will oscillate inefficiently down to the optimum when the variables are very uneven.

Feature Scaling: divide the input values by the range of the feature, i.e. xᵢ := xᵢ / (max − min), so that every feature ends up in roughly the same interval.

Mean Normalization: xᵢ := (xᵢ − μᵢ) / sᵢ, where μᵢ is the mean of feature i and sᵢ can be either the range (max − min) or the standard deviation of that feature.
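A minimal NumPy sketch of both ideas, assuming a small toy design matrix X (the values below are illustrative, not from the original note):

```python
import numpy as np

# Toy design matrix: each column is a feature with a very different scale.
X = np.array([[2104.0, 3.0],
              [1600.0, 3.0],
              [2400.0, 4.0],
              [1416.0, 2.0]])

# Feature scaling: divide each feature by its range (max - min).
ranges = X.max(axis=0) - X.min(axis=0)
X_scaled = X / ranges

# Mean normalization: subtract the mean, then divide by s,
# where s is either the range or the standard deviation.
mu = X.mean(axis=0)
s = ranges                      # alternatively: s = X.std(axis=0)
X_normalized = (X - mu) / s

print(X_scaled)
print(X_normalized)
```

After either transformation the two features vary on comparable scales, so gradient descent no longer zig-zags along the elongated direction of the cost surface.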