Dynamically Adjusting the Learning Rate

2019-08-06  poteman

The learning rate has a large effect on how well a model trains.
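The simplest option in (older) Keras is the optimizer's own decay argument: after every batch update the rate is scaled down slightly, roughly lr / (1 + decay * iterations), so it shrinks smoothly over the course of training. For example: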

from keras.optimizers import Adam
model.compile(loss='sparse_categorical_crossentropy', optimizer=Adam(lr=0.001, decay=1e-6), metrics=['accuracy'])
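For finer control, the LearningRateScheduler callback sets the learning rate at the start of every epoch from a function that maps the epoch index to a rate. The wrapper below builds a step-decay schedule: the rate is multiplied by decay_factor once every step_size epochs.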
import numpy as np
from keras.callbacks import LearningRateScheduler

def step_decay_schedule(initial_lr=1e-3, decay_factor=0.75, step_size=10):
    '''
    Wrapper function to create a LearningRateScheduler with step decay schedule.
    '''
    def schedule(epoch):
        # Drop the rate by decay_factor once every step_size epochs
        return initial_lr * (decay_factor ** np.floor(epoch / step_size))
    
    return LearningRateScheduler(schedule)

lr_sched = step_decay_schedule(initial_lr=1e-4, decay_factor=0.75, step_size=2)
model.fit(X_train, Y_train, callbacks=[lr_sched])
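In the fit call above you would normally also pass epochs (and a batch_size) so the schedule has multiple epochs to act on. As a quick sanity check (a minimal sketch, not from the original post), you can print the rates this schedule produces; with initial_lr=1e-4, decay_factor=0.75 and step_size=2 the rate stays at 1e-4 for epochs 0-1, drops to 7.5e-5 for epochs 2-3, then 5.625e-5, and so on:

import numpy as np

initial_lr, decay_factor, step_size = 1e-4, 0.75, 2
for epoch in range(6):
    # Same formula the callback uses: decay once every step_size epochs
    print(epoch, initial_lr * (decay_factor ** np.floor(epoch / step_size)))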

