Implementation of Learning Rate Scheduler

Several papers have shown that adjusting the learning rate during training can improve model performance. An optimizer by itself is not designed for this: it is configured with a single learning rate and keeps using it. To vary the learning rate over the course of training, we pair the optimizer with a learning rate scheduler.
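As a minimal sketch of the idea (assuming a PyTorch setup; the model, optimizer, and decay hyperparameters below are illustrative, not from the original text), a simple step-decay scheduler can be implemented by rewriting the `lr` entry of each of the optimizer's parameter groups at the end of every epoch:

```python
import torch

# Hypothetical model and optimizer used only to illustrate the mechanism.
model = torch.nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

def step_decay(optimizer, epoch, initial_lr=0.1, drop=0.5, every=10):
    """Multiply the learning rate by `drop` every `every` epochs."""
    new_lr = initial_lr * (drop ** (epoch // every))
    for group in optimizer.param_groups:
        group["lr"] = new_lr  # the optimizer reads this value on its next step
    return new_lr

for epoch in range(30):
    # ... forward pass, loss.backward(), optimizer.step() ...
    current_lr = step_decay(optimizer, epoch)
```

PyTorch also ships built-in schedulers with the same behavior, e.g. `torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.5)`, where you call `scheduler.step()` once per epoch instead of updating the parameter groups by hand.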