PyTorch - How to get learning rate during training?


Solution 1

If the optimizer has only a single parameter group, as in the example you've given, you can use this function and call it during training to get the current learning rate:

def get_lr(optimizer):
    # with a single parameter group, the first (and only) group holds the current lr
    for param_group in optimizer.param_groups:
        return param_group['lr']
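
If the optimizer has more than one parameter group (for example, a different rate per layer), the same idea generalizes to returning one rate per group. A minimal sketch; the two-layer model and the get_lrs helper are hypothetical, not part of the original answer:

import torch

# hypothetical two-group setup, just for illustration: one lr per layer
model = torch.nn.Sequential(torch.nn.Linear(10, 10), torch.nn.Linear(10, 1))
optimizer = torch.optim.SGD([
    {'params': model[0].parameters(), 'lr': 0.01},
    {'params': model[1].parameters(), 'lr': 0.001},
], momentum=0.99)

def get_lrs(optimizer):
    # one learning rate per parameter group, in order
    return [group['lr'] for group in optimizer.param_groups]

print(get_lrs(optimizer))  # [0.01, 0.001]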

Solution 2

Alternatively, you may use an lr_scheduler along with your optimizer and call its built-in get_last_lr() method. (In PyTorch versions before 1.4 this method was called get_lr(); in current versions, calling get_lr() directly emits a warning and may not return the value you expect.)

Here is an example:

my_optimizer = torch.optim.Adam(my_model.parameters(),
                                lr=0.001,
                                weight_decay=0.002)

my_lr_scheduler = torch.optim.lr_scheduler.StepLR(my_optimizer,
                                                  step_size=50,
                                                  gamma=0.1)

# train
...
my_optimizer.step()
my_lr_scheduler.step()

# get learning rate
my_lr = my_lr_scheduler.get_last_lr()[0]  # get_last_lr() returns one value per param group
# or
my_lr = my_lr_scheduler.optimizer.param_groups[0]['lr']

The added benefit of using an lr_scheduler is more control over how the learning rate changes during training (step decay, exponential decay, etc.). For the available schedulers and their arguments, refer to the PyTorch docs.
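
Putting Solution 2 together, here is a minimal self-contained sketch; the toy model and random data are placeholders for illustration only. It logs the learning rate as StepLR decays it:

import torch

my_model = torch.nn.Linear(10, 1)                      # toy model, for illustration
my_optimizer = torch.optim.Adam(my_model.parameters(),
                                lr=0.001, weight_decay=0.002)
my_lr_scheduler = torch.optim.lr_scheduler.StepLR(my_optimizer,
                                                  step_size=50, gamma=0.1)

x = torch.randn(32, 10)                                # dummy batch
y = torch.randn(32, 1)

for epoch in range(150):
    my_optimizer.zero_grad()
    loss = torch.nn.functional.mse_loss(my_model(x), y)
    loss.backward()
    my_optimizer.step()
    my_lr_scheduler.step()                             # multiplies lr by gamma every 50 epochs
    if epoch % 50 == 0:
        print(f"epoch {epoch}: lr = {my_lr_scheduler.get_last_lr()[0]}")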


Comments

  • Admin

    While training, I'd like to know the current value of the learning rate. What should I do?

    Here is my code:

    my_optimizer = torch.optim.SGD(my_model.parameters(), 
                                   lr=0.001, 
                                   momentum=0.99, 
                                   weight_decay=2e-3)
    

    Thank you.