Description
When combined with multiple ConstantLR schedulers, SequentialLR should apply only one of them at a time (depending on the current epoch and the milestones), but it instead applies several simultaneously. For the example below:
import torch.optim
from torch import nn
from torch.optim.lr_scheduler import ConstantLR, SequentialLR
model = nn.Linear(10, 10)
optimizer = torch.optim.Adam(model.parameters(), lr=1.0)
scheduler1 = ConstantLR(optimizer, factor=0.5, total_iters=3)
scheduler2 = ConstantLR(optimizer, factor=0.3, total_iters=4)
scheduler = SequentialLR(optimizer, schedulers=[scheduler1, scheduler2], milestones=[3])
for step in range(6):
    scheduler.step()
    print(step, scheduler.get_last_lr())
The output is:
0 [0.15]
1 [0.15]
2 [0.3]
3 [0.3]
4 [0.3]
5 [0.3]
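The compounding appears to happen at construction time: each ConstantLR applies its factor to the shared optimizer's learning rate as soon as it is created, so by the time SequentialLR takes over, both factors are already baked in. A minimal check of this hypothesis (same setup as above):

import torch.optim
from torch import nn
from torch.optim.lr_scheduler import ConstantLR

model = nn.Linear(10, 10)
optimizer = torch.optim.Adam(model.parameters(), lr=1.0)

scheduler1 = ConstantLR(optimizer, factor=0.5, total_iters=3)
print(optimizer.param_groups[0]["lr"])  # 0.5 -- factor applied at construction

scheduler2 = ConstantLR(optimizer, factor=0.3, total_iters=4)
print(optimizer.param_groups[0]["lr"])  # 0.15 -- both factors compounded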
The correct output should instead be:
0 [0.5]
1 [0.5]
2 [0.3]
3 [0.3]
4 [0.3]
5 [0.3]
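Until this is fixed, one possible workaround (a sketch, not an official fix) is to express the piecewise-constant schedule with a single LambdaLR, so only one scheduler ever touches the optimizer. This assumes the intended schedule is simply factor 0.5 before the milestone and 0.3 after; it ignores scheduler2's total_iters=4 cutoff, so extend the lambda if the factor should revert to 1.0 afterwards:

import torch.optim
from torch import nn
from torch.optim.lr_scheduler import LambdaLR

model = nn.Linear(10, 10)
optimizer = torch.optim.Adam(model.parameters(), lr=1.0)

# One scheduler encodes both phases: factor 0.5 before epoch 3, 0.3 from then on.
scheduler = LambdaLR(optimizer, lr_lambda=lambda epoch: 0.5 if epoch < 3 else 0.3)

for step in range(6):
    scheduler.step()
    print(step, scheduler.get_last_lr())
# prints 0.5, 0.5, 0.3, 0.3, 0.3, 0.3 -- the expected schedule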
Versions
PyTorch 1.12