SequentialLR does not work correctly with multiple ConstantLR #82684
Open
@pauldb89

Description


When combined with multiple ConstantLR schedulers, SequentialLR should apply only one scheduler at a time (depending on the current epoch and the milestones), but it instead applies several at once. Consider the example below:

import torch.optim
from torch import nn
from torch.optim.lr_scheduler import ConstantLR, SequentialLR

model = nn.Linear(10, 10)
optimizer = torch.optim.Adam(model.parameters(), lr=1.0)

# scheduler1 should scale the base lr (1.0) by 0.5 for the first 3 steps;
# at milestone 3, SequentialLR should switch to scheduler2, which scales it by 0.3.
scheduler1 = ConstantLR(optimizer, factor=0.5, total_iters=3)
scheduler2 = ConstantLR(optimizer, factor=0.3, total_iters=4)
scheduler = SequentialLR(optimizer, schedulers=[scheduler1, scheduler2], milestones=[3])

for step in range(6):
    scheduler.step()
    print(step, scheduler.get_last_lr())

The output is (note that the initial 0.15 equals 0.5 × 0.3, i.e., both factors are applied simultaneously):
0 [0.15]
1 [0.15]
2 [0.3]
3 [0.3]
4 [0.3]
5 [0.3]

While the correct output should be:
0 [0.5]
1 [0.5]
2 [0.3]
3 [0.3]
4 [0.3]
5 [0.3]
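
For comparison, the expected schedule can be reproduced with a single LambdaLR as a workaround (a minimal sketch, not part of the original report; the factors 0.5/0.3 and the milestone of 3 are taken from the example above):

import torch.optim
from torch import nn
from torch.optim.lr_scheduler import LambdaLR

model = nn.Linear(10, 10)
optimizer = torch.optim.Adam(model.parameters(), lr=1.0)

# Scale the base lr by 0.5 before the milestone and by 0.3 afterwards.
scheduler = LambdaLR(optimizer, lr_lambda=lambda epoch: 0.5 if epoch < 3 else 0.3)

for step in range(6):
    scheduler.step()
    print(step, scheduler.get_last_lr())

This prints the expected sequence above (0.5, 0.5, then 0.3 from step 2 onward).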

Versions

PyTorch 1.12

Metadata

Labels

module: LrScheduler, triaged (this issue has been looked at by a team member, and triaged and prioritized into an appropriate module)
