
Update reference scripts to use the "Batteries Included" utils #4281

Closed
4 tasks done
datumbox opened this issue Aug 17, 2021 · 0 comments · Fixed by #4335, #4411, #4444, #4379 or #4381

Comments


datumbox commented Aug 17, 2021

🚀 Feature

As part of the "Batteries Included" initiative (#3911) we are adding a number of new utils that can be used to produce SOTA results. Once those utils land, we should update our reference scripts to use them.

More specifically we need to:

  • Update the reference scripts to use PyTorch's new warmup schedulers.
    • The reference scripts should be updated to let users define the warmup_method (if any), the warmup_iters and the warmup_factor. The warmup scheduler should be chained with other existing schedulers.
    • The Object Detection recipe needs to be BC compatible and replace our custom linear warmup approach with the one from PyTorch.
    • Similarly, the Video Classification recipe also needs to be BC compatible and replace our custom scheduler.
    • All other recipes (Classification and Segmentation) should be updated to optionally use warmup. The addition should be BC compatible and turned off by default.
  • Update the Classification reference to use PyTorch's new label smoothing implementation.
  • Update the Classification reference to use Mixup and Cutmix.
    • After implementing the Mixup and Cutmix augmentations, update the Classification reference script to use them during training. Adding Mixup and Cutmix #4379
    • The new mixup-alpha and cutmix-alpha arguments should have 0.0 default values to ensure BC.
  • Update the Classification reference to use EMA.
    • Add Exponential Moving Average support to the Classification scripts by extending the torch.optim.swa_utils.AveragedModel util, and ensure the EMA model is evaluated at the end of each epoch.
    • Use a model-ema feature-switch param to ensure BC, and ensure we have a good model-ema-decay default value.
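
For the warmup item, a minimal sketch of what the updated scripts could do: build the warmup scheduler from the proposed warmup_method / warmup_iters / warmup_factor arguments (the variable names here are illustrative) and chain it in front of the existing scheduler with torch.optim.lr_scheduler.SequentialLR. This assumes the LinearLR/ConstantLR signatures as they landed in torch.optim.lr_scheduler.

```Python
import torch
from torch.optim.lr_scheduler import ConstantLR, LinearLR, SequentialLR, StepLR

model = torch.nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Hypothetical values for the proposed CLI arguments.
warmup_method, warmup_iters, warmup_factor = "linear", 5, 0.01

main_scheduler = StepLR(optimizer, step_size=30, gamma=0.1)
if warmup_method == "linear":
    warmup = LinearLR(optimizer, start_factor=warmup_factor, total_iters=warmup_iters)
else:  # "constant"
    warmup = ConstantLR(optimizer, factor=warmup_factor, total_iters=warmup_iters)

# Run the warmup scheduler first, then hand over to the main scheduler.
scheduler = SequentialLR(optimizer, schedulers=[warmup, main_scheduler],
                         milestones=[warmup_iters])
```

With warmup_method unset, the scripts would simply keep main_scheduler alone, which keeps the change off by default.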
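
For the label-smoothing item, PyTorch exposes this directly as a constructor argument on CrossEntropyLoss (available from PyTorch 1.10), so the reference script change is essentially one line; the 0.1 value below is only illustrative:

```Python
import torch

# label_smoothing is a constructor argument on CrossEntropyLoss (PyTorch >= 1.10).
criterion = torch.nn.CrossEntropyLoss(label_smoothing=0.1)

logits = torch.tensor([[10.0, 0.0, 0.0]])  # very confident prediction for class 0
target = torch.tensor([0])
loss = criterion(logits, target)
```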
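
For the Mixup item, a rough sketch of the idea (not the actual torchvision transform): mix each batch with a shuffled copy of itself using a Beta-sampled coefficient, and mix the one-hot targets the same way. The function name and signature are hypothetical; alpha=0.0 disables mixing, matching the BC-preserving default proposed above.

```Python
import torch

def mixup_batch(images, targets, num_classes, alpha=0.2):
    """Mix a batch with a shuffled copy of itself (mixup).

    alpha=0.0 disables mixing, matching the proposed BC-preserving default.
    """
    one_hot = torch.nn.functional.one_hot(targets, num_classes).float()
    if alpha <= 0.0:
        return images, one_hot
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    perm = torch.randperm(images.size(0))
    mixed_images = lam * images + (1.0 - lam) * images[perm]
    mixed_targets = lam * one_hot + (1.0 - lam) * one_hot[perm]
    return mixed_images, mixed_targets
```

Cutmix follows the same batch-mixing pattern but pastes a rectangular patch from the shuffled images instead of blending pixel values.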
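
For the EMA item, one possible shape for the extension: subclass torch.optim.swa_utils.AveragedModel with an exponential-decay averaging function. The class name and the 0.999 decay are placeholders; the issue only asks for a good model-ema-decay default.

```Python
import torch
from torch.optim.swa_utils import AveragedModel

class ExponentialMovingAverage(AveragedModel):
    """EMA of model parameters via torch.optim.swa_utils.AveragedModel.

    The 0.999 decay is only a placeholder; the issue asks for a good default.
    """
    def __init__(self, model, decay=0.999):
        def ema_avg(avg_param, param, num_averaged):
            return decay * avg_param + (1.0 - decay) * param
        super().__init__(model, avg_fn=ema_avg)

model = torch.nn.Linear(4, 2)
ema_model = ExponentialMovingAverage(model, decay=0.999)
# During training, after each optimizer step (or each epoch):
ema_model.update_parameters(model)
# Evaluate ema_model (or ema_model.module) at the end of each epoch.
```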
@datumbox datumbox modified the milestone: v0.8.2 Aug 18, 2021
@datumbox datumbox reopened this Aug 26, 2021
@datumbox datumbox reopened this Aug 30, 2021
@datumbox datumbox linked a pull request Aug 31, 2021 that will close this issue
@datumbox datumbox reopened this Sep 2, 2021
@datumbox datumbox assigned iramazanli and datumbox and unassigned iramazanli Sep 4, 2021
facebook-github-bot pushed a commit to pytorch/pytorch that referenced this issue Sep 7, 2021
Summary:
Partially unblocks pytorch/vision#4281

Previously we added WarmUp schedulers to PyTorch Core in PR #60836, which had two modes of execution, linear and constant, depending on the warm-up function.

In this PR we change this interface to a more direct form, separating the linear and constant modes into separate schedulers. In particular,

```Python
scheduler1 = WarmUpLR(optimizer, warmup_factor=0.1, warmup_iters=5, warmup_method="constant")
scheduler2 = WarmUpLR(optimizer, warmup_factor=0.1, warmup_iters=5, warmup_method="linear")
```

will look like

```Python
scheduler1 = ConstantLR(optimizer, warmup_factor=0.1, warmup_iters=5)
scheduler2 = LinearLR(optimizer, warmup_factor=0.1, warmup_iters=5)
```

correspondingly.

Pull Request resolved: #64395

Reviewed By: datumbox

Differential Revision: D30753688

Pulled By: iramazanli

fbshipit-source-id: e47f86d12033f80982ddf1faf5b46873adb4f324
@datumbox datumbox reopened this Sep 9, 2021