
How to specify the learning rate decay strategy during training, and how is it implemented? #5715

Closed
Wanghe1997 opened this issue Nov 19, 2021 · 7 comments
Labels
question Further information is requested Stale

Comments

@Wanghe1997

Search before asking

Question

For YOLOv5 v6.0
There are learning rate decay strategies such as StepLR, ExpLR, MultiStepLR, CosineLR, and so on. Which one does YOLOv5 v6.0 use by default? If I want to change it, which lines of code in which file should I modify? Thanks!

Additional

No response

@Wanghe1997 Wanghe1997 added the question Further information is requested label Nov 19, 2021
@glenn-jocher
Member

@Wanghe1997 YOLOv5 uses a cosine LR scheduler by default:

yolov5/train.py

Lines 167 to 172 in 8df64a9

# Scheduler
if opt.linear_lr:
    lf = lambda x: (1 - x / (epochs - 1)) * (1.0 - hyp['lrf']) + hyp['lrf']  # linear
else:
    lf = one_cycle(1, hyp['lrf'], epochs)  # cosine 1->hyp['lrf']
scheduler = lr_scheduler.LambdaLR(optimizer, lr_lambda=lf)  # plot_lr_scheduler(optimizer, scheduler, epochs)

(image: LR Curves)
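For reference, the `one_cycle` helper used in the `cosine` branch above is a small cosine interpolation between two multipliers (in YOLOv5 v6.0 it lives in `utils/general.py`); a minimal self-contained sketch of that form, with illustrative values for `lrf` and `epochs`, looks like this:

```python
import math

def one_cycle(y1=0.0, y2=1.0, steps=100):
    # Cosine ramp of the LR multiplier from y1 to y2 over `steps` epochs
    # (sketch of YOLOv5's one_cycle helper, not the library code itself)
    return lambda x: ((1 - math.cos(x * math.pi / steps)) / 2) * (y2 - y1) + y1

lf = one_cycle(1, 0.1, 300)  # cosine 1 -> lrf=0.1, as in the snippet above
print(round(lf(0), 4))    # 1.0 at epoch 0
print(round(lf(150), 4))  # 0.55, the midpoint of 1.0 and 0.1
print(round(lf(300), 4))  # 0.1 at the final step
```

The returned lambda is what gets passed to `lr_scheduler.LambdaLR`, so each epoch's learning rate is `lr0 * lf(epoch)`.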

@Wanghe1997
Author

> @Wanghe1997 YOLOv5 uses cosine LR scheduler by default: […]

In addition to the cosine LR scheduler, are any other strategies built into the program?

@glenn-jocher
Member

@Wanghe1997 A linear schedule is built in (enable it with python train.py --linear-lr, which sets the opt.linear_lr flag in the snippet above), but you can also use any other custom scheduler you want with a lambda function, or any of the premade schedulers here:
https://pytorch.org/docs/stable/optim.html
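As one example of such a custom lambda: a StepLR-style decay (multiply the multiplier by `gamma` every `step_size` epochs) can be expressed as a plain function and dropped into `lr_scheduler.LambdaLR` in place of `lf`. The `step_decay` helper and its parameter values below are illustrative, not part of YOLOv5:

```python
def step_decay(gamma=0.1, step_size=30):
    # StepLR-equivalent multiplier: lr = lr0 * gamma ** (epoch // step_size)
    return lambda epoch: gamma ** (epoch // step_size)

lf = step_decay(gamma=0.1, step_size=30)
# In train.py you would then keep the existing line:
#   scheduler = lr_scheduler.LambdaLR(optimizer, lr_lambda=lf)
print(lf(0))   # 1.0 for the first step_size epochs
print(lf(30))  # 0.1 after the first decay step
```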


@Wanghe1997
Author

> @Wanghe1997 linear is written in with python train.py --linear […]

Thank you for your reply. Could you also answer my follow-up question in another issue?
#5653 (comment)

@github-actions
Contributor

github-actions bot commented Dec 20, 2021

👋 Hello, this issue has been automatically marked as stale because it has not had recent activity. Please note it will be closed if no further activity occurs.


Feel free to inform us of any other issues you discover or feature requests that come to mind in the future. Pull Requests (PRs) are also always welcomed!

Thank you for your contributions to YOLOv5 🚀 and Vision AI ⭐!

@adityatandon

For anyone looking for an updated answer in 2022: from YOLOv5 v6.1 onwards, the default learning rate scheduler has been changed from cosine to linear.
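That linear default is the same multiplier shown in the `linear` branch of the snippet earlier in this thread; as a self-contained sketch (the `linear_lf` name and the `lrf`/`epochs` values are illustrative):

```python
def linear_lf(lrf=0.01, epochs=300):
    # Linear decay of the LR multiplier from 1.0 at epoch 0 down to lrf at the
    # last epoch, mirroring the `linear` branch quoted earlier in the thread
    return lambda x: (1 - x / (epochs - 1)) * (1.0 - lrf) + lrf

lf = linear_lf(lrf=0.01, epochs=300)
print(round(lf(0), 6))    # 1.0
print(round(lf(299), 6))  # 0.01
```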

@glenn-jocher
Member

@adityatandon thank you for the update! The YOLOv5 team is constantly improving the framework based on the latest research and user feedback. If you have any further questions or need assistance with anything else, feel free to ask here.

3 participants