
When training, some strange phenomena appear after the learning rate decays. #57

Open
FightStone opened this issue Aug 25, 2020 · 0 comments

Very outstanding job. When I train centermask2 on the COCO dataset (only the person class, not all 80 classes), everything (including mAP and loss) looks good in the first stage of training. When the learning rate decays by 0.1, the loss drops significantly, but the mAP also drops. Do you know the reason for this? Looking forward to your reply.
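For context, the schedule being described is a piecewise-constant step decay, where the learning rate is multiplied by a factor (here 0.1) at fixed iteration milestones. A minimal sketch of that schedule, assuming hypothetical milestone iterations and base learning rate (the actual values come from the centermask2 training config):

```python
def lr_at(step, base_lr=0.01, milestones=(60000, 80000), gamma=0.1):
    """Piecewise-constant step decay: multiply base_lr by gamma
    once for every milestone the current step has passed."""
    factor = gamma ** sum(step >= m for m in milestones)
    return base_lr * factor

# Before the first milestone the LR is base_lr; after each
# milestone it shrinks by another factor of gamma (0.1 here).
print(lr_at(0))       # base_lr
print(lr_at(60000))   # base_lr * 0.1
print(lr_at(90000))   # base_lr * 0.01
```

The sharp drop in loss right after a milestone is expected with this kind of schedule; whether mAP should also drop is the question the issue raises.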
