Loss is not decreasing #890

Open
Lyhour-Chhay opened this issue Apr 17, 2022 · 0 comments
Comments

@Lyhour-Chhay

I have been training the model on my custom dataset for up to 50 epochs. The training loss does not decrease below around 1.6. Training runs fine without any NaN loss values. I have already played with hyperparameters, varying the learning rate from 0.01 to 0.0001 with both the Adam and SGD optimizers. I don't know what else I should check to make the loss decrease. Thank you very much.
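For reference, a minimal sketch of the kind of optimizer/learning-rate sweep described above, assuming PyTorch (the framework is not stated in the issue) and using a hypothetical `ToyModel` with random data as stand-ins for the custom model and dataset:

```python
# Minimal sketch of a learning-rate sweep over Adam and SGD.
# Assumptions (not from the issue): PyTorch; ToyModel, inputs, and targets
# are hypothetical placeholders for the real model and dataset.
import torch
import torch.nn as nn

torch.manual_seed(0)

class ToyModel(nn.Module):  # stand-in for the custom model
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 4))

    def forward(self, x):
        return self.net(x)

inputs = torch.randn(256, 16)            # placeholder data
targets = torch.randint(0, 4, (256,))    # placeholder labels
criterion = nn.CrossEntropyLoss()

for opt_name in ("adam", "sgd"):
    for lr in (1e-2, 1e-3, 1e-4):        # the range mentioned in the issue
        model = ToyModel()
        if opt_name == "adam":
            optimizer = torch.optim.Adam(model.parameters(), lr=lr)
        else:
            optimizer = torch.optim.SGD(model.parameters(), lr=lr, momentum=0.9)

        for epoch in range(50):          # full-batch training for brevity
            optimizer.zero_grad()
            loss = criterion(model(inputs), targets)
            loss.backward()
            optimizer.step()

        print(f"{opt_name} lr={lr:g} final loss={loss.item():.4f}")
```

Comparing the final loss across these runs would at least show whether the plateau depends on the optimizer and learning rate at all, or stays near the same value regardless.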
