When I was training, the loss value changed to nan #329
Comments
Is this caused by gradient explosion? Would it be better to use a smaller model like YOLOv9-s?
Same here.
I was trying both YOLOv8 and v9, and I ran into similar issues there.
At least now the loss can be divided by 1. Conclusion: I can train both with single-branch training. For YOLOv9, I can't train the variants with auxiliary branches, because the loss blowing up to infinity still isn't solved.
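Regarding the gradient-explosion hypothesis above: a common mitigation is to clip gradient norms and skip batches whose loss is already non-finite. The sketch below is a minimal, generic PyTorch training step illustrating both ideas; the model, optimizer, and `max_norm` value are placeholders, not code from this repository.

```python
import torch
import torch.nn as nn

# Placeholders standing in for the detector and its optimizer.
model = nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
criterion = nn.MSELoss()

def train_step(x, y):
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    # NaN/Inf guard: skip the batch instead of poisoning the weights.
    if not torch.isfinite(loss):
        return None
    loss.backward()
    # Clip the global gradient norm to tame exploding gradients
    # before the optimizer step (max_norm=10.0 is an assumption).
    torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=10.0)
    optimizer.step()
    return loss.item()

out = train_step(torch.randn(4, 10), torch.randn(4, 1))
```

Once the loss has turned NaN, the weights are usually already corrupted, so the guard only helps if it runs from the start of training; clipping addresses the cause rather than the symptom.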
When I was training, the initial evaluation metrics were all normal, but after 30 epochs the loss value became NaN, and mAP and the other metrics dropped to 0. What could be causing this? I used YOLOv9-c.
*(Screenshots: training logs showing the loss becoming NaN and mAP dropping to 0 after epoch 30.)*
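To find which operation first produces the NaN (rather than only observing it in the logged loss), PyTorch's anomaly-detection mode makes `backward()` raise at the offending op. A hedged, self-contained sketch with a stand-in model, not YOLOv9 code:

```python
import torch
import torch.nn as nn

# Stand-in network and batch; replace with the real model and data loader.
model = nn.Sequential(nn.Linear(8, 8), nn.ReLU(), nn.Linear(8, 1))
x, y = torch.randn(4, 8), torch.randn(4, 1)

# Anomaly mode traces each backward op and raises with a stack trace at
# the first op that yields NaN. It slows training noticeably, so enable
# it only while debugging.
with torch.autograd.detect_anomaly():
    loss = nn.functional.mse_loss(model(x), y)
    loss.backward()
```

Running a few epochs under this mode around the point where the metrics collapse (here, epoch ~30) should pinpoint whether the NaN originates in the loss computation or inside an auxiliary branch.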