Loading pre-trained model is not supported for num_classes != 80 #475

Open
chandan-wiai opened this issue Dec 28, 2022 · 1 comment
Labels: enhancement (New feature or request)

Comments

@chandan-wiai

https://github.com/zhiqwang/yolov5-rt-stack/blob/b7cb695beacec273ea97cc0e3732797580ef37b5/yolort/models/yolo.py#L263

Starting from a pre-trained model would help a custom dataset converge faster and reach better final performance. Currently, however, trying to load a pretrained model with num_classes other than 80 fails, and we have to train the model from scratch instead.
One possible solution would be to pass strict=False when loading the state dictionary at line 263:
model.load_state_dict(state_dict, strict=False)

Can this be implemented?
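For reference, here is a minimal sketch of a slightly safer variant of the strict=False idea (the function name and checkpoint handling below are illustrative placeholders, not the repository's API): instead of ignoring every mismatch, drop the checkpoint tensors whose shapes no longer match the re-sized detection head, then load the remaining weights non-strictly.

```python
import torch

def load_pretrained_weights(model: torch.nn.Module, checkpoint_path: str) -> None:
    # Assumes the checkpoint file stores a plain state_dict; adjust the loading
    # step if the checkpoint wraps it (e.g. under a "model" key).
    state_dict = torch.load(checkpoint_path, map_location="cpu")
    model_dict = model.state_dict()

    # Keep only tensors that exist in the new model and have matching shapes;
    # the detection-head tensors differ when num_classes != 80 and are skipped.
    filtered = {
        k: v for k, v in state_dict.items()
        if k in model_dict and v.shape == model_dict[k].shape
    }

    missing, unexpected = model.load_state_dict(filtered, strict=False)
    print(
        f"Reused {len(filtered)} tensors, skipped {len(state_dict) - len(filtered)}; "
        f"{len(missing)} parameters will be randomly initialized."
    )
```

Note that strict=False by itself only ignores missing and unexpected keys; tensors with the same name but a different shape still raise a size-mismatch error, so filtering by shape before calling load_state_dict is what actually makes the mismatched head harmless.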

@zhiqwang
Owner

Hi @chandan-wiai ,

> Starting from a pre-trained model would help a custom dataset converge faster and reach better final performance.

Yep, this would be a very useful feature, but unfortunately the training mechanism is not fully developed yet; see #59 and #60 for more details. We may need to do more preparation first. The timing is not very clear to me at the moment, as my focus is now moving to some other projects.

zhiqwang added the enhancement (New feature or request) label on Dec 28, 2022