
Hyperparameter tuning #19

Open
goutamyg opened this issue May 17, 2023 · 0 comments
Hi! Thank you for publishing your code.

Your released code has a directory dedicated to configuration files for the different tracker modules: https://github.com/PinataFarms/FEARTracker/tree/main/model_training/config

It contains parameters and choices related to training and inference (optimizer, learning rate scheduler, penalty_k, window influence, lr, to name a few). Could you please indicate which dataset was used to tune these hyperparameter values? Were they tuned on the test set itself?

Also, I am particularly intrigued by a statement in the paper: "For each epoch, we randomly sample 20,000 images from LaSOT, 120,000 from COCO, 400,000 from YoutubeBB, 320,000 from GOT10k and 310,000 images from the ImageNet dataset". Could you explain the reasoning behind this particular sampling split, rather than sampling uniformly across datasets?
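For context, here is a minimal sketch of how such a fixed per-epoch sampling split could be implemented. The quota values come from the quoted sentence in the paper; the function and variable names (`sample_epoch`, `EPOCH_QUOTAS`) are hypothetical and not taken from the FEARTracker code:

```python
import random

# Per-epoch quotas as stated in the paper (names/structure are assumptions).
EPOCH_QUOTAS = {
    "LaSOT": 20_000,
    "COCO": 120_000,
    "YoutubeBB": 400_000,
    "GOT10k": 320_000,
    "ImageNet": 310_000,
}

def sample_epoch(dataset_sizes, quotas, seed=None):
    """Draw a fixed number of frame indices per dataset for one epoch.

    `dataset_sizes` maps dataset name -> number of available frames.
    Sampling is with replacement, so a quota may exceed a dataset's size.
    Returns a shuffled list of (dataset_name, frame_index) pairs.
    """
    rng = random.Random(seed)
    epoch = []
    for name, quota in quotas.items():
        n_frames = dataset_sizes[name]
        epoch.extend((name, rng.randrange(n_frames)) for _ in range(quota))
    rng.shuffle(epoch)  # interleave datasets within the epoch
    return epoch
```

Under this reading, the split acts as a fixed sampling weight per dataset rather than weighting each dataset by its raw size, which is what motivates my question about uniform sampling.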
