
Optimal strategy to finetune #167

Answered by janosh
rryltsev asked this question in Q&A
  1. How many configurations should I use for fine-tuning?

Generally the more, the better, but depending on your problem, even a single fine-tuning label can be enough.

  2. How to set the batch size and the number of epochs for a given number of configurations?

A small batch size (e.g. 8) is probably preferred for fine-tuning. Train for as many epochs as the validation loss keeps dropping, then stop.
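Training "as long as the validation loss keeps dropping" is plain early stopping. A minimal sketch of that stopping rule, assuming a per-epoch list of validation losses (the loss values and the `patience` setting below are illustrative, not from CHGNet):

```python
def best_stopping_epoch(val_losses, patience=3):
    """Return the epoch with the lowest validation loss, stopping the
    scan once the loss has failed to improve for `patience` epochs."""
    best_loss = float("inf")
    best_epoch = 0
    for epoch, loss in enumerate(val_losses):
        if loss < best_loss:
            best_loss = loss
            best_epoch = epoch
        elif epoch - best_epoch >= patience:
            break  # no improvement for `patience` epochs: stop training
    return best_epoch

# Illustrative validation-loss curve: improves, then plateaus.
losses = [1.0, 0.6, 0.45, 0.44, 0.46, 0.47, 0.48]
print(best_stopping_epoch(losses))  # → 3
```

The same logic is what built-in early-stopping callbacks in most training frameworks implement; the key knob is `patience`, which trades a little wasted compute for robustness to noisy validation curves.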

  3. Should I freeze some layers of CHGNet during fine-tuning, or is it better to retrain all the weights?

Generally, the fewer fine-tuning labels you have, the fewer layers should be unfrozen. If you have >10^4 labels, I would unfreeze the whole model. If you have 1, only unfreeze the final layer. In between, it's a guessing game.
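Since CHGNet is a PyTorch model, freezing layers comes down to setting `requires_grad = False` on the parameters you want fixed. A minimal sketch using a stand-in model (CHGNet's actual submodule names differ and should be looked up with `model.named_parameters()`):

```python
import torch.nn as nn

# Stand-in model; in CHGNet the "earlier" layers would be the graph
# convolution blocks and the "final" layer the readout head.
model = nn.Sequential(
    nn.Linear(8, 16),   # earlier layer: freeze when labels are scarce
    nn.ReLU(),
    nn.Linear(16, 1),   # final layer: left trainable here
)

# Freeze everything, then unfreeze only the last layer.
for param in model.parameters():
    param.requires_grad = False
for param in model[-1].parameters():
    param.requires_grad = True

trainable = [name for name, p in model.named_parameters() if p.requires_grad]
print(trainable)  # → ['2.weight', '2.bias']
```

When building the optimizer afterwards, pass only the trainable parameters, e.g. `filter(lambda p: p.requires_grad, model.parameters())`, so the frozen weights are never updated.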

Answer selected by janosh