Add checkpoint warm_start_from configuration #21

Open
shfshf opened this issue Mar 2, 2020 · 2 comments

Comments

@shfshf (Contributor) commented Mar 2, 2020

In seq2annotation.trainer.train_model.py, pass warm_start_from=config.get("warm_start_dir", None) when constructing the Estimator:

    estimator = tf.estimator.Estimator(
        model_fn, instance_model_dir, cfg, estimator_params,
        # Warm-start from the configured checkpoint dir; None keeps the old behavior
        warm_start_from=config.get("warm_start_dir", None)
    )

Then add a warm_start_dir entry to the configure.yaml configuration.
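For reference, a minimal sketch of how the new key could be resolved (assuming the configuration is loaded with PyYAML; the loading code here is hypothetical, not necessarily seq2annotation's actual config reader):

    import yaml

    with open("configure.yaml") as f:
        config = yaml.safe_load(f)

    # Returns the checkpoint directory when warm_start_dir is set in
    # configure.yaml; otherwise None, so the Estimator trains from scratch.
    warm_start_dir = config.get("warm_start_dir", None)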

Informal testing shows this works without problems.

@xiaomihao

@shfshf, could you please tell us what the so-called warm_start_dir means? Why should this parameter be added to the function? What's the point?

@shfshf (Contributor, Author) commented Mar 2, 2020

warm_start_from:
Optional. Either a string checkpoint path indicating where to warm-start from, or a tf.estimator.WarmStartSettings object to fully configure the warm start. If a string path is given, all variables are warm-started, and it is assumed that Tensor names and vocabularies are unchanged.
Why we need this parameter: when a small amount of corpus is added or changed later, training can continue from the previous checkpoint instead of restarting from scratch.
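A minimal sketch of both forms (model_fn and the checkpoint paths are hypothetical placeholders):

    import tensorflow as tf  # TF 1.x Estimator API

    # Form 1: a plain string path warm-starts ALL variables from the
    # checkpoint; Tensor names and vocabularies must be unchanged.
    estimator = tf.estimator.Estimator(
        model_fn=model_fn,  # hypothetical model_fn
        model_dir="./model_new",
        warm_start_from="./model_old",
    )

    # Form 2: tf.estimator.WarmStartSettings allows fine-grained control,
    # e.g. warm-starting only the embedding variables.
    ws = tf.estimator.WarmStartSettings(
        ckpt_to_initialize_from="./model_old",
        vars_to_warm_start=".*embedding.*",  # regex over variable names
    )
    estimator = tf.estimator.Estimator(
        model_fn=model_fn,
        model_dir="./model_new",
        warm_start_from=ws,
    )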
