TF 2.0 Transformer: Remove metrics hack #6961
Conversation
LGTM, see comment.
self.vocab_file = os.path.join(temp_dir, "vocab")
self.vocab_size = misc.get_model_params(FLAGS.param_set, 0)["vocab_size"]
self.bleu_source = os.path.join(temp_dir, "bleu_source")
self.bleu_ref = os.path.join(temp_dir, "bleu_ref")

def _assert_exists(self, filepath):
  self.assertTrue(os.path.exists(filepath))

def test_train(self):
#1: thank you for working on these tests.
Question: do we need this test if we are not checking anything? My assumption is that we will catch exceptions via the other tests. Not remotely against the test, just asking.
Hm, these tests basically verify that we get to the end of training without running into any errors.
I don't know of any other tests that verify this for the transformer; are there any?
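The "run training to the end and pass if nothing raises" smoke-test pattern discussed above can be sketched with plain `unittest`; the `train()` helper here is a hypothetical stand-in for the transformer training loop, not the actual code under review:

```python
import unittest


def train(steps):
    # Hypothetical stand-in for the transformer training loop: any
    # exception raised here would fail the smoke test below.
    completed = 0
    for _ in range(steps):
        completed += 1
    return completed


class TransformerSmokeTest(unittest.TestCase):
    def test_train(self):
        # No explicit assertion on model quality: the test's value is
        # simply that training completes without raising.
        train(steps=2)


# Run the suite programmatically rather than via unittest.main().
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TransformerSmokeTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

This is the trade-off the reviewer is asking about: such a test catches crashes in the training codepath, but any correctness checks have to live in other, assertion-bearing tests.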
The underlying issue has been fixed, so we can now create and update metrics in cross-replica context.
Also fixed the existing unit tests and added more tests that exercise the distribution strategy codepaths as well.