
Make MultiOptimizer serializable #2719

Merged

Conversation

@JackWindows (Contributor) commented May 31, 2022

Description

The current implementation of MultiOptimizer puts grads and vars inside the optimizer_specs variable. Since the grads and vars are not serializable, any model that uses the MultiOptimizer class cannot be saved or checkpointed. This PR extracts, inside the get_config() method, only the optimizer_specs components needed to re-instantiate the MultiOptimizer instance, making MultiOptimizer serializable.
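To illustrate the idea, here is a minimal sketch of the kind of filtering the description outlines, written as an illustrative subclass; the real change patches MultiOptimizer.get_config() directly, and the "gv" key name for the per-step (gradient, variable) pairs is an assumption about the internal spec layout, not taken from the diff:

```python
import tensorflow_addons as tfa


class SerializableMultiOptimizer(tfa.optimizers.MultiOptimizer):
    """Illustrative only: the PR patches MultiOptimizer.get_config itself.
    The "gv" key for (gradient, variable) pairs is an assumption."""

    def get_config(self):
        config = super().get_config()
        # Keep only the data needed to re-instantiate each spec
        # (wrapped optimizer, matched weights); drop the unserializable
        # per-step gradient/variable pairs.
        config["optimizer_specs"] = [
            {key: value for key, value in spec.items() if key != "gv"}
            for spec in self.optimizer_specs
        ]
        return config
```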

Fixes # (issue)

Type of change

Checklist:

  • I've properly formatted my code according to the guidelines
    • By running Black + Flake8
    • By running pre-commit hooks
  • This PR addresses an already submitted issue for TensorFlow Addons
  • I have made corresponding changes to the documentation
  • I have added tests that prove my fix is effective or that my feature works
  • This PR contains modifications to C++ custom-ops

How Has This Been Tested?

If you're adding a bugfix or new feature, please describe the tests that you ran to verify your changes:

I have tested the model.save() method after this change. Prior to this change, a trained model (one that has run at least one training step) with MultiOptimizer could not be saved because of the unserializable grads and vars. With the fix, I can verify that model.save() works.
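For reference, a minimal sketch of that verification flow (the model layout, optimizer/layer pairing, and save path are illustrative, not taken from the PR):

```python
import tensorflow as tf
import tensorflow_addons as tfa

# Illustrative two-layer model; the layer/optimizer pairing is arbitrary.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(4, input_shape=(8,)),
    tf.keras.layers.Dense(1),
])
optimizer = tfa.optimizers.MultiOptimizer([
    (tf.keras.optimizers.Adam(1e-3), model.layers[0]),
    (tf.keras.optimizers.SGD(1e-2), model.layers[1]),
])
model.compile(optimizer=optimizer, loss="mse")

# One training step populates the grads/vars inside optimizer_specs,
# which is what made serialization fail before this change.
x, y = tf.random.normal((2, 8)), tf.random.normal((2, 1))
model.fit(x, y, epochs=1, verbose=0)

model.save("multi_opt_model")  # failed before the fix; succeeds after
```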

@google-cla bot commented May 31, 2022

Thanks for your pull request! It looks like this may be your first contribution to a Google open source project. Before we can look at your pull request, you'll need to sign a Contributor License Agreement (CLA).

View this failed invocation of the CLA check for more information.

For the most up-to-date status, view the checks section at the bottom of the pull request.

@bot-of-gabrieldemarmiesse

@hyang0129

You are the owner of some files modified in this pull request.
Would you kindly review the changes whenever you have the time?
Thank you very much.

@JackWindows force-pushed the make_multioptimizer_serializable branch from 2529b5c to ba536e0 on May 31, 2022 at 15:14
@bhack (Contributor) commented Jun 24, 2022

Can you add a serialization test?

@JackWindows (Contributor, Author) commented

> Can you add a serialization test?

A serialization test already exists (code ref). The problem is that the grads and vars are not populated until the model has been trained for at least one iteration, so the existing test doesn't catch the failure case. However, training the model for an iteration inside a unit test feels a bit heavyweight.

Any suggestions on how to deal with this?

@bhack (Contributor) commented Jun 24, 2022

You could extend the test. We have some other test cases where we train the model for a few steps, so you could do something similar.

@JackWindows (Contributor, Author) commented
Yes, you are right. Sorry, I didn't read enough of the code to realize we can fit the model during testing.

I have added a test that checks serializability, and I have verified that the test does not pass without this PR.
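Roughly, such a test might look like the sketch below; the test name and the round trip through the Keras serialize/deserialize helpers are illustrative assumptions, not the exact test added in this PR:

```python
import tensorflow as tf
import tensorflow_addons as tfa


def test_serialize_multi_optimizer_after_fit():
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(4, input_shape=(8,)),
        tf.keras.layers.Dense(1),
    ])
    opt = tfa.optimizers.MultiOptimizer([
        (tf.keras.optimizers.Adam(), model.layers[0]),
        (tf.keras.optimizers.SGD(), model.layers[1]),
    ])
    model.compile(optimizer=opt, loss="mse")
    # Fit one step so the grads/vars land in optimizer_specs; this is the
    # state the pre-existing test never reached.
    model.fit(tf.zeros((2, 8)), tf.zeros((2, 1)), epochs=1, verbose=0)

    # Round-trip through the Keras serialization helpers; before this PR
    # the config carried raw gradient/variable objects and the round
    # trip failed.
    config = tf.keras.optimizers.serialize(opt)
    restored = tf.keras.optimizers.deserialize(config)
    assert isinstance(restored, tfa.optimizers.MultiOptimizer)
```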

@JackWindows (Contributor, Author) commented
@bhack Is there anything more I can do to help get this PR merged?

@bhack (Contributor) commented Jul 3, 2022

Thanks
