
Embed more metadata in exported model and read it in native client #2007

Merged: 1 commit merged into master, Apr 5, 2019

Conversation

reuben (Contributor) commented Apr 5, 2019

No description provided.

@reuben reuben requested a review from lissyx April 5, 2019 14:00
@@ -724,6 +724,17 @@ def do_graph_freeze(output_file=None, output_node_names=None, variables_blacklis
if not FLAGS.export_tflite:
frozen_graph = do_graph_freeze(output_node_names=output_names, variables_blacklist='previous_state_c,previous_state_h')
frozen_graph.version = int(file_relative_read('GRAPH_VERSION').strip())

# Add a no-op node to the graph with metadata information to be loaded by the native client
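The approach in the diff above can be sketched in pure Python, without TensorFlow: a graph is treated as a list of node records, a metadata-only node is appended at export time, and the client later finds it by name. The node name `model_metadata` and the attribute keys here are illustrative assumptions, not necessarily the names this PR uses.

```python
# Pure-Python stand-in for embedding metadata in a graph as a no-op node
# and reading it back the way a native client might. Node and attribute
# names are assumptions for illustration only.

def add_metadata_node(graph_nodes, language, graph_version):
    """Append a no-op node whose attributes carry model metadata."""
    graph_nodes.append({
        'name': 'model_metadata',
        'op': 'NoOp',
        'attr': {
            'language': language.encode('utf-8'),
            'graph_version': graph_version,
        },
    })

def read_metadata(graph_nodes):
    """Scan the graph for the metadata node; return its attrs or None."""
    for node in graph_nodes:
        if node['name'] == 'model_metadata':
            return node['attr']
    return None

graph = [{'name': 'input_node', 'op': 'Placeholder', 'attr': {}}]
add_metadata_node(graph, language='en', graph_version=4)
metadata = read_metadata(graph)
print(metadata['language'])  # b'en'
```

Because the metadata lives in a node that performs no computation, it rides along in the frozen graph without affecting inference.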
lissyx (Collaborator) commented:

And we don't break with TF Lite? nice :)

reuben (Contributor, Author) replied:

Nope. We don't get the info in the TFLite graph either, unfortunately. I'll try to find a workaround when I have some time.

@@ -73,6 +77,7 @@ def create_flags():
tf.app.flags.DEFINE_boolean ('export_tflite', False, 'export a graph ready for TF Lite engine')
tf.app.flags.DEFINE_boolean ('use_seq_length', True, 'have sequence_length in the exported graph (will make tfcompile unhappy)')
tf.app.flags.DEFINE_integer ('n_steps', 16, 'how many timesteps to process at once by the export graph, higher values mean more latency')
tf.app.flags.DEFINE_string ('export_model_language', '', 'language the model was trained on. Gets embedded into exported model.')
lissyx (Collaborator) commented:

I'd like us to pick a different name; this one is confusing. I thought it was the language model we were embedding, and I had to re-read it carefully several times. Can the option be named "add_language_metadata"?

reuben (Contributor, Author) replied:

How about this:

tf.app.flags.DEFINE_string ('export_language', '', 'language the model was trained on e.g. "en" or "English". Gets embedded into exported model.')

lissyx (Collaborator) replied:

WFM, feels less confusing
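The flag discussed above only carries a string; at export time its value (when non-empty) would be folded into the embedded metadata. Here is a minimal pure-Python sketch of that flow, where `SimpleNamespace` stands in for `tf.app.flags.FLAGS` and `build_metadata` is a hypothetical helper, not code from this PR.

```python
# Sketch: collect metadata from export flags. FLAGS is a stand-in for
# tf.app.flags.FLAGS; build_metadata is a hypothetical helper.
from types import SimpleNamespace

FLAGS = SimpleNamespace(export_language='en', export_tflite=False)

def build_metadata(flags):
    """Return only the metadata fields that were actually set."""
    metadata = {}
    if flags.export_language:
        metadata['language'] = flags.export_language
    return metadata

print(build_metadata(FLAGS))  # {'language': 'en'}
```

Leaving the flag at its default empty string simply yields no language entry, so older export workflows are unaffected.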

lissyx (Collaborator) left a review:

LGTM; make sure tests are green, and maybe change the flag name for exporting the language metadata.

@reuben reuben merged commit 7f6fd8b into master Apr 5, 2019
reuben (Contributor, Author) commented Apr 5, 2019

OSX workers are acting up but the relevant code has been exercised in other completed tests. Merged manually to avoid starting another group here in the PR.

@reuben reuben deleted the more-metadata branch April 5, 2019 17:40
lock bot commented May 5, 2019

This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.

The lock bot locked and limited the conversation to collaborators on May 5, 2019.