
update sequential reco models: add serving example #1254

Merged
merged 6 commits into staging on Dec 2, 2020

Conversation

Leavingseason
Collaborator

@Leavingseason Leavingseason commented Nov 30, 2020

Description

Update deeprec's sequential recommender packages. Provide an example showing users how to use the trained model for serving.

Related Issues

#1233
#1068

Checklist:

  • I have followed the contribution guidelines and code style for this project.
  • I have added tests covering my contributions.
  • I have updated the documentation accordingly.
  • This PR is being made to staging and not master.


@miguelgfierro
Collaborator

hey @Leavingseason this is awesome, but the notebook is not loading for me, I think it might be because it has too many logs.

Do you think we can clean or reduce the output of the cell:

G = tf.Graph()
with tf.gfile.GFile(
        os.path.join(hparams.MODEL_DIR, "serving_model.pb"),
        'rb'
) as f, G.as_default():
    graph_def_optimized = tf.GraphDef()
    graph_def_optimized.ParseFromString(f.read())
    print('graph_def_optimized = ' + str(graph_def_optimized))


with tf.Session(graph=G) as sess:
    tf.import_graph_def(graph_def_optimized)

    model = LoadFrozedPredModel(sess.graph)
    
    serving_output_file = os.path.join(data_path, r'output_serving.txt')  
    iterator = input_creator(hparams, tf.Graph())
    infer_as_serving(model, test_file, serving_output_file, hparams, iterator, sess)

@Leavingseason
Collaborator Author

> hey @Leavingseason this is awesome, but the notebook is not loading for me, I think it might be because it has too many logs.
>
> Do you think we can clean or reduce the output of the cell: [code cell quoted above]

Sure. I have removed the printing of the graph content.

Collaborator

@miguelgfierro miguelgfierro left a comment


Awesome!

@miguelgfierro miguelgfierro merged commit 7171337 into staging Dec 2, 2020
@miguelgfierro miguelgfierro deleted the deeprec/seqreco_update branch December 2, 2020 14:57
@aidenpearce001

Why do I get the error `AssertionError: pred is not in graph` when running this code?

@Leavingseason
Collaborator Author

@aidenpearce001 are you using the latest code? In this PR, I updated the base model with one line of code: `pred = tf.identity(pred, name='pred')`
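For context, here is a minimal sketch (plain Python, not the actual deeprec code) of why that one line matters: the serving path looks the output tensor up by name in the frozen graph, so without an explicitly named `pred` tensor the lookup asserts.

```python
# Minimal sketch (NOT the actual deeprec code) of the name lookup that
# raises "pred is not in graph". The frozen graph is modeled here as a
# plain dict mapping tensor names to values.
def get_output_tensor(graph_tensors, name="pred"):
    # The serving helper expects a tensor with this exact name.
    assert name in graph_tensors, f"{name} is not in graph"
    return graph_tensors[name]

# Without tf.identity(..., name='pred'), the output keeps an
# auto-generated name and the lookup fails:
old_graph = {"Sigmoid_2:0": 0.83}
try:
    get_output_tensor(old_graph)
except AssertionError as e:
    print(e)  # pred is not in graph

# With the explicit name, the lookup succeeds:
new_graph = {"pred": 0.83}
print(get_output_tensor(new_graph))  # 0.83
```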

@aidenpearce001

> @aidenpearce001 are you using the latest code? In this PR, I update the base model with one line of code "pred = tf.identity(pred, name='pred')"

Thank you for helping me. I want to ask about the model output when using model.predict(): it just returns a number, but I want it to return a probability for each item so I can recommend a list of items to the user. How can I do that?

@Leavingseason
Collaborator Author

Each line is an instance of a <user, item> pair, so enumerate all the items you need to score for that user. E.g., if you have 100 items to rate, generate 100 lines of <user, item> pairs. The SLi-Rec model is for ranking purposes; it is not suitable for item retrieval.
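The advice above can be sketched end to end (hypothetical names; `score` stands in for the trained model's prediction call): enumerate one <user, item> pair per candidate item, score every pair, then sort the scores to get a ranked list.

```python
# Sketch of ranking with a pairwise scorer. `score` is a hypothetical
# stand-in for calling the trained model on one <user, item> pair.
def rank_items(user, candidate_items, score, top_k=10):
    # One <user, item> instance per candidate, mirroring the input file
    # format where each line is a single pair.
    pairs = [(user, item) for item in candidate_items]
    scored = [(i, score(u, i)) for u, i in pairs]
    # Sort by predicted score, highest first, and keep the top-k items.
    scored.sort(key=lambda x: x[1], reverse=True)
    return scored[:top_k]

# Toy scorer for illustration only; a real run would score each pair
# with the model instead.
toy_score = lambda user, item: 1.0 / (1 + abs(hash((user, item))) % 100)

top = rank_items("user_42", [f"item_{i}" for i in range(100)], toy_score, top_k=5)
print(len(top))  # 5
```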
