Having trouble compiling... #1
Yes. I'll try to make a build script for it. The mismatched versions of tensorflow included by syntaxnet, serving etc. make it very difficult to use this code. Plus the fact that each build takes 40 minutes. Let me see what I can cook up. |
Thanks; grateful for any help you can provide. |
@xtknight Ok. I've pushed some changes to the README.md with my exact build steps. I followed these steps in a clean checkout and everything worked for me. Can you see if it works for you? |
I followed your instructions, but somewhere after compilation starts I get an error about giflib.
What I never understand is that when I get these header errors the header is always in the build tree somewhere...it seems to be a persistent issue I've had. |
I think something is fundamentally buggy in the build system. If I use this command, it compiles fine. I also had a problem where bazel reported git said "connection timed out" when it clearly didn't even attempt a connection. So changing the git repo name back and forth got it to clear some sort of cache and start working again. I hope this comment helps someone else having seemingly nonsensical issues with the build.
I'm using bazel 0.3.0 installed from a deb file on Linux Mint.
Anyway, I appreciate your new guide. It clearly works if we can get past the bazel bugs or environmental issues or whatever we'd like to call them :).. I also found out a way of compiling from within tensorflow_serving and was able to update your patch to the new Tensorflow. It's quite easy and it seemed to work fine without downloading all the external dependencies. Just doing a recurse-submodule pull on serving and adding syntaxnet.bzl in WORKSPACE file within that tree seemed to be fine. I'll post a detailed guide on that, but I also ran into bugs when I did that method too. |
Oh, interesting. I have this in my ~/.bazelrc, which is based on the fact that I usually build inside docker where sandboxing is broken:
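(The rc contents aren't shown above, but based on the standalone flags used in the build commands later in this thread, a plausible sketch of such a ~/.bazelrc would be:)

```
# Sketch (assumed, not the author's actual file): force local execution
# instead of sandboxed execution for genrules and other actions.
build --genrule_strategy=standalone
build --spawn_strategy=standalone
```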
I wonder how the "standalone" stuff matters. .. But are you saying that the server is now compiled and working? |
Using 'standalone' I was able to get the parsey_api from my own methods working yesterday and it worked with the client and server. I tried with both an older and newer version of TensorFlow (updated my last comment). But I haven't tried the parsey_api from your new README method yet. I'm going to report on that...it seems to at least run. |
That's good news. Thanks. |
As a matter of fact, everything appears to work fine with your instructions!
andy@andy ~/Downloads/syntaxnet/parsey-mcparseface-api/parsey_client $ node ./index.js |
This was my method for getting things working. It involves just using syntaxnet within tensorflow_serving rather than externally, and then changing the patch a little bit. It may be useful as an alternative method, so I'll just leave it here.

Alternative method using tensorflow_serving's tensorflow, syntaxnet, and models submodules. Let's call our current directory BASE.

$ git clone https://github.com/dmansfield/parsey-mcparseface-api
$ git clone --recurse-submodules https://github.com/tensorflow/serving
(Make sure to follow the installation instructions and install prerequisites like gRPC from here: )
(Now we need to apply an updated version of dmansfield's patch. I have made an updated patch based on github PR 250 to work with tf_models submodule HEAD a4b7bb9a5dd2c021edcd3d68d326255c734d0ef0. It should apply CLEANLY to this revision. If your tf_models is not at this revision, check out that revision.)
The updated patch you need is here: pr250-patch-a4b7bb9a.diff.txt
The old patch is here: https://patch-diff.githubusercontent.com/raw/tensorflow/models/pull/250.diff
$ cd serving/tf_models
(Now configure tensorflow.)
$ cd ../tensorflow
$ cd ..
(Add the syntaxnet local repo, already included in the tensorflow_serving git tree.)
$ nano tensorflow_serving/workspace.bzl
(Append build instructions for parsey-api to the BUILD file.)
$ nano tensorflow_serving/example/BUILD
(Copy files from parsey-mcparseface-api.)
$ cp ../parsey-mcparseface-api/parsey_api/parsey_api.* ./tensorflow_serving/example/
parsey_mcparseface.py (for exporting the SyntaxNet model) is available here:
$ cp ~/Downloads/parsey_mcparseface.py ./tensorflow_serving/example/
$ nano tensorflow_serving/example/parsey_mcparseface.py
Fix the module path to contrib...
$ nano tensorflow_serving/example/parsey_api.cc
(Change the paths of these include files; these are dynamically generated .h files that end up in the same directory as parsey_api.cc.)
Next, ...
$ bazel --output_user_root=bazel_root build --nocheck_visibility -c opt -s //tensorflow_serving/example:parsey_api --genrule_strategy=standalone --spawn_strategy=standalone --verbose_failures
(Also compile parsey_mcparseface.py.)
$ bazel --output_user_root=bazel_root build --nocheck_visibility -c opt -s //tensorflow_serving/example:parsey_mcparseface --genrule_strategy=standalone --spawn_strategy=standalone --verbose_failures
Try to run the binary.
$ bazel-bin/tensorflow_serving/example/parsey_api
That's fine. Now let's make it work. Make sure the following directory tree exists: BASE/serving
$ ln -s ./tf_models/syntaxnet/syntaxnet .
Now try the server...
$ ./bazel-bin/tensorflow_serving/example/parsey_api --port=9000 /home/andy/Downloads/syntaxnet/parsey-mcparseface-api/parsey_model
Leave that running and run the nodejs parsey_client in another terminal.
(Go to the parsey_client folder.)
$ cd BASE/parsey-mcparseface-api/parsey_client
Make sure to edit the IP and port in index.js to match the port used for the server (127.0.0.1:9000), and install the grpc module for nodejs. I had to actually put the grpc folder in my parsey_client directory; I couldn't figure out how else to get things working. When you run node ./index.js on the client, the server should print the following:
|
In my case:
serving_proto_library(
name = "parsey_api_proto",
srcs = ["parsey_api.proto"],
deps = [
"@syntaxnet//syntaxnet:sentence_proto",
],
has_services = 1,
cc_api_version = 2,
cc_grpc_version = 1,
)
cc_binary(
name = "parsey_api",
srcs = [
"parsey_api.cc",
],
linkopts = ["-lm"],
deps = [
"@grpc//:grpc++",
"@org_tensorflow//tensorflow/core:core_cpu",
"@org_tensorflow//tensorflow/core:framework",
"@org_tensorflow//tensorflow/core:lib",
"@org_tensorflow//tensorflow/core:protos_all_cc",
"@org_tensorflow//tensorflow/core:tensorflow",
"@syntaxnet//syntaxnet:parser_ops_cc",
"@syntaxnet//syntaxnet:sentence_proto",
":parsey_api_proto",
"//tensorflow_serving/servables/tensorflow:session_bundle_config_proto",
"//tensorflow_serving/servables/tensorflow:session_bundle_factory",
"@org_tensorflow//tensorflow/contrib/session_bundle",
"@org_tensorflow//tensorflow/contrib/session_bundle:manifest_proto_cc",
"@org_tensorflow//tensorflow/contrib/session_bundle:signature",
],
)
- #include "tensorflow_serving/session_bundle/manifest.pb.h"
- #include "tensorflow_serving/session_bundle/session_bundle.h"
- #include "tensorflow_serving/session_bundle/signature.h"
+ #include "tensorflow/contrib/session_bundle/manifest.pb.h"
+ #include "tensorflow/contrib/session_bundle/session_bundle.h"
+ #include "tensorflow/contrib/session_bundle/signature.h"
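(The three include-path changes above can also be applied mechanically with sed. This is a sketch demonstrated on a temp copy, since your parsey_api.cc location may differ; point sed at the real file in your tree:)

```shell
# Sketch: rewrite the session_bundle include prefix, as in the diff above.
# Demonstrated on a temp copy so it's self-contained; in a real checkout,
# run the sed command against tensorflow_serving/example/parsey_api.cc.
f=$(mktemp)
printf '%s\n' \
  '#include "tensorflow_serving/session_bundle/manifest.pb.h"' \
  '#include "tensorflow_serving/session_bundle/session_bundle.h"' \
  '#include "tensorflow_serving/session_bundle/signature.h"' > "$f"
sed -i 's|tensorflow_serving/session_bundle/|tensorflow/contrib/session_bundle/|' "$f"
grep -c 'tensorflow/contrib/session_bundle' "$f"
```

(The `sed -i` syntax shown is GNU sed, matching the Linux setup described in this thread.)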
- install node from source https://nodejs.org/en/download/
- install grpc for node https://github.com/grpc/grpc/tree/master/src/node
$ npm install
...
grpc@0.14.1 node_modules/grpc
├── arguejs@0.2.3
├── nan@2.4.0
├── lodash@3.10.1
└── protobufjs@4.1.3 (glob@5.0.15, yargs@3.32.0, bytebuffer@4.1.0, ascli@1.0.0)
$ /usr/local/bin/node index.js
module.js:434
return process.dlopen(module, path._makeLong(filename));
^
Error: Module did not self-register.
at Error (native)
at Object.Module._extensions..node (module.js:434:18)
at Module.load (module.js:343:32)
at Function.Module._load (module.js:300:12)
at Module.require (module.js:353:17)
at require (internal/module.js:12:17)
at Object.<anonymous> (/path/to/parsey-mcparseface-api/parsey_client/node_modules/grpc/src/node/src/grpc_extension.js:38:15)
at Module._compile (module.js:409:26)
at Object.Module._extensions..js (module.js:416:10)
at Module.load (module.js:343:32)
But it failed... so I reinstalled grpc:
$ node --version
v4.4.7
$ npm --version
2.15.8
$ npm install grpc
$ npm install
$ cp -rf tf_models/syntaxnet/syntaxnet/sentence.proto ../parsey-mcparseface-api/parsey_client/api/syntaxnet/
# sentence.proto uses syntax 'proto2' and parsey_api.proto uses syntax 'proto3'
# I thought this might cause a problem, but it works fine.
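(For what it's worth, protoc does allow this mix: a proto3 file may import and use proto2 message types, though proto2 enums can't be referenced directly from proto3 fields. A sketch with a hypothetical message name, assuming sentence.proto declares package syntaxnet:)

```proto
// Sketch: a proto3 file importing the proto2 sentence.proto.
// The message name here is hypothetical; the import path matches the
// copy step above.
syntax = "proto3";

import "syntaxnet/sentence.proto";

message ParseyReply {
  // Embedding a proto2 message type in a proto3 message is allowed.
  repeated syntaxnet.Sentence sentence = 1;
}
```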
|
@dsindex Yes, I had some of those issues before. The good news is, I got parsey_mcparseface.py to compile with bazel properly. The trick is to use a bazel repository for syntaxnet but do NOT name the repository 'syntaxnet'; name it something else like 'org_syntaxnet' (like my previous guide, but change the repo name in the workspace.bzl and BUILD files). Otherwise the PYTHONPATH doesn't work: Python module imports get confused by the path xxx.runfiles/syntaxnet/syntaxnet; it must be xxx.runfiles/something_else/syntaxnet. I will post another small guide about it, and then maybe @dmansfield can also put it in the README in lieu of the forced-PYTHONPATH method. I updated my guide above, but this is the gist... tensorflow_serving/workspace.bzl
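(A sketch of the renamed repository entry; the path follows the local_repository shown later in this thread, and the name change is the whole point:)

```python
# tensorflow_serving/workspace.bzl (sketch): register syntaxnet under a name
# other than "syntaxnet" so the runfiles path becomes
# xxx.runfiles/org_syntaxnet/syntaxnet instead of xxx.runfiles/syntaxnet/syntaxnet.
native.local_repository(
    name = "org_syntaxnet",
    path = workspace_dir + "/tf_models/syntaxnet",
)
```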
Add to tensorflow_serving/example/BUILD (I don't know if each dep is required, but at least this works)
|
For fixing the concurrent-requests problem, it seems like just adding this code in parsey_api.cc makes it work fine. I flooded it like crazy and there was no problem after adding this. But I don't know the precise limits; it may require further investigation.
.... add:
....
|
Hi all, after fiddling around with all of this for a while, I got the server running, but whenever the client makes a request to it, it shuts down with this error... Any ideas?
|
Help, anyone? |
@mastasky https://github.com/dsindex/syntaxnet/blob/master/README_api.md |
@dsindex: Thanks! You're referencing some files I don't have: api/modified_workspace.bzl. Where would I find them?
@dsindex Thanks. Now I get this when executing bazel from the serving directory. |
I modified the instructions.
# You can create a shell script with the content below.
$ git clone https://github.com/dsindex/syntaxnet.git work
$ cd work
$ git clone --recurse-submodules https://github.com/tensorflow/serving
# you need to install gRPC properly
# https://tensorflow.github.io/serving/setup
# if you have a trouble, see https://github.com/dsindex/tensorflow#tensorflow-serving
# apply patch by dmansfield to serving/tf_models/syntaxnet
$ cd serving/tf_models
$ patch -p1 < ../../api/pr250-patch-a4b7bb9a.diff.txt
$ cd ../../
# configure serving/tensorflow
$ cd serving/tensorflow
$ ./configure
$ cd ../../
# modify serving/tensorflow_serving/workspace.bzl for referencing syntaxnet
$ cp api/modified_workspace.bzl serving/tensorflow_serving/workspace.bzl
$ cat api/modified_workspace.bzl
# ...
# native.local_repository(
# name = "syntaxnet",
# path = workspace_dir + "/tf_models/syntaxnet",
# )
# ...
# append build instructions to serving/tensorflow_serving/example/BUILD
$ cat api/append_BUILD >> serving/tensorflow_serving/example/BUILD
# copy parsey_api.cc, parsey_api.proto to example directory to build
$ cp api/parsey_api* serving/tensorflow_serving/example/
# build parsey_api
$ cd serving
$ bazel --output_user_root=bazel_root build --nocheck_visibility -c opt -s //tensorflow_serving/example:parsey_api --genrule_strategy=standalone --spawn_strategy=standalone --verbose_failures
# make softlink for referencing 'syntaxnet/models/parsey_mcparseface/context.pbtxt'
$ ln -s ./tf_models/syntaxnet/syntaxnet syntaxnet
# run parsey_api with exported model
$ ./bazel-bin/tensorflow_serving/example/parsey_api --port=9000 ../api/parsey_model

And I found the reason for the "Not found: syntaxnet/models/parsey_mcparseface/context.pbtxt" problem: you need to make a softlink.
$ cd serving
$ ln -s ./tf_models/syntaxnet/syntaxnet syntaxnet |
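(To sanity-check that the softlink makes context.pbtxt reachable before starting the server, here's a sketch. It builds a throwaway tree mimicking the layout above so it's self-contained; in a real checkout, just run the `ln -s` and the `[ -e ... ]` check from the serving directory:)

```shell
# Sketch: verify context.pbtxt resolves through the symlink.
# The directory names come from the guide above; the throwaway tree is
# only here so the check can run anywhere.
base=$(mktemp -d)
mkdir -p "$base/serving/tf_models/syntaxnet/syntaxnet/models/parsey_mcparseface"
touch "$base/serving/tf_models/syntaxnet/syntaxnet/models/parsey_mcparseface/context.pbtxt"
cd "$base/serving"
ln -s ./tf_models/syntaxnet/syntaxnet syntaxnet
[ -e syntaxnet/models/parsey_mcparseface/context.pbtxt ] && echo "context.pbtxt reachable"
```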
@dsindex: You are a star! It works perfectly! Thanks so much, this is so much easier than any other solution. |
I propose using gist.github.com to keep scripts in sync. The gists can be forked / updated / worked on together. I drafted this updated Dockerfile, which does build as long as you throw enough RAM at it. Need to merge this with the updated @dsindex script above (in progress). |
@dsindex I'm getting this error
|
It was the same issue as dsindex/syntaxnet#10, so I updated README_api.md.
updated! (2016. 10. 19. 11:27)
|
thanks @dsindex |
UPDATE - finally building thanks to @dsindex @xiamx https://gist.github.com/johndpope/fc1c2327a4ae255d9c44dda9b67b5288 Optional Docker file |
@dsindex hey thanks. I'm using @johndpope 's script. @johndpope Sorry, I tried the new script. It's still giving the same error. |
@dsindex it worked :) Thanks a lot. @johndpope Additionally, I had to install gRPC, and your script works fine after that. Thanks a lot :) |
@xtknight The patch pr250-patch-a4b7bb9a.diff.txt no longer works for the latest syntaxnet; is there an updated one? Thanks |
I've been struggling with trying to get this to compile for days. (Well, there are so many errors I've gotten that it probably wouldn't be useful to list them all.) But it seems like tensorflow_serving uses a different version of tensorflow than syntaxnet, and when it gets to the linking stage I get a bunch of linking errors on tensorflow .so files.
Additionally, the tensorflow that syntaxnet uses seems to differ widely from the one tensorflow_serving uses. For one thing, protobuf is completely missing.
http://stackoverflow.com/questions/37799563/how-do-i-register-custom-op-actually-from-syntaxnet-with-tensorflow-serving
So the next thing I tried was isolating parsey_api and trying to get it to compile in the tensorflow_examples tree and used the advice above to try to pull in syntaxnet to the serving WORKSPACE. I had some success but I'm stuck with this error:
Do you have instructions on exactly what versions of all the components I need? Or is there any way you can come up with a script that pulls the proper revisions from git? All I want is to have something that just works and it doesn't need to be the latest versions of everything.
I'm not sure if there is an error in your link, but I got confused at this step.
It would be nice to have an overview tree diagram of a working configuration, showing all directory roots and the working revision of each submodule specifically. Because there are nested tensorflow and syntaxnet folders everywhere, I'm getting turned around...
Thanks for your hard work!