[Pytorch] How to create Example / ExampleStack / ExampleIterator from a tensor? #1273
I'm sure @HGuillemet or @sbrunk have some sample code for that somewhere?
I don't. I use my own utility classes for this kind of feature.
Same for
But then I don't think you can use an iterator on a stack. A stack is an example built from an array of examples.
Thanks, that's just what I want.
Let me see. The remaining question: how do I use ExampleIterator?
@mullerhai have a look at the MNIST example from @saudet for how you can use it: javacpp-presets/pytorch/samples/SimpleMNIST.java, lines 52 to 73 (commit e6140a9)
Or in Scala it would roughly look like this:

```scala
var it = dataLoader.begin()
while (!it.equals(dataLoader.end())) {
  val batch = it.access
  // do training step
  it = it.increment()
}
```
Example(Tensor data, Tensor target) gives a compile error saying Example doesn't have this constructor. How do I do that?
I need your help. I'm hitting many errors. Please show me one complete code template, from creating the tensors through building the Example. Thanks.
error console
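For reference, a minimal sketch of going from raw tensors to an `Example`, assuming the generated `org.bytedeco.pytorch` bindings are on the classpath; the `AbstractTensor.create` helpers and the two-argument `Example` constructor are assumptions to double-check against your presets version:

```java
// Sketch only: assumes org.bytedeco:pytorch-platform is on the classpath.
import org.bytedeco.pytorch.AbstractTensor;
import org.bytedeco.pytorch.Example;
import org.bytedeco.pytorch.Tensor;

public class ExampleFromTensors {
    public static void main(String[] args) {
        // A 2x2 data tensor and a one-element target tensor, built with the
        // AbstractTensor.create convenience helpers from the presets.
        Tensor data = AbstractTensor.create(new float[]{1f, 2f, 3f, 4f}, 2, 2);
        Tensor target = AbstractTensor.create(new float[]{1f}, 1);

        // Example pairs (data, target) for use with a dataset.
        Example example = new Example(data, target);
        System.out.println(example.data().dim());
        System.out.println(example.target().dim());
    }
}
```

Batching a vector of such examples into one stacked example is then the job of the Stack transform applied to a dataset, which matches the "example built from an array of examples" description above, rather than something done on a single Example.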
We can now use ChunkDataReader and ChunkDataset for this; see commit fa4dfdc.
Duplicate of #1215
I want to know when these will be released.
Sometime next year. In the meantime, snapshots are always available: http://bytedeco.org/builds/
OK, I will wait for the release. If convenient, please also add the sequenceSampler class. Thanks.
SequentialSampler? It's already there: https://github.com/bytedeco/javacpp-presets/blob/master/pytorch/src/gen/java/org/bytedeco/pytorch/SequentialSampler.java
So wonderful, very exciting. If one day TorchServe could support javacpp-pytorch, it would be a complete ML pipeline for a Java or Scala ML environment.
I think it's more likely to get it integrated into DJL than TorchServe, but either way someone needs to spend time (that is, money) on this...
@sbrunk What do you think we should be doing for serving? We need to reuse something that already exists... |
I think until we're able to export a model to TorchScript, we can't use an existing serving library. Both TorchServe and DJL, for example, need a TorchScript model (or a pure Python model, in the case of TorchServe) for inference. As far as I understand, the C++ API currently does not support methods like tracing or scripting to produce TorchScript modules. It might be possible to create them manually via the API, though, so it could be possible to build tracing functionality. HaskTorch, which also uses libtorch, seems to support tracing, but I guess it's a non-trivial effort.

It's always possible, though, to use any JVM REST/gRPC/... framework to build a service that does inference on a model written in Java, to provide a simple serving solution.
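As a sketch of that last option, here is a self-contained service using only the JDK's built-in `com.sun.net.httpserver.HttpServer`; the `predict` method is a hypothetical stand-in for a call into an actual JavaCPP PyTorch model:

```java
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

public class SimpleServing {
    // Hypothetical stand-in: a real service would run the request through
    // a JavaCPP PyTorch model here and serialize the resulting tensor.
    static String predict(String input) {
        return "{\"input\": \"" + input + "\", \"label\": 1}";
    }

    public static void main(String[] args) throws Exception {
        // Bind to an ephemeral port and expose a single /predict endpoint.
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/predict", exchange -> {
            String body = new String(exchange.getRequestBody().readAllBytes(),
                                     StandardCharsets.UTF_8);
            byte[] out = predict(body).getBytes(StandardCharsets.UTF_8);
            exchange.sendResponseHeaders(200, out.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(out);
            }
        });
        server.start();
        int port = server.getAddress().getPort();

        // Exercise the endpoint once, then shut the server down.
        HttpClient client = HttpClient.newHttpClient();
        HttpResponse<String> resp = client.send(
            HttpRequest.newBuilder(URI.create("http://localhost:" + port + "/predict"))
                       .POST(HttpRequest.BodyPublishers.ofString("hello"))
                       .build(),
            HttpResponse.BodyHandlers.ofString());
        System.out.println(resp.body());
        server.stop(0);
    }
}
```

Any framework (Spring, Vert.x, gRPC, ...) would do the same job; the point is only that nothing in the serving layer needs to know the model is a JavaCPP one.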
Actually, I might be wrong about DJL. While it seems to support only TorchScript and other standard model formats out of the box, perhaps providing an integration with JavaCPP PyTorch based models is not too difficult.
Right, the APIs of |
I'm sorry, I don't have context here. Are you trying to train your model in Java? DJL does support training in Java; you can try it out. But we do focus on optimization for inference. If you are interested in serving, you might want to take a look at DJLServing. DJLServing is a superset of TorchServe; it can even serve
Not just train models, but create them in Java as well. DJL doesn't support enough features to make it useful for that.
Right, that's what I thought. So if someone wanted to serve a model created with the C++ API of PyTorch, that someone would first need to update that stuff to make it work with such a model, right?
Hi,
without the dataset, I need to create an instance of the Example class, but I'm getting an error, and I can't find a tutorial for this. Please help.
error log