DepthAi: Access to NN FP16 output layer causes JVM dump #1155

Closed
ecmnet opened this issue Feb 25, 2022 · 5 comments


ecmnet commented Feb 25, 2022

Running a neural network via Java on DepthAI works fine as long as I do not try to access FP16 data.

Example:

```java
NNData seg = nn.getNNData();
FloatBuffer output_0 = seg.getLayerFp16("output_0");
float value = output_0.get(); // this access crashes the JVM
```

Any access to the data of this buffer fails and causes a JVM dump. The limit and capacity of the buffer seem to be correct at 10752 bytes (shown as a DirectByteBuffer), as I have 2688 FP16 values in the output layer (2688 * 4 bytes).

Could it be that the FP16 -> Java float conversion is not supported properly?

saudet (Member) commented Feb 25, 2022

Could you try using FloatPointer instead, something like `seg.getLayerFp16(new BytePointer("output_0")).get(0)`?
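
For reference, a minimal sketch of that approach (a hedged illustration; `seg` and the 2688-element size are taken from the report above, and the bulk copy at the end is optional):

```java
import org.bytedeco.javacpp.BytePointer;
import org.bytedeco.javacpp.FloatPointer;

NNData seg = nn.getNNData();
// The BytePointer overload returns a FloatPointer into native memory
// rather than a FloatBuffer.
FloatPointer output = seg.getLayerFp16(new BytePointer("output_0"));
float value = output.get(0);       // read a single element

// Optionally copy the whole layer onto the Java heap:
float[] values = new float[2688];  // 2688 values, per the report above
output.get(values);
```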

ecmnet (Author) commented Feb 25, 2022

@saudet Perfect, this seems to work. What's the background here?

saudet (Member) commented Feb 25, 2022

JavaCPP doesn't allocate memory for the Buffer objects it returns, and with the @StdVector annotation, the memory backing them gets released when the function returns, so the returned Buffer ends up pointing at freed memory. We could allocate memory with ByteBuffer.allocateDirect(), but then there isn't any way to deallocate that memory. We could also use Java arrays, but... I don't think there's a good way to go about this. Pointer is there for cases like that anyway, and it works well enough.
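
To illustrate the difference (a hedged sketch using plain JavaCPP, not the presets' generated code): a Pointer carries an explicit deallocator, so its native memory can be freed deterministically, whereas a direct ByteBuffer has no public way to release its memory early.

```java
import org.bytedeco.javacpp.FloatPointer;
import java.nio.ByteBuffer;

// A Pointer's native memory can be freed deterministically:
try (FloatPointer p = new FloatPointer(2688)) { // allocates 2688 floats natively
    p.put(0, 1.0f);
    float first = p.get(0);
}   // close() deallocates the native memory right here

// A direct ByteBuffer exposes no deallocate(); its memory is reclaimed
// only when the GC eventually collects the buffer object.
ByteBuffer b = ByteBuffer.allocateDirect(2688 * 4);
```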

ecmnet (Author) commented Feb 25, 2022

Thanks for your explanation. I'll close the issue.

saudet (Member) commented Feb 28, 2022

I've somewhat improved the mappings for this in commit 0dca099, so that getLayerFp16(String) now returns a float[]. This way there are no allocation issues, since everything is on the Java heap. Please give it a try with the snapshots: http://bytedeco.org/builds/
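
With that change, the original example reduces to something like this (a sketch; `seg` obtained as in the first post):

```java
// After commit 0dca099, the String overload returns a plain float[]
// copied onto the Java heap, so no native memory can outlive its owner.
NNData seg = nn.getNNData();
float[] output_0 = seg.getLayerFp16("output_0");
float value = output_0[0];
```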
