
[Pytorch] Fix StringSupplier and TensorIdGetter returning pointers to local variables #1391

Merged — 4 commits into bytedeco:master from hg_fp_returning_string, Aug 8, 2023

Conversation

HGuillemet (Collaborator)

The JNI code for function pointers returning a String ends with

 return StringAdapter< char >(rptr, rsize, rowner);

Without the @Cast added by this PR, the function's return type is std::basic_string<char>&, and the conversion from StringAdapter yields a reference to a variable on the stack. With the @Cast, the function returns a char* and the StringAdapter conversion operator instead allocates a new string on the heap.

Is there a more definitive fix to make in Generator for this?
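
For readers outside the Generator internals, here is a minimal C++ sketch of the bug pattern, with a hypothetical Adapter standing in for JavaCPP's StringAdapter (illustrative only, not the actual generated code):

    // Hypothetical Adapter, not JavaCPP's StringAdapter: the adapter is
    // constructed on the stack of the JNI stub, so returning a reference
    // to its internal std::string dangles once the stub returns.
    #include <cstring>
    #include <string>

    struct Adapter {
        std::string str;
        Adapter(const char* ptr, std::size_t size) : str(ptr, size) {}

        // Buggy shape: the reference points into this stack-local object.
        operator std::string&() { return str; }

        // Fixed shape: hand back a heap copy that outlives the stub, which
        // is what the @Cast("char*") added by this PR makes the stub do.
        operator char*() {
            char* copy = new char[str.size() + 1];
            std::memcpy(copy, str.c_str(), str.size() + 1);
            return copy;
        }
    };

    std::string& brokenSupplier() {
        Adapter a("hello", 5);
        return a;   // reference binds to a.str, destroyed at return: dangling
    }

    char* fixedSupplier() {
        Adapter a("hello", 5);
        return a;   // heap-allocated copy survives; the caller owns and frees it
    }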

@HGuillemet HGuillemet changed the title Fix StringSupplier and TensorIdGetter returning pointers to local variables [Pytorch] Fix StringSupplier and TensorIdGetter returning pointers to local variables Jul 25, 2023
HGuillemet (Collaborator, Author)

Maybe we'd better take and return BytePointer instead of String for function pointers taking and returning a std::string.
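
A hedged sketch of what that would mean at the native boundary; these typedefs are illustrative, not actual Generator output:

    // Illustrative only: a callback declared in terms of std::string goes
    // through a StringAdapter in the JNI stub (with the lifetime pitfall this
    // PR fixes), while a char*-based signature maps one-to-one onto
    // BytePointer with no adapter involved.
    #include <string>

    typedef std::string (*StdStringSupplier)();  // needs adapter + @Cast handling
    typedef const char* (*CharSupplier)();       // maps directly to BytePointer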

HGuillemet (Collaborator, Author)

@saudet Any comment, or could you merge?

saudet (Member) commented Aug 8, 2023

Looks OK I guess, but could you please fix these things in larger pull requests before merging them, instead of making a lot of mini pull requests like this?

HGuillemet (Collaborator, Author)

That's the opposite of what you prefer for the javacpp repo.

Do you want me to append the commit from PR #1392 to this PR?

saudet (Member) commented Aug 8, 2023

No, I mean: why didn't you add that in pull #1360? If you just want to improve things further, then yes, start another mega pull request and modify everything you want in there, and I guess we won't merge it for a few months until you stop "fixing" things. My point is: if you want to add new features and such, let's make multiple pull requests; if you just want to keep fixing things, fix them before merging the relevant pull request.

HGuillemet (Collaborator, Author) commented Aug 8, 2023

Well, you did merge PR #1360, and the problems solved by this PR and PR #1392 were found afterward.
Shall I push more commits to the merged PR?

saudet (Member) commented Aug 8, 2023

No, I just mean: try not to merge buggy code that you might want to fix later. If you don't mind that the code is buggy, I guess that's fine, but if you intend to fix it later, then rather than opening many mini pull requests afterward, try not to merge buggy code in the first place, that's all.

HGuillemet (Collaborator, Author)

OK, but that was not the case. I didn't push a PR containing problems I was aware of and wanted to fix later, did I?

saudet (Member) commented Aug 8, 2023

I'm not saying you're doing it intentionally, I'm just saying you're creating a lot of unnecessary commits, so if you could stop doing that, it would be nicer.

saudet merged commit bdba6a2 into bytedeco:master on Aug 8, 2023
3 of 5 checks passed
HGuillemet deleted the hg_fp_returning_string branch on August 8, 2023, 20:40
mullerhai

> Well, you did merge PR #1360, and the problems solved by this PR and PR #1392 were found afterward. Shall I push more commits to the merged PR?

I want to know how to make PyTorch via JavaCPP train distributed across multiple machines (data parallel?), and whether we need to bring OpenMPI/MPI into JavaCPP. Thanks @HGuillemet @saudet
