Can we get batch data using df.to_pandas() in the case of big data? #345
Comments
This is a great feature suggestion; would you be interested in implementing it? I can assist if needed.
Yeah, I am very happy to implement it and open a pull request. Related to this pull request: please help check whether the code conforms to the specification. Thanks for your help!
Hi guys!
@NickolayVasilishin Thanks for the suggestion. Chatted with the team about this and I have a few things to report:
@sethmlarson thanks for the reply. Yes, exactly, so that's why I'm talking about having a … Currently, I'm patching …
@NickolayVasilishin Gotcha, I assumed that's what you meant, but maybe I should try doing some testing myself over large data sets. I'll report back with my findings, thanks! Also, I wouldn't recommend depending on anything in the …
@sethmlarson thanks! I'd be happy to help with that. Yes, it's pretty clear that this function is private and that patching it is very risky in terms of version compatibility, I think, so no need for additional marks on that.
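The approach discussed above, exposing results as chunks instead of one monolithic DataFrame and avoiding patches to private internals, could be sketched as a generator that yields DataFrame batches. This is a hypothetical helper, not eland's actual API: to_pandas_in_batches is an invented name, and the in-memory rows list stands in for an Elasticsearch scroll that a real implementation would page through.

```python
from typing import Dict, Iterator, List

import pandas as pd


def to_pandas_in_batches(rows: List[Dict], batch_size: int) -> Iterator[pd.DataFrame]:
    """Hypothetical helper: yield a result set as pandas DataFrame chunks.

    `rows` is a stand-in for an Elasticsearch result set; a real version
    would pull each slice via the search/scroll API rather than slicing a list.
    """
    for start in range(0, len(rows), batch_size):
        yield pd.DataFrame(rows[start:start + batch_size])


# Usage: consume one chunk at a time so peak memory stays bounded.
rows = [{"user": f"u{i}", "clicks": i} for i in range(10)]
chunks = list(to_pandas_in_batches(rows, batch_size=4))
# 10 rows with batch_size=4 -> chunk sizes 4, 4, 2
```

A generator keeps only the current batch resident, which is the whole point of the feature request for data sets too large to materialize at once.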
Hello everybody, I have a question about the DataFrame.to_pandas() API: can we get batch data using df.to_pandas() in the case of big data?
For example: there are 100 million rows of data in Elasticsearch. Then please look at the code below. Can we use it like this? Is there any other way to do the same thing? Thanks!
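The asker's code sample did not survive the page capture, but the pattern being asked about is presumably incremental consumption: pull batches and aggregate as you go, so 100 million rows never sit in memory at once. A minimal runnable sketch, where iter_chunks is a hypothetical stand-in for whatever batched API eland might expose (here it synthesizes rows locally instead of querying Elasticsearch):

```python
import pandas as pd


def iter_chunks(total_rows: int, batch_size: int):
    """Stand-in for a batched to_pandas(): yields DataFrame chunks.

    A real version would fetch each chunk from Elasticsearch; we generate
    synthetic rows so the consumption pattern itself can run anywhere.
    """
    for start in range(0, total_rows, batch_size):
        n = min(batch_size, total_rows - start)
        yield pd.DataFrame({"clicks": range(start, start + n)})


# Aggregate incrementally: only one chunk is resident at a time.
total_clicks = 0
rows_seen = 0
for chunk in iter_chunks(total_rows=1000, batch_size=256):
    total_clicks += int(chunk["clicks"].sum())
    rows_seen += len(chunk)
# rows_seen == 1000; total_clicks == sum(range(1000)) == 499500
```

The same loop body works whether a chunk holds 256 rows or 10,000, which is why a chunked API scales to the 100-million-row case in the question.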