Currently:

- User types in prompt
- Prompt is sent to backend, which calls ChatGPT
- ChatGPT looks up Wikipedia / a geocoder, whatever it needs to formulate the STAC query
- Backend calls that STAC endpoint and returns the JSON results.
What I would propose:

- User types in prompt
- Prompt is sent to backend, which calls ChatGPT
- ChatGPT looks up Wikipedia / a geocoder, whatever it needs to formulate the STAC query
- Backend then just sends those query parameters to the frontend, e.g. `{"bbox": [1, 1, 1, 2], "datetime": ["...", "..."], ...}`
- The frontend takes those STAC query parameters and calls the STAC API itself
- Additionally / optionally, the frontend also takes the response from the STAC API, condenses it down to a CSV or JSON with just the relevant data, and calls a separate `/summarize` endpoint on the backend.
- That endpoint summarizes the STAC result in natural language and returns it, so the frontend can display a natural-language summary of the results.
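To make the handoff concrete, here is a minimal TypeScript sketch of the frontend side of this proposal. The parameter shape and the `buildStacSearch` helper are hypothetical, assuming the backend returns just the query parameters and the frontend builds the STAC `/search` request body itself:

```typescript
// Hypothetical shape of the backend's response: just the STAC query
// parameters, not the full search results.
interface StacQueryParams {
  bbox?: [number, number, number, number];
  datetime?: string; // STAC interval string, e.g. "2023-01-01/2023-06-30"
  collections?: string[];
}

// Build the POST body for a STAC API /search call from those params.
// Only defined fields are included, and a result limit keeps the
// response small enough to condense for summarization later.
function buildStacSearch(
  params: StacQueryParams,
  limit = 20
): Record<string, unknown> {
  const body: Record<string, unknown> = { limit };
  if (params.bbox) body.bbox = params.bbox;
  if (params.datetime) body.datetime = params.datetime;
  if (params.collections) body.collections = params.collections;
  return body;
}
```

The frontend would then POST this body to the STAC endpoint, e.g. `fetch(`${stacRoot}/search`, { method: "POST", headers: { "Content-Type": "application/json" }, body: JSON.stringify(buildStacSearch(params)) })`, where `stacRoot` is whatever STAC API we are targeting.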
Some context:

Right now we can't easily get a summary of the STAC response because the response is too big and overflows ChatGPT's token limit. We could do all of this on the backend, but I think it makes sense to split the summary into a separate endpoint with its own ChatGPT prompt geared toward summarizing STAC results. It also makes sense for the frontend to do the actual STAC query, offloading that work from the backend.
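As a sketch of the condensing step that would keep the summarize payload under the token limit (the `condenseStacItems` helper and the field selection are assumptions, not an existing implementation):

```typescript
// Minimal slice of a STAC Item (a GeoJSON Feature) — just the fields
// a summary prompt is likely to need.
interface StacItem {
  id: string;
  collection?: string;
  bbox?: number[];
  properties: { datetime?: string | null; [key: string]: unknown };
}

interface StacItemCollection {
  features: StacItem[];
}

// Condense a STAC search response to a compact array of rows.
// Dropping geometries, asset links, and the rest of the properties is
// what keeps the payload well under the model's token limit.
function condenseStacItems(fc: StacItemCollection) {
  return fc.features.map((f) => ({
    id: f.id,
    collection: f.collection ?? null,
    datetime: f.properties.datetime ?? null,
    bbox: f.bbox ?? null,
  }));
}
```

This condensed array, serialized as JSON or CSV, is what the frontend would POST to the backend's `/summarize` endpoint.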
@geohacker let's chat - I think this structure would be really helpful going forward, and I don't think it should be too big a change:

- Backend returns just the STAC query parameters instead of the full JSON
- Frontend takes the query parameters and calls STAC itself
That would be the immediate change. If this makes sense to you, I can ticket out the summarization feature separately. I think it will be nice to also have a natural-language summary of the results.