
Cancellation of Ollama generation #18

Closed
erhant opened this issue May 10, 2024 · 0 comments · Fixed by #51
Labels: bug (Something isn't working), enhancement (New feature or request)

Comments

@erhant (Member)

erhant commented May 10, 2024

We should add a tokio::select! right at https://github.com/firstbatchxyz/dkn-compute-node/blob/master/src/compute/ollama.rs#L117 so that if the process is terminated while a generation is in progress, that call is cancelled and we return gracefully.

If Ollama is running in a container and gets killed anyway, the call terminates with code 500, but we could handle this more gracefully.

Our Ollama wrapper should hold the cancellation token in the struct itself, and this token should be used during setup as well.
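A minimal sketch of what this could look like, assuming a tokio_util::sync::CancellationToken stored in the wrapper struct; the OllamaWrapper, generate_cancellable, and call_ollama names are illustrative placeholders, not the actual dkn-compute-node types:

```rust
use tokio_util::sync::CancellationToken;

/// Hypothetical wrapper around an Ollama client; field and method names
/// are illustrative, not the real dkn-compute-node types.
struct OllamaWrapper {
    cancellation: CancellationToken,
    // client: ollama_rs::Ollama, // actual client omitted for brevity
}

impl OllamaWrapper {
    /// Runs a generation, but returns early if the cancellation token fires,
    /// e.g. when the node receives a shutdown signal.
    async fn generate_cancellable(&self, prompt: String) -> Option<String> {
        tokio::select! {
            // Cancellation requested (e.g. a shutdown handler called `cancel()`).
            _ = self.cancellation.cancelled() => {
                eprintln!("Ollama generation cancelled, shutting down gracefully");
                None
            }
            // The actual Ollama request; replace with the real client call.
            result = self.call_ollama(prompt) => Some(result),
        }
    }

    /// Placeholder for the real Ollama request.
    async fn call_ollama(&self, prompt: String) -> String {
        // ... issue the HTTP request to Ollama here ...
        prompt
    }
}

#[tokio::main]
async fn main() {
    let wrapper = OllamaWrapper { cancellation: CancellationToken::new() };
    // In the real node, a signal handler would call `wrapper.cancellation.cancel()`.
    let output = wrapper.generate_cancellable("hello".to_string()).await;
    println!("{:?}", output);
}
```

Holding the token in the struct (rather than passing it per call) also lets the setup path check for cancellation while pulling models or waiting for the Ollama server to come up.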

@erhant erhant added bug Something isn't working enhancement New feature or request labels May 10, 2024
@erhant erhant mentioned this issue Jun 24, 2024