[CHORE]: Increase Ollama timeout #1585

Closed
Nathan8489 opened this issue May 31, 2024 · 4 comments
Assignees
Labels
enhancement New feature or request

Comments

@Nathan8489

How are you running AnythingLLM?

Docker (local)

What happened?

os: linux
version: mintplexlabs/anythingllm@sha256:1d994f027b5519d4bc5e1299892e7d0be1405308f10d0350ecefc8e717d3154f (latest version before the lancedb dep bump)

The problem is in server/utils/EmbeddingEngines/ollama/index.js.

The embedChunks function sends all text chunks over a single connection. If the fetches do not all complete within 5 minutes, something (possibly Node.js) decides the connection has timed out and terminates it, so the embedding fails.

I worked around this temporarily by awaiting each chunk's fetch individually (roughly 40 seconds per chunk at 8192 tokens); see the sketch below.
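
A minimal sketch of that workaround (not the project's actual embedChunks code), assuming Ollama's /api/embeddings endpoint:

```js
// Minimal sketch of the workaround (not the project's actual embedChunks code):
// await each chunk's embedding request one at a time so no single connection
// has to stay open past the ~5 minute cap.
async function embedChunksSequentially(baseUrl, model, textChunks) {
  const embeddings = [];
  for (const chunk of textChunks) {
    const res = await fetch(`${baseUrl}/api/embeddings`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ model, prompt: chunk }),
    });
    if (!res.ok) throw new Error(`Ollama embedding request failed: ${res.status}`);
    const { embedding } = await res.json();
    embeddings.push(embedding);
  }
  return embeddings;
}
```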

Are there known steps to reproduce?

  • Get a really slow Ollama service (CPU).
  • Set the embedding provider to Ollama.
  • Try to move a large document to the workspace.
  • Wait for the embedding process...
  • It fails exactly at the 5-minute mark.
@Nathan8489 Nathan8489 added the possible bug Bug was reported but is not confirmed or is unable to be replicated. label May 31, 2024
@timothycarambat timothycarambat changed the title [BUG]: if Ollama embed cannot complete in 5 minutes. Ollama Failed to embed:[undefined]: undefined [CHORE]: Increase Ollama timeout Jun 1, 2024
@timothycarambat timothycarambat added enhancement New feature or request and removed possible bug Bug was reported but is not confirmed or is unable to be replicated. labels Jun 1, 2024
@mingLvft

mingLvft commented Jun 5, 2024

Has the problem been solved?

@timothycarambat
Member

There is no PR and the issue is still open, so no.

@travisgu

travisgu commented Jun 17, 2024

I encountered the same issue. The Ollama service returned a 500 error at exactly 5 minutes because the HTTP request was cancelled, so it can only embed small PDF files.

Is there a config file where I can increase the timeout limit?
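
If the 5-minute cutoff comes from Node's built-in fetch (undici), whose headersTimeout and bodyTimeout both default to 300 seconds, a per-request workaround might look like the sketch below. This is an assumption about the cause, not an existing AnythingLLM config option:

```js
// Assumption: the 5-minute cap is undici's default headersTimeout/bodyTimeout (300s).
// Sketch only; not an existing AnythingLLM configuration option.
const { fetch, Agent } = require("undici");

const longTimeoutAgent = new Agent({
  headersTimeout: 0, // 0 disables the 300s headers timeout
  bodyTimeout: 0,    // 0 disables the 300s body timeout
});

async function embedWithLongTimeout(baseUrl, model, prompt) {
  const res = await fetch(`${baseUrl}/api/embeddings`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, prompt }),
    dispatcher: longTimeoutAgent, // route this request through the no-timeout agent
  });
  if (!res.ok) throw new Error(`Ollama embedding request failed: ${res.status}`);
  return (await res.json()).embedding;
}
```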

@timothycarambat
Member

Moving conversation to #1585
