
[BUG]: Linux Desktop App - Ollama local - Could not respond to message error regardless of LLM Preferences successful setup #1802

Closed
liberateyourtech opened this issue Jul 2, 2024 · 3 comments
Labels: Desktop, OS: Linux, possible bug (Bug was reported but is not confirmed or is unable to be replicated)

Comments

@liberateyourtech

How are you running AnythingLLM?

AnythingLLM desktop app

What happened?

Loving this app, thank you for the great work! But so far I have not been able to get the Linux client to work.

Installed via the latest script without errors and launched from the command line, configured local Ollama with URL, model, and token successfully, and created a workspace OK. As soon as I send "hello" I immediately get this error:

Could not respond to message.
An error occurred while streaming response. network error

I am unable to access settings again after the above error; the app hangs.

My setup:
Running Ollama locally on Ubuntu 22 with ./ollama serve, with flags for LAN access enabled; the browser confirms Ollama is running on

http://127.0.0.1:11434/
and
http://192.168.100.144:11434/ (IP of same machine on network)

I am able to connect and chat via the Windows AnythingLLM desktop app over LAN.
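For reference, the LAN access flags mentioned above amount to binding Ollama to all interfaces. A minimal sketch, assuming the standard OLLAMA_HOST environment variable (my exact invocation may differ):

# bind Ollama to all interfaces so other machines on the LAN can reach it
OLLAMA_HOST=0.0.0.0:11434 ./ollama serve

# verify from another machine on the network
curl http://192.168.100.144:11434/api/tags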

Console output from AnythingLLM:

libre@AI-ASR:/$ ./home/libre/AnythingLLMDesktop/start
[Preferences] preference config stored at /home/libre/.config/anythingllm-desktop/config.json
[44716:0702/163258.555428:ERROR:object_proxy.cc(590)] Failed to call method: org.freedesktop.portal.Settings.Read: object_path= /org/freedesktop/portal/desktop: org.freedesktop.DBus.Error.ServiceUnknown: The name org.freedesktop.portal.Desktop was not provided by any .service files
Prisma schema loaded from prisma/schema.prisma
Datasource "db": SQLite database "anythingllm.db" at "file:/home/libre/.config/anythingllm-desktop/storage/anythingllm.db"

Already in sync, no schema change or pending migration was found.

✔ Generated Prisma Client (v5.3.1) to ./node_modules/@prisma/client in 347ms

Prisma schema loaded from prisma/schema.prisma

✔ Generated Prisma Client (v5.3.1) to ./node_modules/@prisma/client in 334ms

Start using Prisma Client in Node.js (See: https://pris.ly/d/client)

import { PrismaClient } from '@prisma/client'
const prisma = new PrismaClient()

or start using Prisma Client at the edge (See: https://pris.ly/d/accelerate)

import { PrismaClient } from '@prisma/client/edge'
const prisma = new PrismaClient()

See other ways of importing Prisma Client: http://pris.ly/d/importing-client

[OllamaProcessManager] Ollama will bind on port 38677 when booted.
[Preferences] Will load window with last know bounds.
[44750:0702/163306.278799:ERROR:gl_surface_presentation_helper.cc(260)] GetVSyncParametersIfAvailable() failed for 1 times!
[44750:0702/163319.458568:ERROR:gl_surface_presentation_helper.cc(260)] GetVSyncParametersIfAvailable() failed for 2 times!
[44750:0702/163327.965429:ERROR:gl_surface_presentation_helper.cc(260)] GetVSyncParametersIfAvailable() failed for 3 times!

Console output from running Ollama:

llama_new_context_with_model: graph splits = 1
INFO [main] model loaded | tid="130192473569152" timestamp=1719925470
time=2024-07-02T14:04:31.064+01:00 level=INFO source=server.go:572 msg="llama runner started in 1.26 seconds"
[GIN] 2024/07/02 - 14:04:31 | 200 | 4.286584256s | 127.0.0.1 | POST "/api/chat"
[GIN] 2024/07/02 - 14:08:21 | 200 | 2m25s | 127.0.0.1 | POST "/api/chat"
[GIN] 2024/07/02 - 14:09:54 | 200 | 1.148565ms | 192.168.100.165 | GET "/api/tags"
[GIN] 2024/07/02 - 14:12:25 | 200 | 1.168343ms | 192.168.100.165 | GET "/api/tags"
[GIN] 2024/07/02 - 14:13:15 | 200 | 1.195207ms | 192.168.100.165 | GET "/api/tags"
[GIN] 2024/07/02 - 14:18:01 | 200 | 1.441114ms | 192.168.100.165 | GET "/api/tags"
[GIN] 2024/07/02 - 14:18:30 | 200 | 1.703561ms | 192.168.100.165 | GET "/api/tags"
[GIN] 2024/07/02 - 14:21:43 | 200 | 1.503829ms | 192.168.100.165 | GET "/api/tags"
[GIN] 2024/07/02 - 14:34:29 | 200 | 1.199092ms | 127.0.0.1 | GET "/api/tags"
[GIN] 2024/07/02 - 14:34:29 | 200 | 1.121646ms | 127.0.0.1 | GET "/api/tags"
[GIN] 2024/07/02 - 15:32:21 | 200 | 1.112652ms | 127.0.0.1 | GET "/api/tags"
[GIN] 2024/07/02 - 15:33:37 | 200 | 99.496µs | 192.168.100.144 | GET "/"
[GIN] 2024/07/02 - 15:33:48 | 200 | 500.678µs | 127.0.0.1 | GET "/api/tags"
[GIN] 2024/07/02 - 15:33:59 | 200 | 1.45739ms | 192.168.100.144 | GET "/api/tags"
[GIN] 2024/07/02 - 16:17:02 | 200 | 1.41016ms | 127.0.0.1 | GET "/api/tags"
[GIN] 2024/07/02 - 16:18:08 | 200 | 1.10151ms | 127.0.0.1 | GET "/api/tags"
[GIN] 2024/07/02 - 16:33:09 | 200 | 1.062379ms | 127.0.0.1 | GET "/api/tags"

It looks like Ollama is only receiving GET /api/tags requests.

Is this line from the desktop app correct? [OllamaProcessManager] Ollama will bind on port 38677 when booted.
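One way to rule out the model itself is to hit the chat endpoint directly. A sketch, assuming the standard Ollama chat API; the model name below is a placeholder for whichever model I have pulled:

# should stream a response and produce a POST "/api/chat" line in the Ollama console
curl http://127.0.0.1:11434/api/chat -d '{
  "model": "llama3",
  "messages": [{ "role": "user", "content": "hello" }]
}'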

I have tried:

Quitting and relaunching the app
Quitting and relaunching, then resetting LLM Preferences successfully
Deleting the folder in .config and setting up again

I also tried installing the same Linux desktop app on another machine on the network; same errors.

Thank you!

Are there known steps to reproduce?

No response

@liberateyourtech added the "possible bug" label (Bug was reported but is not confirmed or is unable to be replicated) on Jul 2, 2024
@timothycarambat
Member

timothycarambat commented Jul 2, 2024

So the built-in Ollama (AnythingLLM LLM) should not run on Linux, and that may be the root cause. Are you using the AnythingLLM LLM or Ollama as the LLM provider?

The Ollama LLM provider option is supposed to connect to a localhost Ollama that you are running outside the context of the AnythingLLM process, which sounds like your setup. The log line

[OllamaProcessManager] Ollama will bind on port 38677 when booted.

nonetheless indicates that it did try to boot the internal Ollama, which would fail on Linux.
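A quick way to see which Ollama the app is actually reaching (a sketch; 11434 is your externally running Ollama, 38677 is the port the internal instance tried to bind per your log):

# your own Ollama - should list your pulled models
curl http://127.0.0.1:11434/api/tags

# the internal instance the desktop app tried to boot - expected to fail on Linux
curl http://127.0.0.1:38677/api/tags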

@liberateyourtech
Author

I am using the Ollama LLM provider, not the AnythingLLM LLM (screenshot below)

[screenshot: LLM Preferences showing Ollama selected as the LLM provider]

Normally I would see POST messages in the Ollama console when successfully chatting with Ollama, as below

[screenshot: Ollama console showing POST "/api/chat" entries from a successful chat]

Thank you!

@timothycarambat
Member

And after sending a chat, what do the logs show in AnythingLLM, and does a POST /api/chat appear in Ollama as well?
