
do not depend on libraries installed with brew on macOS: include libraries in binary #2562

Closed
qdrddr opened this issue Jun 13, 2024 · 10 comments · Fixed by #2567
Labels
enhancement New feature or request roadmap

Comments

@qdrddr

qdrddr commented Jun 13, 2024

Most Macs nowadays are quite powerful in comparison to PCs: some have up to a 60-core GPU and up to 192GB of RAM shared between GPU and CPU, making them ideal for LocalAI inference.

Also consider this: Apple is developing Apple Silicon designed specifically for server farms dedicated to AI processing under the internal name Project ACDC. The company aims to optimize AI applications within its data centers for future versions of its platforms.

Is your feature request related to a problem? Please describe.
The problem with running the LocalAI binary on macOS is that the library versions LocalAI requires are constantly misaligned with the libraries installed on the Mac via brew, especially since brew keeps updating them; I hope that part can be improved in LocalAI. So in practice I cannot run LocalAI on my Mac as a binary, because it fails most of the time due to misaligned library versions, and I cannot run it in Docker either, since Metal is not supported there.

Describe the solution you'd like
A macOS binary that includes all dependencies and libraries and does not depend on libraries installed with brew.

Describe alternatives you've considered
Trying to fix dependencies on macOS is typically too complicated for most users, who end up looking for other inference engines that can run natively.

Additional context

For example, there is a problem with libprotobuf: my brew installed version 27.0, while the binary and the source require 26.1 or 27.1.
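
For anyone debugging this locally, a hedged way to see which dylib versions a binary actually expects is otool -L (it ships with the Xcode command line tools; the binary path below is a placeholder, and the same check works on the extracted backend binaries under /tmp/localai/backend_data):

otool -L /path/to/local-ai-or-backend-binary | grep -E 'protobuf|absl'
ls /opt/homebrew/opt/protobuf/lib/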

@qdrddr qdrddr added the enhancement New feature or request label Jun 13, 2024
@dave-gray101
Collaborator

Personally, I like using brew dependencies. It keeps our build times a ton shorter, and provides a forcing function that keeps us up to date.

I'm open to other opinions, but personally I'd consider the bug here to be that we need to bump the golang protobuf version again - that has fixed this problem in the past.
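
For reference, a hedged sketch of what such a bump would look like on the Go side (the module path is assumed to be the standard google.golang.org/protobuf; the exact version to pin depends on the release being targeted):

go get google.golang.org/protobuf@latest
go mod tidy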

@dave-gray101
Collaborator

I've also noticed that there's an open issue to start distributing LocalAI via brew itself - it may be worth wading into that and seeing if I can get us up and running on that front as well.

@qdrddr
Author

qdrddr commented Jun 13, 2024

The issue arises from the fact that the binary on macOS cannot be dockerized. I understand that the brew approach makes it easier for the LocalAI team to build, but please also consider the user experience: without containerization the app ends up unreliable compared to other inference alternatives.

I hope some middle ground can be found to improve the user experience without making the LocalAI team's life difficult. @dave-gray101

@mudler
Owner

mudler commented Jun 13, 2024

@qdrddr I don't have a Mac, but if you can tell me what breaks and which libraries are required, we can include them as part of the binary - that would speed things up, instead of me trying to figure it out from GHA runners during the build.

I've already started working on having the LocalAI binary carry over libraries, to solve incompatibility issues with arm64 and GPU acceleration, so the mechanism for shipping the libraries is already in place here:

// If there is a lib directory, set LD_LIBRARY_PATH to include it

That means that during macOS builds we just need to carry over the shared libraries into the backend_assets/libs folder before compiling the Go binary, and they would then be picked up automatically.
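
As an illustration only (the folder name follows the comment above, and the set of dylibs is an assumption based on the errors reported further down in this thread), the carry-over step during a macOS build could look roughly like:

mkdir -p backend-assets/libs
cp /opt/homebrew/opt/protobuf/lib/libprotobuf*.dylib backend-assets/libs/
cp /opt/homebrew/opt/abseil/lib/libabsl_*.dylib backend-assets/libs/

The launcher above would then add that directory to the library search path before spawning the gRPC backends.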

@mudler mudler added the roadmap label Jun 13, 2024
@qdrddr
Author

qdrddr commented Jun 13, 2024

When I try running the binary

curl -sL https://github.com/mudler/LocalAI/releases/download/v2.16.0/local-ai-Darwin-arm64 -o ~/local-ai-Darwin-arm64

./local-ai-Darwin-arm64 --models-path=~/models/ --debug=true --address 127.0.0.1:8090 --parallel-requests true

The app starts. But when I make a call to this model

curl -X POST "http://localhost:8090/v1/chat/completions" \
     -H "Content-Type: application/json" \
     -H "Authorization: Bearer YOUR_API_KEY" \
     -d '{
           "model": "llama3-8b-instruct",
           "messages": [
               {
                   "role": "user", 
                   "content": "Write a simple python code that creates a webserwer on port 8051 and outputs Once upon a time, there was a cat."
                }
           ]
         }'

I'm getting an error message:

12:28PM DBG Loading from the following backends (in order): [llama-cpp llama-ggml gpt4all llama-cpp-fallback rwkv whisper default.metallib huggingface bert-embeddings]
12:28PM INF Trying to load the model 'Meta-Llama-3-8B-Instruct.Q4_0.gguf' with the backend '[llama-cpp llama-ggml gpt4all llama-cpp-fallback rwkv whisper default.metallib huggingface bert-embeddings]'
12:28PM INF [llama-cpp] Attempting to load
12:28PM INF Loading model 'Meta-Llama-3-8B-Instruct.Q4_0.gguf' with backend llama-cpp
12:28PM DBG Loading model in memory from file: ~/models/localai_models/Meta-Llama-3-8B-Instruct.Q4_0.gguf
12:28PM DBG Loading Model Meta-Llama-3-8B-Instruct.Q4_0.gguf with gRPC (file: ~/models/localai_models/Meta-Llama-3-8B-Instruct.Q4_0.gguf) (backend: llama-cpp): {backendString:llama-cpp model:Meta-Llama-3-8B-Instruct.Q4_0.gguf threads:4 assetDir:/tmp/localai/backend_data context:{emptyCtx:{}} gRPCOptions:0x1400038c008 externalBackends:map[] grpcAttempts:20 grpcAttemptsDelay:2 singleActiveBackend:false parallelRequests:false}
12:28PM INF [llama-cpp] attempting to load with fallback variant
12:28PM DBG Loading GRPC Process: /tmp/localai/backend_data/backend-assets/grpc/llama-cpp-fallback
12:28PM DBG GRPC Service for Meta-Llama-3-8B-Instruct.Q4_0.gguf will be running at: '127.0.0.1:59047'
12:28PM DBG GRPC Service state dir: /var/folders/5h/qvzp0mfx2jd1rsxg4f9z91880000gn/T/go-processmanager2332558324
12:28PM DBG GRPC Service Started
12:28PM DBG GRPC(Meta-Llama-3-8B-Instruct.Q4_0.gguf-127.0.0.1:59047): stderr dyld[40906]: Library not loaded: /opt/homebrew/opt/protobuf/lib/libprotobuf.26.1.0.dylib
12:28PM DBG GRPC(Meta-Llama-3-8B-Instruct.Q4_0.gguf-127.0.0.1:59047): stderr   Referenced from: <2CDD1D3F-CC9B-318B-B3F8-5737F43A0E19> /private/tmp/localai/backend_data/backend-assets/grpc/llama-cpp-fallback
12:28PM DBG GRPC(Meta-Llama-3-8B-Instruct.Q4_0.gguf-127.0.0.1:59047): stderr   Reason: tried: '/opt/homebrew/opt/protobuf/lib/libprotobuf.26.1.0.dylib' (no such file), '/System/Volumes/Preboot/Cryptexes/OS/opt/homebrew/opt/protobuf/lib/libprotobuf.26.1.0.dylib' (no such file), '/opt/homebrew/opt/protobuf/lib/libprotobuf.26.1.0.dylib' (no such file), '/opt/homebrew/Cellar/protobuf/27.0/lib/libprotobuf.26.1.0.dylib' (no such file), '/System/Volumes/Preboot/Cryptexes/OS/opt/homebrew/Cellar/protobuf/27.0/lib/libprotobuf.26.1.0.dylib' (no such file), '/opt/homebrew/Cellar/protobuf/27.0/lib/libprotobuf.26.1.0.dylib' (no such file)
ls /opt/homebrew/Cellar/protobuf/27.0/lib
cmake                         libprotobuf-lite.27.0.0.dylib libprotobuf-lite.dylib        libprotobuf.27.0.0.dylib      libprotobuf.dylib             libprotoc.27.0.0.dylib        libprotoc.dylib               libupb.a                      libutf8_range.a               libutf8_validity.a            pkgconfig

protoc --version
libprotoc 27.0

brew info protobuf
==> protobuf: stable 27.0 (bottled)
Protocol buffers (Google's data interchange format)
https://protobuf.dev/
Installed
/opt/homebrew/Cellar/protobuf/27.0 (430 files, 14.6MB) *
  Poured from bottle using the formulae.brew.sh API on 2024-06-13 at 10:57:33
From: https://github.com/Homebrew/homebrew-core/blob/HEAD/Formula/p/protobuf.rb
License: BSD-3-Clause
==> Dependencies
Build: cmake ✔, googletest ✘
Required: abseil ✔
==> Caveats
Emacs Lisp files have been installed to:
  /opt/homebrew/share/emacs/site-lisp/protobuf
==> Analytics
install: 57,376 (30 days), 164,241 (90 days), 694,322 (365 days)
install-on-request: 25,605 (30 days), 77,760 (90 days), 324,072 (365 days)
build-error: 241 (30 days)

So I can run neither the precompiled release binary nor one built from source. And the most annoying part is that it was working before; what changed is probably that I updated brew.

So technically it's my problem, but I'm not the only one: a couple of developers I'm working with are having exactly the same problem on their Macs. See also LocalAI/issues/2397.

And I was looking at how to improve the user experience on macOS.

my versions

brew install abseil cmake go grpc protobuf protoc-gen-go protoc-gen-go-grpc python wget 
brew info abseil cmake go grpc protobuf protoc-gen-go protoc-gen-go-grpc python wget
Warning: Treating cmake as a formula. For the cask, use homebrew/cask/cmake or specify the `--cask` flag.
==> abseil: stable 20240116.2 (bottled), HEAD
C++ Common Libraries
https://abseil.io
Installed
/opt/homebrew/Cellar/abseil/20240116.2 (748 files, 10.9MB) *
  Poured from bottle using the formulae.brew.sh API on 2024-06-10 at 19:35:40
From: https://github.com/Homebrew/homebrew-core/blob/HEAD/Formula/a/abseil.rb
License: Apache-2.0
==> Dependencies
Build: cmake ✔, googletest ✘
==> Options
--HEAD
	Install HEAD version
==> Analytics
install: 45,189 (30 days), 141,009 (90 days), 460,866 (365 days)
install-on-request: 654 (30 days), 3,364 (90 days), 8,583 (365 days)
build-error: 126 (30 days)

==> cmake: stable 3.29.5 (bottled), HEAD
Cross-platform make
https://www.cmake.org/
Installed
/opt/homebrew/Cellar/cmake/3.29.5 (3,385 files, 55.5MB) *
  Poured from bottle using the formulae.brew.sh API on 2024-06-10 at 19:35:48
From: https://github.com/Homebrew/homebrew-core/blob/HEAD/Formula/c/cmake.rb
License: BSD-3-Clause
==> Options
--HEAD
	Install HEAD version
==> Caveats
To install the CMake documentation, run:
  brew install cmake-docs

Emacs Lisp files have been installed to:
  /opt/homebrew/share/emacs/site-lisp/cmake
==> Analytics
install: 135,266 (30 days), 417,197 (90 days), 1,603,918 (365 days)
install-on-request: 104,015 (30 days), 326,253 (90 days), 1,257,466 (365 days)
build-error: 457 (30 days)

==> go: stable 1.22.4 (bottled), HEAD
Open source programming language to build simple/reliable/efficient software
https://go.dev/
Installed
/opt/homebrew/Cellar/go/1.22.4 (12,859 files, 250.8MB) *
  Poured from bottle using the formulae.brew.sh API on 2024-06-10 at 19:35:57
From: https://github.com/Homebrew/homebrew-core/blob/HEAD/Formula/g/go.rb
License: BSD-3-Clause
==> Options
--HEAD
	Install HEAD version
==> Analytics
install: 91,898 (30 days), 260,369 (90 days), 1,070,991 (365 days)
install-on-request: 65,527 (30 days), 187,150 (90 days), 787,004 (365 days)
build-error: 115 (30 days)

==> grpc: stable 1.62.2 (bottled), HEAD
Next generation open source RPC library and framework
https://grpc.io/
Installed
/opt/homebrew/Cellar/grpc/1.62.2_1 (373 files, 25.4MB) *
  Poured from bottle using the formulae.brew.sh API on 2024-06-13 at 12:14:48
From: https://github.com/Homebrew/homebrew-core/blob/HEAD/Formula/g/grpc.rb
License: Apache-2.0
==> Dependencies
Build: autoconf ✔, automake ✔, cmake ✔, libtool ✔
Required: abseil ✔, c-ares ✔, openssl@3 ✔, protobuf ✔, re2 ✔
==> Options
--HEAD
	Install HEAD version
==> Analytics
install: 12,857 (30 days), 40,491 (90 days), 173,663 (365 days)
install-on-request: 6,841 (30 days), 23,263 (90 days), 91,924 (365 days)
build-error: 13 (30 days)

==> protobuf: stable 27.0 (bottled)
Protocol buffers (Google's data interchange format)
https://protobuf.dev/
Installed
/opt/homebrew/Cellar/protobuf/27.0 (430 files, 14.6MB) *
  Poured from bottle using the formulae.brew.sh API on 2024-06-13 at 10:57:33
From: https://github.com/Homebrew/homebrew-core/blob/HEAD/Formula/p/protobuf.rb
License: BSD-3-Clause
==> Dependencies
Build: cmake ✔, googletest ✘
Required: abseil ✔
==> Caveats
Emacs Lisp files have been installed to:
  /opt/homebrew/share/emacs/site-lisp/protobuf
==> Analytics
install: 57,361 (30 days), 164,255 (90 days), 694,443 (365 days)
install-on-request: 25,601 (30 days), 77,764 (90 days), 324,117 (365 days)
build-error: 240 (30 days)

==> protoc-gen-go: stable 1.34.2 (bottled), HEAD
Go support for Google's protocol buffers
https://github.com/protocolbuffers/protobuf-go
Installed
/opt/homebrew/Cellar/protoc-gen-go/1.34.2 (6 files, 4.5MB) *
  Poured from bottle using the formulae.brew.sh API on 2024-06-13 at 12:14:53
From: https://github.com/Homebrew/homebrew-core/blob/HEAD/Formula/p/protoc-gen-go.rb
License: BSD-3-Clause
==> Dependencies
Build: go ✔
Required: protobuf ✔
==> Options
--HEAD
	Install HEAD version
==> Analytics
install: 1,547 (30 days), 4,473 (90 days), 13,584 (365 days)
install-on-request: 1,493 (30 days), 4,413 (90 days), 13,489 (365 days)
build-error: 0 (30 days)

==> protoc-gen-go-grpc: stable 1.4.0 (bottled)
Protoc plugin that generates code for gRPC-Go clients
https://github.com/grpc/grpc-go
Installed
/opt/homebrew/Cellar/protoc-gen-go-grpc/1.4.0 (8 files, 6.3MB) *
  Poured from bottle using the formulae.brew.sh API on 2024-06-13 at 12:14:54
From: https://github.com/Homebrew/homebrew-core/blob/HEAD/Formula/p/protoc-gen-go-grpc.rb
License: Apache-2.0
==> Dependencies
Build: go ✔
Required: protobuf ✔
==> Analytics
install: 977 (30 days), 1,656 (90 days), 4,830 (365 days)
install-on-request: 927 (30 days), 1,597 (90 days), 4,740 (365 days)
build-error: 7 (30 days)

==> python@3.12: stable 3.12.3 (bottled)
Interpreted, interactive, object-oriented programming language
https://www.python.org/
Installed
/opt/homebrew/Cellar/python@3.12/3.12.3 (3,272 files, 65.7MB) *
  Poured from bottle using the formulae.brew.sh API on 2024-06-10 at 19:36:38
From: https://github.com/Homebrew/homebrew-core/blob/HEAD/Formula/p/python@3.12.rb
License: Python-2.0
==> Dependencies
Build: pkg-config ✔
Required: mpdecimal ✔, openssl@3 ✔, sqlite ✔, xz ✔
==> Caveats
Python has been installed as
  /opt/homebrew/bin/python3

Unversioned symlinks `python`, `python-config`, `pip` etc. pointing to
`python3`, `python3-config`, `pip3` etc., respectively, have been installed into
  /opt/homebrew/opt/python@3.12/libexec/bin

See: https://docs.brew.sh/Homebrew-and-Python
==> Analytics
install: 267,396 (30 days), 823,812 (90 days), 1,676,250 (365 days)
install-on-request: 82,613 (30 days), 253,162 (90 days), 397,547 (365 days)
build-error: 1,471 (30 days)

==> wget: stable 1.24.5 (bottled), HEAD
Internet file retriever
https://www.gnu.org/software/wget/
Installed
/opt/homebrew/Cellar/wget/1.24.5 (92 files, 4.5MB) *
  Poured from bottle using the formulae.brew.sh API on 2024-06-10 at 19:36:58
From: https://github.com/Homebrew/homebrew-core/blob/HEAD/Formula/w/wget.rb
License: GPL-3.0-or-later
==> Dependencies
Build: pkg-config ✔
Required: libidn2 ✔, openssl@3 ✔
==> Options
--HEAD
	Install HEAD version
==> Analytics
install: 102,900 (30 days), 362,113 (90 days), 998,447 (365 days)
install-on-request: 102,618 (30 days), 361,380 (90 days), 996,569 (365 days)
build-error: 18 (30 days)

@qdrddr
Author

qdrddr commented Jun 13, 2024

And when I try to run make build

make BUILD_GRPC_FOR_BACKEND_LLAMA=true build
conda activate localai12
conda info                                                                                                                                                                                                                                                                                                                
     active environment : localai12
    active env location : /opt/anaconda3/envs/localai12
            shell level : 12
       user config file : ~/.condarc
 populated config files : ~/.condarc
          conda version : 24.3.0
    conda-build version : 24.1.2
         python version : 3.11.7.final.0
                 solver : libmamba (default)
       virtual packages : __archspec=1=m1
                          __conda=24.3.0=0
                          __osx=14.5=0
                          __unix=0=0
       base environment : /opt/anaconda3  (writable)
      conda av data dir : /opt/anaconda3/etc/conda
  conda av metadata url : None
           channel URLs : https://repo.anaconda.com/pkgs/main/osx-arm64
                          https://repo.anaconda.com/pkgs/main/noarch
                          https://repo.anaconda.com/pkgs/r/osx-arm64
                          https://repo.anaconda.com/pkgs/r/noarch
          package cache : /opt/anaconda3/pkgs
                          ~/.conda/pkgs
       envs directories : /opt/anaconda3/envs
                          ~/.conda/envs
               platform : osx-arm64
             user-agent : conda/24.3.0 requests/2.31.0 CPython/3.11.7 Darwin/23.5.0 OSX/14.5 solver/libmamba conda-libmamba-solver/24.1.0 libmambapy/1.5.6 aau/0.4.3 c/Svs6DeSOU7HTaJYUErQm9A s/ejX86YaR6OnB5ps87RF3iw e/-wjpz_V8KWev6FO4E_EUUQ
                UID:GID : 501:20
             netrc file : None
           offline mode : False
brew install abseil cmake go grpc protobuf protoc-gen-go protoc-gen-go-grpc python wget 
brew info abseil cmake go grpc protobuf protoc-gen-go protoc-gen-go-grpc python wget
(brew install / brew info output identical to my previous comment, omitted)
git clone https://github.com/mudler/LocalAI.git

pip install --user grpcio-tools                                                                                                                                                                                                                                                                                                                      
Requirement already satisfied: grpcio-tools in ~/.local/lib/python3.12/site-packages (1.64.1)
Requirement already satisfied: protobuf<6.0dev,>=5.26.1 in ~/.local/lib/python3.12/site-packages (from grpcio-tools) (5.27.1)
Requirement already satisfied: grpcio>=1.64.1 in ~/.local/lib/python3.12/site-packages (from grpcio-tools) (1.64.1)
Requirement already satisfied: setuptools in /opt/anaconda3/envs/localai12/lib/python3.12/site-packages (from grpcio-tools) (69.5.1)

When I try to build like this,

cd LocalAI
make BUILD_GRPC_FOR_BACKEND_LLAMA=true build

I get an error message; I believe it's about the libprotobuf.26.1.0.dylib:

[ 95%] Building CXX object examples/grpc-server/CMakeFiles/hw_grpc_proto.dir/backend.grpc.pb.cc.o
In file included from ~/git/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.grpc.pb.cc:5:
~/git/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.pb.h:13:2: error: "This file was generated by a newer version of protoc which is"
#error "This file was generated by a newer version of protoc which is"
 ^
~/git/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.pb.h:14:2: error: "incompatible with your Protocol Buffer headers. Please update"
#error "incompatible with your Protocol Buffer headers. Please update"
 ^
~/git/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.pb.h:15:2: error: "your headers."
#error "your headers."
 ^
In file included from ~/git/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.grpc.pb.cc:5:
In file included from ~/git/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.pb.h:24:
In file included from /opt/homebrew/include/google/protobuf/io/coded_stream.h:112:
In file included from /opt/homebrew/include/absl/strings/cord.h:80:
In file included from /opt/homebrew/include/absl/crc/internal/crc_cord_state.h:23:
In file included from /opt/homebrew/include/absl/crc/crc32c.h:32:
In file included from /opt/homebrew/include/absl/strings/str_format.h:84:
In file included from /opt/homebrew/include/absl/strings/internal/str_format/bind.h:29:
/opt/homebrew/include/absl/strings/internal/str_format/parser.h:225:11: warning: 'enable_if' is a clang extension [-Wgcc-compat]
          enable_if(str_format_internal::EnsureConstexpr(format),
          ^
/opt/homebrew/include/absl/strings/internal/str_format/parser.h:227:11: warning: 'enable_if' is a clang extension [-Wgcc-compat]
          enable_if(str_format_internal::ValidFormatImpl<C...>(format),
          ^
In file included from ~/git/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.grpc.pb.cc:5:
In file included from ~/git/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.pb.h:24:
In file included from /opt/homebrew/include/google/protobuf/io/coded_stream.h:112:
In file included from /opt/homebrew/include/absl/strings/cord.h:80:
In file included from /opt/homebrew/include/absl/crc/internal/crc_cord_state.h:23:
In file included from /opt/homebrew/include/absl/crc/crc32c.h:32:
In file included from /opt/homebrew/include/absl/strings/str_format.h:84:
/opt/homebrew/include/absl/strings/internal/str_format/bind.h:143:11: warning: 'enable_if' is a clang extension [-Wgcc-compat]
          enable_if(str_format_internal::EnsureConstexpr(s), "constexpr trap"),
          ^
/opt/homebrew/include/absl/strings/internal/str_format/bind.h:149:22: warning: 'enable_if' is a clang extension [-Wgcc-compat]
      __attribute__((enable_if(str_format_internal::EnsureConstexpr(s),
                     ^
/opt/homebrew/include/absl/strings/internal/str_format/bind.h:158:22: warning: 'enable_if' is a clang extension [-Wgcc-compat]
      __attribute__((enable_if(ValidFormatImpl<Args...>(s), "bad format trap")))
                     ^
/opt/homebrew/include/absl/strings/internal/str_format/bind.h:162:22: warning: 'enable_if' is a clang extension [-Wgcc-compat]
      __attribute__((enable_if(ValidFormatImpl<Args...>(s), "bad format trap")))
                     ^
In file included from ~/git/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.grpc.pb.cc:5:
In file included from ~/git/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.pb.h:27:
In file included from /opt/homebrew/include/google/protobuf/generated_message_bases.h:18:
In file included from /opt/homebrew/include/google/protobuf/message.h:106:
In file included from /opt/homebrew/include/google/protobuf/descriptor.h:45:
In file included from /opt/homebrew/include/absl/container/btree_map.h:57:
In file included from /opt/homebrew/include/absl/container/internal/btree.h:72:
/opt/homebrew/include/absl/types/compare.h:78:22: warning: 'enable_if' is a clang extension [-Wgcc-compat]
      __attribute__((enable_if(n == 0, "Only literal `0` is allowed."))) {}
                     ^
In file included from ~/git/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.grpc.pb.cc:5:
~/git/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.pb.h:287:3: error: unknown type name 'PROTOBUF_ATTRIBUTE_REINITIALIZES'
  PROTOBUF_ATTRIBUTE_REINITIALIZES void Clear() final;
  ^
~/git/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.pb.h:288:30: error: only virtual member functions can be marked 'final'
  bool IsInitialized() const final;
                             ^~~~~
~/git/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.pb.h:291:96: error: only virtual member functions can be marked 'final'
  const char* _InternalParse(const char* ptr, ::google::protobuf::internal::ParseContext* ctx) final;
                                                                                               ^~~~~
~/git/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.pb.h:294:29: error: only virtual member functions can be marked 'final'
  int GetCachedSize() const final { return _impl_._cached_size_.Get(); }
                            ^~~~~~
~/git/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.pb.h:299:38: error: only virtual member functions can be marked 'final'
  void SetCachedSize(int size) const final;
                                     ^~~~~
~/git/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.pb.h:314:52: error: only virtual member functions can be marked 'final'
  ::google::protobuf::Metadata GetMetadata() const final;
                                                   ^~~~~
~/git/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.pb.h:213:9: error: use of undeclared identifier 'GetOwningArena'
    if (GetOwningArena() == from.GetOwningArena()
        ^
~/git/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.pb.h:213:34: error: no member named 'GetOwningArena' in 'backend::RerankRequest'
    if (GetOwningArena() == from.GetOwningArena()
                            ~~~~ ^
~/git/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.pb.h:260:9: error: use of undeclared identifier 'GetOwningArena'
    if (GetOwningArena() == other->GetOwningArena()) {
        ^
~/git/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.pb.h:260:36: error: no member named 'GetOwningArena' in 'backend::RerankRequest'
    if (GetOwningArena() == other->GetOwningArena()) {
                            ~~~~~  ^
~/git/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.pb.h:269:17: error: use of undeclared identifier 'GetOwningArena'
    ABSL_DCHECK(GetOwningArena() == other->GetOwningArena());
                ^
~/git/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.pb.h:269:44: error: no member named 'GetOwningArena' in 'backend::RerankRequest'
    ABSL_DCHECK(GetOwningArena() == other->GetOwningArena());
                                    ~~~~~  ^
/opt/homebrew/include/absl/log/absl_check.h:47:34: note: expanded from macro 'ABSL_DCHECK'
  ABSL_LOG_INTERNAL_DCHECK_IMPL((condition), #condition)
                                 ^~~~~~~~~
/opt/homebrew/include/absl/log/internal/check_impl.h:43:41: note: expanded from macro 'ABSL_LOG_INTERNAL_DCHECK_IMPL'
  ABSL_LOG_INTERNAL_CHECK_IMPL(true || (condition), "true")
                                        ^~~~~~~~~
/opt/homebrew/include/absl/log/internal/check_impl.h:27:58: note: expanded from macro 'ABSL_LOG_INTERNAL_CHECK_IMPL'
                                    ABSL_PREDICT_FALSE(!(condition))) \
                                                         ^~~~~~~~~
/opt/homebrew/include/absl/base/optimization.h:178:59: note: expanded from macro 'ABSL_PREDICT_FALSE'
#define ABSL_PREDICT_FALSE(x) (__builtin_expect(false || (x), false))
                                                          ^
/opt/homebrew/include/absl/log/internal/conditions.h:172:40: note: expanded from macro 'ABSL_LOG_INTERNAL_CONDITION_FATAL'
  ABSL_LOG_INTERNAL_##type##_CONDITION(condition)
                                       ^~~~~~~~~
/opt/homebrew/include/absl/log/internal/conditions.h:68:7: note: expanded from macro 'ABSL_LOG_INTERNAL_STATELESS_CONDITION'
    !(condition) ? (void)0 : ::absl::log_internal::Voidify()&&
      ^~~~~~~~~
In file included from ~/git/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.grpc.pb.cc:5:
~/git/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.pb.h:276:12: error: use of undeclared identifier 'CreateMaybeMessage'
    return CreateMaybeMessage<RerankRequest>(arena);
           ^
~/git/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.pb.h:493:3: error: unknown type name 'PROTOBUF_ATTRIBUTE_REINITIALIZES'
  PROTOBUF_ATTRIBUTE_REINITIALIZES void Clear() final;
  ^
~/git/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.pb.h:494:30: error: only virtual member functions can be marked 'final'
  bool IsInitialized() const final;
                             ^~~~~
~/git/LocalAI/backend/cpp/llama-avx/llama.cpp/build/examples/grpc-server/backend.pb.h:497:96: error: only virtual member functions can be marked 'final'
  const char* _InternalParse(const char* ptr, ::google::protobuf::internal::ParseContext* ctx) final;
                                                                                               ^~~~~
fatal error: too many errors emitted, stopping now [-ferror-limit=]
7 warnings and 20 errors generated.
make[5]: *** [examples/grpc-server/CMakeFiles/hw_grpc_proto.dir/backend.grpc.pb.cc.o] Error 1
make[4]: *** [examples/grpc-server/CMakeFiles/hw_grpc_proto.dir/all] Error 2
make[3]: *** [all] Error 2
make[2]: *** [grpc-server] Error 2
make[1]: *** [build-llama-cpp-grpc-server] Error 2
make: *** [backend-assets/grpc/llama-cpp-avx] Error 2

So I can run neither the precompiled release binary nor one built from source. The problem is most likely that updating brew pulled in a newer version of the library. And the reason this bites on macOS is that I cannot fall back to Docker, since Metal is not supported there.
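
A hedged way to confirm that suspicion, i.e. that the code generator and the protobuf headers/libraries CMake finds under /opt/homebrew no longer match (assuming protoc and pkg-config both come from the same Homebrew prefix):

protoc --version
pkg-config --modversion protobuf
brew list --versions protobuf abseil grpc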

@mudler
Owner

mudler commented Jun 13, 2024

Libs seem to be bundled correctly in #2567 - I tried to test this by hijacking GHA actions and, while libprotobuf is now bundled, I found out there is still another lib needed (/opt/homebrew/opt/abseil/lib/libabsl_flags_parse.*).

I'm pretty sure I'll find other missing ones, so it will take a while to get there and include all the needed libraries - but it would be nice if someone can confirm things work after that PR gets in.

@kastakhov

When I set the following build variables, I was able to avoid the error from the log above (source from tag v2.16.0)...

make build BUILD_GRPC_FOR_BACKEND_LLAMA=true BUILD_TYPE=metal CMAKE_ARGS="-DLLAMA_F16C=OFF -DLLAMA_AVX512=OFF -DLLAMA_AVX2=OFF -DLLAMA_AVX=OFF -DLLAMA_FMA=OFF -DLLAMA_METAL_EMBED_LIBRARY=ON -DLLAMA_METAL=on"

However, it looks like there is some issue with the default.metallib compilation/building/copying; llama.cpp didn't handle this file correctly.

[100%] Built target q8dot
cp llama.cpp/build/bin/grpc-server .
cp -rfv backend/cpp/llama-fallback/grpc-server backend-assets/grpc/llama-cpp-fallback
backend/cpp/llama-fallback/grpc-server -> backend-assets/grpc/llama-cpp-fallback
cp backend/cpp/llama-fallback/llama.cpp/build/bin/default.metallib backend-assets/grpc/
cp: backend/cpp/llama-fallback/llama.cpp/build/bin/default.metallib: No such file or directory
make: *** [backend-assets/grpc/llama-cpp-fallback] Error 1
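
If it helps, a hedged guess: with -DLLAMA_METAL_EMBED_LIBRARY=ON the Metal shader library gets embedded into the binary itself, so a standalone default.metallib may simply never be produced by that build. If the Makefile still insists on copying one, it can in principle be compiled by hand from llama.cpp's Metal shader source (paths below are assumptions):

xcrun -sdk macosx metal -c llama.cpp/ggml-metal.metal -o ggml-metal.air
xcrun -sdk macosx metallib ggml-metal.air -o default.metallib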

@mudler
Owner

mudler commented Jun 14, 2024

I tested #2567 on GitHub Actions workers and it seemed to work fine - it took a while to get all the libs, but eventually it is working as expected.

The next release is going to be tagged very soon, by the end of the week, so you can expect a pre-compiled binary shortly.

@qdrddr
Author

qdrddr commented Jun 18, 2024

Just to remember:
Set the LD_LIBRARY_PATH env variable.
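
A hedged example of doing that by hand, assuming the bundled dylibs were placed in ~/local-ai-libs (note that macOS's dyld itself reads DYLD_FALLBACK_LIBRARY_PATH rather than LD_LIBRARY_PATH, and SIP may strip DYLD_* variables for protected binaries):

export LD_LIBRARY_PATH=~/local-ai-libs
export DYLD_FALLBACK_LIBRARY_PATH=~/local-ai-libs
./local-ai-Darwin-arm64 --models-path=~/models/ --address 127.0.0.1:8090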
