
fix(worker): use dynaload for single binaries #2620

Merged: 1 commit merged into master on Jun 22, 2024

Conversation

@mudler (Owner) commented Jun 21, 2024

Description

This PR fixes #2609

We didn't cover starting workers with the shared libraries shipped within the single binary. Now the main binary should start the worker binaries in the same way as the other gRPC backends.
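For illustration, a minimal Go sketch of the idea: the main process launches the worker binary with the dynamic loader pointed at the directory where the bundled shared libraries were extracted. The names used here (startWorker, extractedLibDir, workerBinary) are hypothetical and are not taken from the LocalAI codebase.

```go
// Hypothetical sketch: starting a worker binary so that it can resolve the
// shared libraries bundled inside a single-binary build, mirroring how the
// other gRPC backends are launched. Names and paths are illustrative only.
package main

import (
	"os"
	"os/exec"
)

// startWorker runs the worker binary with LD_LIBRARY_PATH pointing at the
// directory where the bundled shared libraries were extracted.
func startWorker(workerBinary, extractedLibDir string, args []string) error {
	cmd := exec.Command(workerBinary, args...)
	cmd.Env = append(os.Environ(), "LD_LIBRARY_PATH="+extractedLibDir)
	cmd.Stdout = os.Stdout
	cmd.Stderr = os.Stderr
	return cmd.Run()
}
```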


netlify bot commented Jun 21, 2024

Deploy Preview for localai ready!

🔨 Latest commit: 7aed455
🔍 Latest deploy log: https://app.netlify.com/sites/localai/deploys/6675baec73fd3c00088276b6
😎 Deploy Preview: https://deploy-preview-2620--localai.netlify.app

@mudler added the bug (Something isn't working) label on Jun 21, 2024
@mudler force-pushed the fix_worker branch 2 times, most recently from ef2ad1c to c66f542 on June 21, 2024 08:50
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
@mudler (Owner, Author) commented Jun 22, 2024

Works here

@mudler mudler merged commit 8d84dd4 into master Jun 22, 2024
38 checks passed
@mudler mudler deleted the fix_worker branch June 22, 2024 07:33
Labels
bug Something isn't working
Projects
None yet
Development

Successfully merging this pull request may close these issues.

worker llama-cpp-rpc crash
1 participant