
Bug in Dev Strategy - No Support for LOCAL only models (StarCoder, Llama2, others) for Generative AI models for coding #360

Closed
richlysakowski opened this issue Aug 26, 2023 · 3 comments
Labels: bug (Something isn't working)

Comments

@richlysakowski

There is a bug in the development strategy.

You are wedding everyone to big cloud AI platforms, not to democratized LOCAL models. None of the supported models run locally. Who is driving the strategy? The AI Cloud Monsters, who will insource and consume all IT and AI jobs if we let them.

Please address this issue by supporting an LLM for coding that runs 100% LOCALLY.

Description

I also have a very strong and active interest in running local, "offline", high-performing, "democratized" AI systems that do not use major cloud platforms and that are not considered "edge" nodes on the global internet.

What people call "The Cloud" is really a small set of large cloud platform vendors CONSPIRING to outsource all IT and computing jobs to their own corporations, leaving millions of citizens without meaningful work. If normal citizens don't outsource "The Cloud" back to "The Edge", entire segments of white-collar and brown-collar society will be marginalized within the next 10 years. Cloud vendors will OWN, i.e., control, all the jobs they insourced to their cloud business. People on "The Edge" are on the margins of society.

Local, democratic AI brings power back to the people. The Cloud takes power away from the people. Let's make sure AI stays democratized, offline and online. Remember "Human Lives Matter" (HLM)! Cloud computing is a tool, not the endpoint for society.

richlysakowski added the bug label on Aug 26, 2023
@JasonWeill
Collaborator

@richlysakowski Thanks for your contribution! We have implemented local model support via GPT4All, resolving issue #190 in #209. Please try it out.

If you find any issues with GPT4All that are not covered by the existing open issues (#348, #226), we welcome your issues and pull requests.
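
For reference, here is a minimal sketch of what trying it out looks like with the Jupyter AI magics. The model ID below is only an example from the GPT4All provider list documented at the time; the corresponding model file may need to be downloaded into GPT4All's local cache first, and the available IDs can change, so please check the current Jupyter AI docs. First, load the magics extension in a notebook cell:

```python
%load_ext jupyter_ai_magics
```

Then, in a separate cell (the cell magic must be on the first line of its cell), send a prompt to a local GPT4All model:

```python
%%ai gpt4all:ggml-gpt4all-j-v1.3-groovy
Write a Python function that reverses a string.
```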

@richlysakowski
Author

Thank you. I will check out what you have done. Can this be used to create completely local implementations of popular LLMs, such as StarCoder or CodeLlama?

@JasonWeill
Collaborator

You can find more about GPT4All here: https://gpt4all.io/

They also have their own GitHub repo, with its own issue queue for bugs and enhancements: https://github.com/nomic-ai/gpt4all/issues
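
If it helps, here is a minimal sketch of fully local generation with the gpt4all Python bindings (the underlying library, not the Jupyter AI integration itself). The model file name is only an example; whether StarCoder or CodeLlama builds are available depends on GPT4All's current model catalog, so check their site for the list.

```python
from gpt4all import GPT4All

# Example model file name; substitute any model from the GPT4All catalog
# (or a compatible file already on disk). By default the bindings download
# the file on first use; pass allow_download=False to stay strictly offline
# once the model is present locally.
model = GPT4All("ggml-gpt4all-j-v1.3-groovy.bin")

prompt = "Write a Python function that reverses a string."
print(model.generate(prompt, max_tokens=200))
```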
