
[Bug]: No module named 'llama_index.core.llms.generic_utils' #11071

Closed
MudassirAqeelAhmed opened this issue Feb 21, 2024 · 18 comments
Labels
bug Something isn't working triage Issue needs to be triaged/prioritized

Comments

@MudassirAqeelAhmed

Bug Description

I installed llama-index on google colab notebook.
!pip install llama-index-embeddings-anyscale
!pip install -U llama-index llama-index-core llama-index-llms-openai

I'm trying to import

from llama_index.embeddings.anyscale import AnyscaleEmbedding

I'm getting this error

ModuleNotFoundError: No module named 'llama_index.core.llms.generic_utils'

Version

llama-index==0.10.10

Steps to Reproduce

Just install llama-index, and llama-index-embeddings-anyscale on colab and import

from llama_index.embeddings.anyscale import AnyscaleEmbedding

Relevant Logs/Tracebacks

---------------------------------------------------------------------------
ModuleNotFoundError                       Traceback (most recent call last)
[<ipython-input-7-6245d537c86e>](https://localhost:8080/#) in <cell line: 21>()
     19 from langchain_core.documents import Document
     20 from langchain.vectorstores.utils import maximal_marginal_relevance
---> 21 from llama_index.embeddings.anyscale import AnyscaleEmbedding
     22 from itertools import islice
     23 from typing import Iterable, Iterator, List, TypeVar, Dict, Tuple, Union, Optional

2 frames
[/usr/local/lib/python3.10/dist-packages/llama_index/embeddings/anyscale/utils.py](https://localhost:8080/#) in <module>
      1 from typing import Optional, Tuple
      2 
----> 3 from llama_index.core.llms.generic_utils import get_from_param_or_env
      4 
      5 DEFAULT_ANYSCALE_API_BASE = "https://api.endpoints.anyscale.com/v1"

ModuleNotFoundError: No module named 'llama_index.core.llms.generic_utils'

---------------------------------------------------------------------------
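As later comments in this thread note, this helper moved under llama_index.core.base.llms in the v0.10.x core package. A defensive import that tries both locations is one way to bridge the gap; this is a sketch assuming only the module path changed between releases, not an official API:

```python
# Sketch: llama-index v0.10.x moved generic_utils under
# llama_index.core.base.llms, while older releases kept it at
# llama_index.core.llms. Try the new path first, then fall back.
import importlib


def import_generic_utils():
    candidates = (
        "llama_index.core.base.llms.generic_utils",  # v0.10.x location
        "llama_index.core.llms.generic_utils",       # older location
    )
    for path in candidates:
        try:
            return importlib.import_module(path)
        except ModuleNotFoundError:
            continue
    raise ModuleNotFoundError(
        "generic_utils not found in any known location; "
        "try: pip install -U llama-index-core"
    )
```

This is a stopgap for code you control; it does not help when the broken import sits inside an installed integration package such as llama-index-embeddings-anyscale.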
@MudassirAqeelAhmed MudassirAqeelAhmed added bug Something isn't working triage Issue needs to be triaged/prioritized labels Feb 21, 2024

dosubot bot commented Feb 21, 2024

🤖

Hey @MudassirAqeelAhmed! Good to see you back here. Hope you're doing well.

Based on the error message you provided, it seems like the module 'llama_index.core.llms.generic_utils' is not found in your Python environment. This could be due to a few reasons:

  1. The module 'llama_index.core.llms.generic_utils' does not exist in the LlamaIndex package. This could be a bug in the package, or the module might have been removed in a recent update.

  2. The LlamaIndex package is not properly installed in your Python environment. You can try reinstalling the package using pip:

     !pip uninstall llama-index
     !pip install llama-index

  3. The LlamaIndex package is not compatible with the Python version you are using. You can check the Python version compatibility of the LlamaIndex package.

  4. The LlamaIndex package might have dependencies that are not installed in your Python environment. You can check the package documentation for any dependencies and install them.

However, without more information about the LlamaIndex package and its dependencies, it's hard to provide a definitive solution. Please check the package documentation or contact the package maintainers for more information.

In the meantime, I found some similar issues in the LlamaIndex repository that might be helpful:

  1. [Bug]: Getting module not found error for llama_index.core.llms.generic_utils which was last updated on February 21, 2024. The suggested solution is to wait for about 10 minutes and then run pip install -U llama-index-core llama-index-llms-openai llama-index-embeddings-openai.

  2. [Bug]: No module named 'llama_index.core.llms.llama_utils' which was last updated on February 14, 2024. The suggested solution is to use temporary legacy imports or use the command-line tool to update existing code and notebooks after installing v0.10.0.

  3. [Bug]: langchain.embeddings.base not found which was last updated on September 18, 2023. This issue was closed by a pull request: fix and bump langchain dependencies on September 18, 2023.

I hope this helps! If you have any other questions or need further clarification, feel free to ask.

Sources

About Dosu

This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.

Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.

If you want to continue the conversation, start your reply with @dosu-bot.

@hkrsmk
Contributor

hkrsmk commented Feb 21, 2024

When I tried this, the issue had been fixed in llama-index-core, but when I checked the file path, the code hadn't changed in the installed package itself. It might require updating pyproject.toml, but that's going to be a pain. For now, you can try running it as a .py script and patching the file manually.

@logan-markewich
Collaborator

@MudassirAqeelAhmed also update the openai embeddings -- pip install -U llama-index-embeddings-openai

@MudassirAqeelAhmed
Author

(screenshot attached)

@MudassirAqeelAhmed also update the openai embeddings -- pip install -U llama-index-embeddings-openai

@ravi03071991
Contributor

pip uninstall llama-index  # remove any global
python -m venv venv
source venv/bin/activate
pip install llama-index llama-index-embeddings-openai

@MudassirAqeelAhmed can you try this?

@MudassirAqeelAhmed
Author

I'm running this in a notebook.

@ravi03071991
Contributor

pip uninstall llama-index  # remove any global
python -m venv venv
source venv/bin/activate
pip install llama-index llama-index-embeddings-openai
pip install ipykernel
python -m ipykernel install --user --name=my_venv --display-name="my_venv"

and then select the my_venv kernel in your notebook

@HarmeetSingh07

HarmeetSingh07 commented Feb 22, 2024

@MudassirAqeelAhmed I had this same problem and it was solved by:
pip3 install --upgrade llama-index-core

@urigott

urigott commented Feb 23, 2024

I managed to solve this problem by copying generic_utils.py from llama_index/core/base/llms into llama_index/core/llms
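The manual copy described above can also be scripted; this sketch locates the active interpreter's site-packages via sysconfig and copies the file only if it exists (the paths assume a standard install layout):

```shell
# Find site-packages for the active interpreter, then copy
# generic_utils.py from its new location back to the old import path.
SITE=$(python3 -c "import sysconfig; print(sysconfig.get_paths()['purelib'])")
SRC="$SITE/llama_index/core/base/llms/generic_utils.py"
DST="$SITE/llama_index/core/llms/generic_utils.py"
if [ -f "$SRC" ]; then
    cp "$SRC" "$DST"
    echo "copied generic_utils.py"
else
    echo "generic_utils.py not found under $SITE (is llama-index-core installed?)"
fi
```

Note this only papers over the path mismatch; upgrading llama-index-core together with the integration packages is the cleaner fix.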

@logan-markewich
Collaborator

Or you could have installed in a fresh env / upgraded your deps 😀 👍🏻

@urigott

urigott commented Feb 24, 2024

Or you could have installed in a fresh env /upgraded your deps 😀 👍🏻

For some reason it didn't work for me 🤷

@manjunathshiva

I managed to solve this problem by copying generic_utils.py from llama_index/core/base/llms into llama_index/core/llms

Thank you! This worked for me as well! None of the other methods mentioned worked. Thank you!

@PaulBFB

PaulBFB commented Feb 26, 2024

I managed to solve this problem by copying generic_utils.py from llama_index/core/base/llms into llama_index/core/llms

I had the exact same issue, this worked for me as well, thank you!

@ycd

ycd commented Mar 6, 2024

Pretty much every integration I try to use with LLamaIndex ends up being problematic 😭

@logan-markewich
Collaborator

logan-markewich commented Mar 6, 2024

@ycd probably more productive to share the issues and get a resolution :)

If you are migrating from v0.9.x, it's really recommended to start with a fresh venv

@ycd

ycd commented Mar 7, 2024

@logan-markewich to be fair, I was just yapping; I'd already solved it. To be more specific and add a little context: I've run into about three integrations/projects whose documentation is obsolete and doesn't 'just work', and I even ended up monkey-patching some of them to get things working (not this one).

EDIT: I don't remember the exact issue, but some of the out-of-date docs were from external projects that provide an integration with LlamaIndex (e.g. Chainlit)

@ycd

ycd commented Mar 7, 2024

Okay, I was using llama-index-core==0.10.5. I added the Anthropic provider to my project, and it raised this error as well.

    from llama_index.llms.anthropic.base import Anthropic
  File "/Users/yagu/wope/castor/env/lib/python3.11/site-packages/llama_index/llms/anthropic/base.py", line 21, in <module>
    from llama_index.core.base.llms.generic_utils import (
ModuleNotFoundError: No module named 'llama_index.core.base.llms.generic_utils'

This time, I decided to give your suggestion a try: I blew away my env and created a new one, and there it is, my env is now conflicting.

The conflict is caused by:
    The user requested llama-index-core==0.10.3
    llama-index 0.10.5 depends on llama-index-core<0.11.0 and >=0.10.0
    llama-index-agent-openai 0.1.1 depends on llama-index-core<0.11.0 and >=0.10.1
    llama-index-callbacks-langfuse 0.1.2 depends on llama-index-core<0.11.0 and >=0.10.8

Okay, great, let's upgrade to 0.10.8:

    from llama_index.core.prompts.base import (
  File "/Users/yagu/wope/castor/env/lib/python3.11/site-packages/llama_index/core/prompts/base.py", line 37, in <module>
    from llama_index.core.llms.base import BaseLLM
  File "/Users/yagu/wope/castor/env/lib/python3.11/site-packages/llama_index/core/llms/__init__.py", line 12, in <module>
    from llama_index.core.llms.custom import CustomLLM
  File "/Users/yagu/wope/castor/env/lib/python3.11/site-packages/llama_index/core/llms/custom.py", line 19, in <module>
    from llama_index.core.llms.llm import LLM
  File "/Users/yagu/wope/castor/env/lib/python3.11/site-packages/llama_index/core/llms/llm.py", line 43, in <module>
    from llama_index.core.prompts import BasePromptTemplate, PromptTemplate
ImportError: cannot import name 'BasePromptTemplate' from partially initialized module 'llama_index.core.prompts' (most likely due to a circular import)

Then I got a circular import error, found your #11032, and decided to upgrade to 0.10.15, resolving conflicts one by one like the following:

ERROR: Cannot install -r requirements.txt (line 70) and llama-index-agent-openai==0.1.1 because these package versions have conflicting dependencies.

The conflict is caused by:
    The user requested llama-index-agent-openai==0.1.1
    llama-index 0.10.15 depends on llama-index-agent-openai<0.2.0 and >=0.1.4

It's fine, I can do that, and I resolved all the conflicts. But even on 0.10.15, it still hits me with this:

    from llama_index.llms.gemini.base import Gemini
  File "/Users/yagu/wope/castor/env/lib/python3.11/site-packages/llama_index/llms/gemini/base.py", line 20, in <module>
    from llama_index.core.utilities.gemini_utils import (
ModuleNotFoundError: No module named 'llama_index.core.utilities.gemini_utils'
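Reading the resolver messages quoted above, the failures come from pinning packages below the version floors their siblings require. Purely as a sketch, a pin set consistent with the constraints in the logs above (versions taken only from this thread, not verified against PyPI) would be:

```
llama-index==0.10.15
llama-index-core>=0.10.8,<0.11.0
llama-index-agent-openai>=0.1.4,<0.2.0
llama-index-callbacks-langfuse==0.1.2
```

Though, as the final traceback shows, the integration packages themselves (here the Gemini LLM package) also have to be upgraded in lockstep with llama-index-core.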

@logan-markewich
Collaborator

logan-markewich commented Mar 7, 2024

@ycd what are your project reqs? It seems like at this point I would just start a fresh venv with latest versions of things

And yea, sadly we can't control docs from people that use the llama-index package 😅


9 participants