
[Bug]: No module named 'llama_index.core.llms.llama_utils' #10730

Closed
msamwelmollel opened this issue Feb 14, 2024 · 2 comments
Labels
bug (Something isn't working), triage (Issue needs to be triaged/prioritized)

Comments

@msamwelmollel

Bug Description

In v0.10.3, whenever I call
from llama_index.core.llms.llama_utils import (
messages_to_prompt,
completion_to_prompt,
)

I get an error that the module doesn't exist.

Version

0.10.3

Steps to Reproduce

from llama_index.core import SimpleDirectoryReader, VectorStoreIndex
from llama_index.llms.llama_cpp import LlamaCPP
from llama_index.core.llms.llama_utils import (
messages_to_prompt,
completion_to_prompt,
)

Relevant Logs/Tracebacks

from llama_index.core.llms.llama_utils import (
ModuleNotFoundError: No module named 'llama_index.core.llms.llama_utils'
msamwelmollel added the bug and triage labels on Feb 14, 2024

dosubot bot commented Feb 14, 2024

🤖

Hey @msamwelmollel, nice to see you back in the labyrinth of code! 🌀

The error "ModuleNotFoundError: No module named 'llama_index.core.llms.llama_utils'" is occurring because with the introduction of LlamaIndex v0.10.0, many imports have changed.

If you want to import specific functions from 'llama_index.core.llms.llama_utils', you have two options:

  1. Use temporary legacy imports: update your imports to use the legacy package. v0.10.0 is a large change, but the legacy package lets existing code keep running with minimal impact. For example, change your imports from:
from llama_index.core.llms.llama_utils import (
    messages_to_prompt,
    completion_to_prompt,
)

to:

from llama_index.legacy.core.llms.llama_utils import (
    messages_to_prompt,
    completion_to_prompt,
)
  2. Full migration: use the command-line tool that ships with pip install llama-index and pip install llama-index-core to update existing code and notebooks. After installing v0.10.0, you can upgrade your existing imports automatically:
llamaindex-cli upgrade-file <file_path>
# OR
llamaindex-cli upgrade <folder_path>

This will update the import statements in your .py and .md files, and new requirements will be printed to the terminal. For notebooks, new pip install statements are inserted and imports are updated.

For more details, you can refer to the migration guide for v0.10.0.


About Dosu

This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.

Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.

If you want to continue the conversation, start your reply with @dosu-bot.

@logan-markewich
Collaborator

@msamwelmollel

pip install llama-index-llms-llama-cpp (looks like you already did this)
from llama_index.llms.llama_cpp.llama_utils import messages_to_prompt, completion_to_prompt
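
For anyone landing here later, a minimal sketch of how these helpers are typically passed to LlamaCPP once the import above works; the model path and generation settings are placeholders, not taken from this issue:

# minimal sketch, assuming a local GGUF model file; adapt path and params to your setup
from llama_index.llms.llama_cpp import LlamaCPP
from llama_index.llms.llama_cpp.llama_utils import (
    messages_to_prompt,
    completion_to_prompt,
)

llm = LlamaCPP(
    model_path="./models/llama-2-7b-chat.Q4_K_M.gguf",  # placeholder model file
    temperature=0.1,
    max_new_tokens=256,
    context_window=3072,
    # wire in the prompt helpers so completions/chat are formatted for Llama 2
    messages_to_prompt=messages_to_prompt,
    completion_to_prompt=completion_to_prompt,
    verbose=True,
)

print(llm.complete("Hello!"))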
