
Allow usage without NVIDIA partner package #622

Merged
merged 1 commit into from
Feb 6, 2024

Conversation

@dlqqq (Member) commented Feb 6, 2024

Issue

#579 added a change that imports the partner package langchain_nvidia_ai_endpoints directly, causing an ImportError to be raised when jupyter_ai is imported without the partner package installed in the same environment.

[W 2024-02-06 06:22:34.590 ServerApp] jupyter_ai | error adding extension (enabled: True): The module 'jupyter_ai' could not be found (No module named 'langchain_nvidia_ai_endpoints'). Are you sure the extension is installed?
    Traceback (most recent call last):
      File "/Users/dlq/micromamba/envs/jai/lib/python3.11/site-packages/jupyter_server/extension/manager.py", line 322, in add_extension
        extpkg = ExtensionPackage(name=extension_name, enabled=enabled)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
      File "/Users/dlq/micromamba/envs/jai/lib/python3.11/site-packages/jupyter_server/extension/manager.py", line 186, in __init__
        self._load_metadata()
      File "/Users/dlq/micromamba/envs/jai/lib/python3.11/site-packages/jupyter_server/extension/manager.py", line 201, in _load_metadata
        raise ExtensionModuleNotFound(msg) from None
    jupyter_server.extension.utils.ExtensionModuleNotFound: The module 'jupyter_ai' could not be found (No module named 'langchain_nvidia_ai_endpoints'). Are you sure the extension is installed?

PR description

This PR fixes that issue by defining the NVIDIA provider in an isolated module that nothing else in jupyter_ai_magics imports, which allows that module to import from langchain_nvidia_ai_endpoints directly. This branch also catches any ImportError raised while loading the entry points and logs a short, helpful warning to the terminal:

[W 2024-02-06 07:16:40.368 AiExtension] Unable to load model provider `nvidia-chat`. Please install the `langchain_nvidia_ai_endpoints` package.

Reviewing this PR

You will need to re-install the package, then test both cases:

jlpm dev-uninstall && jlpm dev-install
jupyter lab # test case with partner package installed
pip uninstall langchain_nvidia_ai_endpoints
jupyter lab # test case with partner package uninstalled

Callout for future work

One issue with this PR is that it breaks our provider convention where everything is re-exposed at the top-level package. That is, the statement

from jupyter_ai_magics import <provider>

works for AI21Provider, but fails for ChatNVIDIAProvider.

In the future, I would actually like to abandon our convention of exposing all the providers at the package root. We only do so now because I believed that entry points could only be exposed from the package root, which is untrue. This effort would require first removing all of these imports from packages/jupyter-ai-magics/jupyter_ai_magics/__init__.py:

# expose model providers on the package root
from .providers import (
    AI21Provider,
    AnthropicProvider,
    AzureChatOpenAIProvider,
    ...
)

Then, editing each entry point definition from:

ai21 = "jupyter_ai_magics:AI21Provider"

To a definition that specifies the source module directly:

ai21 = "jupyter_ai_magics.providers:AI21Provider"
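That `module:attr` form can be verified with importlib.metadata's EntryPoint, which parses the value without importing anything. The entry point names below are taken from the PR description; the group name is a placeholder:

```python
from importlib.metadata import EntryPoint

# An entry point value may reference any module, not just the package root.
ep = EntryPoint(
    name="ai21",
    value="jupyter_ai_magics.providers:AI21Provider",
    group="jupyter_ai.model_providers",  # placeholder group name
)
print(ep.module)  # the module to import: jupyter_ai_magics.providers
print(ep.attr)    # the attribute to resolve: AI21Provider
```

Only `ep.load()` actually imports the module, so moving a provider out of `__init__.py` does not break entry point discovery.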

@dlqqq dlqqq added the bug Something isn't working label Feb 6, 2024
@3coins (Collaborator) left a comment


@dlqqq Great job on finding a fix for this issue. I verified that both cases work as expected and was able to launch JupyterLab and use Jupyter AI. Code looks good!

With partner package installed

[I 2024-02-06 08:57:11.719 AiExtension] Registered model provider `nvidia-chat`.

Without partner package installed

[W 2024-02-06 08:58:30.219 AiExtension] Unable to load model provider `nvidia-chat`. Please install the `langchain_nvidia_ai_endpoints` package.

@JasonWeill (Collaborator) commented:

Per discussion, let's use the term community_providers to refer to NVIDIA and other newly-added providers. The term partner implies a closer relationship than we actually have.

@dlqqq (Member, Author) commented Feb 6, 2024

The term "partners" is in reference to LangChain partner packages. As this term appears only in a single source directory and not in any user-facing strings, I think this is a non-issue. Proceeding to merge.

@dlqqq dlqqq merged commit 6aeb87f into jupyterlab:main Feb 6, 2024
10 checks passed
@dlqqq (Member, Author) commented Feb 6, 2024

@meeseeksdev please backport to 1.x

meeseeksmachine pushed a commit to meeseeksmachine/jupyter-ai that referenced this pull request Feb 6, 2024
dlqqq added a commit that referenced this pull request Feb 6, 2024
@dlqqq dlqqq mentioned this pull request Mar 5, 2024
dbelgrod pushed a commit to dbelgrod/jupyter-ai that referenced this pull request Jun 10, 2024