
Make help message template configurable #938

Merged: 5 commits merged into jupyterlab:main on Aug 17, 2024

Conversation

dlqqq
Member

@dlqqq dlqqq commented Aug 6, 2024

Description

  • Makes the help message template configurable via traitlets.
  • Performs some fairly significant refactoring. The method for sending a help message has been moved from HelpChatHandler to the BaseChatHandler base class.
    • The reason for doing this is that /clear also needs to be able to send a help message. Since this functionality is shared by multiple chat handlers, it seemed like better practice to implement this exactly once in a base class and allow any chat handler to send a help message.
  • Fixes Make chat help message configurable #932
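The refactor described above can be sketched roughly as follows. This is an illustrative toy, not the actual jupyter-ai source: only `BaseChatHandler` and `HelpChatHandler` are named in the PR; the method name `send_help_message`, the `ClearChatHandler` body, and the template contents are assumptions.

```python
# Rough sketch of the refactor: the help-message sender lives on the
# base class, so both the /help and /clear handlers can reuse it.
class BaseChatHandler:
    help_message_template = "I'm {persona_name}.\n{slash_commands_list}"

    def send_help_message(self) -> str:
        # In the real extension this posts a message to the chat;
        # here we just return the rendered string.
        return self.help_message_template.format(
            persona_name="Jupyternaut",
            slash_commands_list="/ask, /clear, /learn",
        )

class HelpChatHandler(BaseChatHandler):
    def process(self) -> str:
        return self.send_help_message()

class ClearChatHandler(BaseChatHandler):
    def process(self) -> str:
        # After clearing the chat history, re-send the help message.
        return self.send_help_message()

print(HelpChatHandler().process())
```

Implementing this once on the base class avoids duplicating the rendering logic in every handler that needs to show help.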

Demo

(Screenshot: the custom "sassy" help message rendered in the chat panel, taken Aug 6, 2024.)

Testing

To reproduce the above demo:

  1. Create a new config.py file in your current directory with the contents:

```python
c.AiExtension.help_message_template = """
Sup. I'm {persona_name}. This is a sassy custom help message.

Here's the slash commands you can use. Use 'em or don't... I don't care.

{slash_commands_list}
""".strip()
```

  2. Start JupyterLab via jupyter lab --config=config.py.
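For reference, the two placeholders are ordinary Python `str.format` fields. The sketch below shows how they might be filled in; the `render_help` helper and its argument names are illustrative, not the extension's actual internals.

```python
# The template string mirrors the config.py example above.
help_message_template = """
Sup. I'm {persona_name}. This is a sassy custom help message.

Here's the slash commands you can use. Use 'em or don't... I don't care.

{slash_commands_list}
""".strip()

def render_help(template: str, persona_name: str, commands: list[str]) -> str:
    # Join the supported commands into a markdown bullet list, then
    # substitute both placeholders at once.
    slash_commands_list = "\n".join(f"* `{c}`" for c in commands)
    return template.format(
        persona_name=persona_name,
        slash_commands_list=slash_commands_list,
    )

print(render_help(help_message_template, "Jupyternaut", ["/ask", "/clear", "/learn"]))
```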

Additional notes

I've added a new TestProviderAskLearnUnsupported class in the jupyter_ai_test package for local testing. To verify: switch to this model in the settings and run /clear; /ask and /learn should not appear in the help message, since they are listed in unsupported_slash_commands.

  • We do not automatically regenerate the help message when switching between different LLMs, even if they differ in the slash commands that they support. This is a known issue.
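The filtering behavior described above can be sketched as a toy model. The attribute name `unsupported_slash_commands` and the class name `TestProviderAskLearnUnsupported` come from the PR text; the command list and the `supported_commands` helper are assumptions for illustration.

```python
# Toy sketch: build the help message's command list by excluding a
# provider's unsupported slash commands.
ALL_SLASH_COMMANDS = ["/ask", "/clear", "/generate", "/learn"]

class TestProviderAskLearnUnsupported:
    # Attribute name taken from the PR description; the class body is
    # a stand-in, not the real jupyter_ai_test code.
    unsupported_slash_commands = {"/ask", "/learn"}

def supported_commands(provider) -> list[str]:
    unsupported = getattr(provider, "unsupported_slash_commands", set())
    return [c for c in ALL_SLASH_COMMANDS if c not in unsupported]

print(supported_commands(TestProviderAskLearnUnsupported()))  # ['/clear', '/generate']
```

Note that, per the known issue above, this list is computed when the help message is generated; switching providers does not regenerate it automatically.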

@dlqqq dlqqq added the enhancement New feature or request label Aug 6, 2024
@dlqqq dlqqq force-pushed the config-help branch 2 times, most recently from d404cf2 to 7ea37d5 Compare August 6, 2024 23:17
@michaelchia
Collaborator

michaelchia commented Aug 7, 2024

Hello! Since you are making this change, perhaps we can resolve #851 by splitting the welcome message and the help message. I think it would make more sense to make the welcome message configurable via this traitlet, while having the help chat handler define the help message. What do you think?

@dlqqq
Member Author

dlqqq commented Aug 7, 2024

@michaelchia I agree that we should split the welcome message and the help message in the future! This PR doesn't exclude that possibility; it's merely making the help message configurable.

@krassowski
Member

I think this also fixes #927

Collaborator

@srdas srdas left a comment

  1. Reviewed code, LGTM.
  2. Also tested the branch and it works as expected.
  3. I will add documentation for this. It can be added near the beginning of the section "The chat interface", which also currently lacks an image of the slash commands.

dlqqq and others added 5 commits August 16, 2024 13:29
Updated the user documentation (`index.md`) to explain how to create a custom `config.py` that defines a custom help message.
Collaborator

@JasonWeill JasonWeill left a comment

Looks good! Thanks @srdas for adding docs to @dlqqq's great change!

@srdas srdas merged commit 83e368b into jupyterlab:main Aug 17, 2024
8 checks passed
@dlqqq dlqqq deleted the config-help branch August 26, 2024 21:48
Successfully merging this pull request may close these issues:

  • Make chat help message configurable
  • Persona identity is not respected in prompt templates