
Group messages with their replies #832

Merged · 5 commits into jupyterlab:main · Jun 19, 2024
Conversation

michaelchia (Collaborator)

Description

Sort the messages displayed in the UI so that each reply is repositioned directly below the message it is replying to.
Currently, if the user sends multiple messages before a reply arrives, it is not obvious which reply corresponds to which message.

This change groups the messages so that each reply is inserted below the message it is replying to.

Demo

Screen.Recording.2024-06-13.at.7.42.54.PM.mov

Implementation

Sorting is done by first comparing the index of the message being replied to (for agent messages with reply_to) or of the message itself (for human messages, or agent messages without reply_to); ties are then broken by the message's own index. An alternative would be to use timestamps instead of indices. As far as I can tell, the two should be equivalent, but if there is some scenario where they could differ, we can switch to the more correct option.
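The comparison described above can be sketched as follows. The Message shape and field names here are hypothetical stand-ins, not jupyter-ai's actual message model:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Message:
    id: str
    reply_to: Optional[str] = None  # id of the human message this reply answers

def grouped_key(messages):
    """Build a sort key that groups each reply under its parent message."""
    index = {m.id: i for i, m in enumerate(messages)}
    def key(m):
        i = index[m.id]
        # Compare by the parent's index first (or our own index, if this is
        # not a reply), then break ties with the message's own index.
        parent = index.get(m.reply_to, i) if m.reply_to else i
        return (parent, i)
    return key

msgs = [
    Message("msg1"),
    Message("msg2"),
    Message("reply2", reply_to="msg2"),
    Message("reply1", reply_to="msg1"),
]
ordered = sorted(msgs, key=grouped_key(msgs))
# reply1 now sits directly below msg1, and reply2 directly below msg2
```

Using the parent's index as the primary key is what keeps sibling human messages in send order while pulling each reply up next to its parent.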

Sorting is done in the ChatMessages component just before rendering. As such, the entire message list is re-sorted on every render. I would hope this is a negligible performance overhead, as this was the least intrusive way I could think of to implement the feature. Let me know if a more efficient solution is required.

Implications

No guaranteed time ordering

Messages are no longer ordered by their displayed timestamp. An agent reply with a later timestamp can be shown before other messages with an earlier timestamp.

Three different orders of chat history

Previously there were two: 1) the chat history in the frontend (which matches the chat history on the server), and 2) the chat history in the DefaultChatHandler LangChain memory. Now there is also 3) the chat history as rendered in the chat UI.

If all instances of 1) exist only to serve 3), then this is probably not an issue. It is just a matter of whether you are comfortable relying on that assumption.

There was never an assumption of 1-to-1 matching between 1) and 2) anyway, since not all displayed messages are in the DefaultChatHandler LangChain memory (which is a useful feature to have). However, in the multiple-messages-before-reply scenario, it was previously ambiguous to the user how the chat would be reflected in the LangChain memory, so they may not have formed firm assumptions. With this change, messages are tied to their replies, and it is now possible for the order of message-reply pairs to differ completely from the LangChain memory, with the user having no reason to suspect it. This is because the LangChain memory only inserts a human-agent pair when the reply is received, so it is ordered by agent reply, whereas the chat UI is ordered by when each human message was sent. E.g. if the user sends two messages, msg1 and msg2, and reply2 arrives before reply1, the order in the LangChain memory would be [msg2, reply2, msg1, reply1], whereas the UI would show [msg1, reply1, msg2, reply2].

I can't think of a good solution to this. Reordering the UI to match the LangChain memory would mean human messages could appear in a different order than they were sent, which seems highly unnatural. Attempting to reorder the LangChain memory would add a lot of extra complexity. I would suggest living with this possible mismatch and simply being aware of it; the multiple-messages-before-reply scenario should be relatively rare anyway.
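A minimal simulation of the example above (message names as in the text; the data structures are purely illustrative):

```python
# Two human messages, in send order; their replies arrive out of order.
sent = ["msg1", "msg2"]
arrived = [("reply2", "msg2"), ("reply1", "msg1")]  # (reply, human), arrival order

# LangChain memory: a (human, agent) pair is appended only when the reply arrives.
memory = []
for reply, human in arrived:
    memory += [human, reply]

# Chat UI: each reply is grouped directly under its human message,
# and human messages keep their send order.
ui = []
for human in sent:
    ui.append(human)
    ui += [r for r, h in arrived if h == human]
```

Running this yields memory == [msg2, reply2, msg1, reply1] but ui == [msg1, reply1, msg2, reply2], the mismatch described above.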

Remember to self.reply with human_message

When writing a ChatHandler that may reply more than once, it is important to include the optional human_message param for the earlier replies.

async def process_message(self, message):
    self.reply("first reply")            # no human_message: not tied to `message`
    self.reply("second reply", message)  # tied to the human message

In the example above, "second reply" would be displayed before "first reply" in the UI: "first reply" is not marked as replying to the human message, whereas "second reply" is, so it is displayed immediately after the human message and before "first reply".
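Under the grouping sort, this effect can be sketched with plain tuples (all names are illustrative; this is not jupyter-ai's actual data model):

```python
# Each entry: name -> (own_index, index of the message it replies to, or None).
messages = {
    "human message": (0, None),
    "first reply": (1, None),   # self.reply("first reply"): no human_message
    "second reply": (2, 0),     # self.reply("second reply", message)
}

def key(item):
    i, parent = item[1]
    # Group by the parent's index when present, else the message's own index.
    return (parent if parent is not None else i, i)

ordered = [name for name, _ in sorted(messages.items(), key=key)]
# "second reply" groups under the human message; "first reply" comes last
```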

@srdas srdas (Collaborator) left a comment

Nice enhancement!

  1. I have checked this branch and it works as intended.
  2. The only nit is that the ordering may vary depending on the order in which responses arrive, as noted in the PR. This will impact the conversational memory, which currently defaults to the two previous chat messages/responses.
  3. Please go ahead and rebase and make sure all checks have passed.

@JasonWeill JasonWeill added the enhancement New feature or request label Jun 17, 2024
@dlqqq dlqqq (Member) left a comment

@michaelchia Thank you for your thoughtful implementation of this feature! This is an excellent change to introduce for our users.

I left a couple points of feedback below. In addition, I read through your detailed description and felt it warranted a reply.

No guaranteed time ordering

I think this is fine. Any chat application that supports "threads" (e.g. Discord, Slack, Matrix) breaks this constraint as well.

Three different orders of chat history

Yes, I agree this can be confusing. As you've pointed out, there is already a mismatch between 1) and 2). We ultimately need a unified handling of chat history in the backend, i.e. a single class that manages the LangChain conversation history object along with the sorted list of messages to serve on our ChatHistory API.

Remember to self.reply with human_message

This is a good callout. Perhaps the reply_to field of AgentChatMessage should be made mandatory to prevent this edge case from arising for developers.

Review comments on packages/jupyter-ai/src/components/chat-messages.tsx (outdated, resolved)
michaelchia and others added 2 commits June 19, 2024 12:34
Co-authored-by: david qiu <david@qiu.dev>
@dlqqq dlqqq (Member) left a comment

@michaelchia Awesome work! Thank you very much for opening a PR to improve the chat experience. Tested & verified locally, proceeding to merge.

@dlqqq dlqqq merged commit c0db930 into jupyterlab:main Jun 19, 2024
8 checks passed
@michaelchia michaelchia mentioned this pull request Sep 4, 2024
@michaelchia michaelchia deleted the group-replies branch September 4, 2024 05:49