
use fastchat conversations template #578

Merged
winglian merged 16 commits into main from fastchat-conversations on Sep 27, 2023
Conversation

@winglian (Collaborator) commented Sep 15, 2023

Use the vicuna_v1.1 Conversation template from the fastchat package. This will make it easier to extend to additional conversation types, since we can simply load an already registered conversation format template.
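
For reference, loading a registered template through fschat looks roughly like the sketch below (illustrative only; it exercises fastchat.conversation.get_conv_template directly rather than this PR's prompt strategy):

from fastchat.conversation import get_conv_template

# Load the registered vicuna_v1.1 template and render a prompt from it.
conv = get_conv_template("vicuna_v1.1")
conv.append_message(conv.roles[0], "Hello, who are you?")
conv.append_message(conv.roles[1], None)  # leave the assistant slot empty
print(conv.get_prompt())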

@NanoCode012 (Collaborator) left a comment:

Just a quick look first.

Resolved review threads: README.md (two threads), requirements.txt (outdated)
winglian merged commit e7d3e2d into main on Sep 27, 2023
4 checks passed
winglian deleted the fastchat-conversations branch on Sep 27, 2023
@@ -443,6 +443,7 @@ datasets:
data_files: # Optional[str] path to source data files
shards: # Optional[int] number of shards to split data into
name: # Optional[str] name of dataset configuration to load
conversation: # Optional[str] fastchat conversation type, only used with type: sharegpt
Suggested change:
- conversation: # Optional[str] fastchat conversation type, only used with type: sharegpt
+ conversation: # Optional[str] fastchat conversation type, only used with type: sharegpt. See options: https://github.com/lm-sys/FastChat/blob/main/fastchat/conversation.py
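
For illustration, a dataset entry using this option might look like the sketch below (the path is a placeholder; any template registered in fastchat/conversation.py can be referenced):

datasets:
  - path: ./data/sharegpt_conversations.json  # placeholder dataset path
    type: sharegpt
    conversation: vicuna_v1.1  # name of a registered fastchat template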

import fastchat.conversation

# Monkeypatch: override these methods on fastchat's Conversation so axolotl
# can pull individual turns out of a conversation, not just one full prompt.
fastchat.conversation.Conversation.get_turns = get_turns
fastchat.conversation.Conversation.get_prompt = get_prompt
@NanoCode012 (Collaborator) commented Sep 27, 2023:
May I ask what changed to need this patch? I see a lot of similarity with the Conversation class from fschat.

@winglian (Collaborator, Author) replied:
Yeah. Mostly the ability to reuse most of the predefined conversation types defined there.
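
For context, a minimal sketch of what such a get_turns monkeypatch can look like (illustrative only, not this PR's implementation; it assumes messages are stored as (role, message) pairs and only handles fastchat's simple colon-separated style, whereas the commit history below notes the real patch works with multiple conversation styles):

def get_turns(self):
    # Yield each turn as its own string instead of one concatenated prompt,
    # so the tokenizer can build labels/masks per turn.
    for role, message in self.messages:
        if message:
            yield f"{role}: {message}{self.sep}"
        else:
            # An empty message marks the slot the model should complete.
            yield f"{role}:"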

@NanoCode012 (Collaborator) commented:
Oh, didn't see it got merged while I was checking it

kvikk mentioned this pull request on Oct 6, 2023
mkeoliya pushed a commit to mkeoliya/axolotl that referenced this pull request on Dec 15, 2023:
* use fastchat conversations template

* require fastchat (fschat) pip install

* handle roles dynamically from conversation

* tweak fastchat conversation with a monkeypatch to get individual turns

* fix up so it works with multiple conversation styles, and don't strip the turns

* fix sharegpt fixture now that we're using a more correct tokenization

* use a new prompter and support fastchat conversation type

* use sharegpt from prompt strategies now

* update docs, add chatml template

* add a newline after im_end token

* ensure we correctly set system message

* update per PR feedback to handle deprecated sharegpt types

* don't add duplicate wandb req

* make sharegpt fields configurable from yml

* llama2 fixes

* don't fail fatally when turns are improper
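
As background on the registry this PR builds on, the sketch below shows how a custom template could be registered with fschat and then referenced from the conversation: field. The template name here is hypothetical, and the Conversation field names follow recent fschat releases (older releases used a single system field), so check the installed version's fastchat/conversation.py:

from fastchat.conversation import (
    Conversation,
    SeparatorStyle,
    get_conv_template,
    register_conv_template,
)

# Register a hypothetical ChatML-style template under a custom name.
register_conv_template(
    Conversation(
        name="my-chatml",  # hypothetical name used for illustration
        system_template="<|im_start|>system\n{system_message}",
        system_message="You are a helpful assistant.",
        roles=("<|im_start|>user", "<|im_start|>assistant"),
        sep_style=SeparatorStyle.CHATML,
        sep="<|im_end|>",
    )
)

# Anything registered this way can then be looked up by name.
conv = get_conv_template("my-chatml")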