Add gemma-2-9b-it-SimPO and gemma-2-9b-it-DPO to AlpacaEval (#368)
* Add gemma-2-9b-it-SimPO and gemma-2-9b-it-DPO to AlpacaEval
* Update configs.yaml
* Update configs.yaml
* update leaderboard location
* fix leaderboard

Co-authored-by: Yann Dubois <yanndubois96@gmail.com>
1 parent a80fc97, commit 783c4b5
Showing 13 changed files with 162,345 additions and 1 deletion.
results/gemma-2-9b-it-DPO/weighted_alpaca_eval_gpt4_turbo/annotations.json: 64,513 additions, 0 deletions (large diff not rendered)
results/gemma-2-9b-it-DPO/weighted_alpaca_eval_gpt4_turbo/leaderboard.csv: 187 additions, 0 deletions (large diff not rendered)
results/gemma-2-9b-it-SimPO/reference_outputs.json: 4,832 additions, 0 deletions (large diff not rendered)
results/gemma-2-9b-it-SimPO/weighted_alpaca_eval_gpt4_turbo/annotations.json: 78,089 additions, 0 deletions (large diff not rendered)
results/gemma-2-9b-it-SimPO/weighted_alpaca_eval_gpt4_turbo/leaderboard.csv: 188 additions, 0 deletions (large diff not rendered)
src/alpaca_eval/models_configs/gemma-2-9b-it-DPO/configs.yaml: 16 additions, 0 deletions
@@ -0,0 +1,16 @@
gemma-2-9b-it-DPO:
  completions_kwargs:
    batch_size: 900
    do_sample: true
    max_new_tokens: 4096
    model_kwargs:
      torch_dtype: bfloat16
    model_name: princeton-nlp/gemma-2-9b-it-DPO
    stop_token_ids:
      - 1
      - 107
    temperature: 0.5
    top_p: 1.0
  fn_completions: vllm_local_completions
  pretty_name: gemma-2-9b-it-DPO
  prompt_template: gemma-2-9b-it-DPO/prompt.txt
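A minimal sketch of consuming a config like the one above (assumes PyYAML is available; this mirrors the YAML fields shown, not alpaca_eval internals):

```python
import yaml  # PyYAML; hypothetical consumer of the config above

CONFIG = """
gemma-2-9b-it-DPO:
  completions_kwargs:
    batch_size: 900
    do_sample: true
    max_new_tokens: 4096
    model_kwargs:
      torch_dtype: bfloat16
    model_name: princeton-nlp/gemma-2-9b-it-DPO
    stop_token_ids:
      - 1
      - 107
    temperature: 0.5
    top_p: 1.0
  fn_completions: vllm_local_completions
  pretty_name: gemma-2-9b-it-DPO
  prompt_template: gemma-2-9b-it-DPO/prompt.txt
"""

# The top-level key names the model; completions_kwargs are the
# generation parameters passed to the completion backend.
cfg = yaml.safe_load(CONFIG)["gemma-2-9b-it-DPO"]
kwargs = cfg["completions_kwargs"]
print(kwargs["model_name"])      # princeton-nlp/gemma-2-9b-it-DPO
print(kwargs["stop_token_ids"])  # [1, 107]
print(cfg["fn_completions"])     # vllm_local_completions
```

Note that generation is stochastic (`do_sample: true`, `temperature: 0.5`), so completions will vary run to run.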
@@ -0,0 +1,3 @@
<bos><start_of_turn>user
{instruction}<end_of_turn>
<start_of_turn>model
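The three lines above are the Gemma-2 chat format with an `{instruction}` placeholder. Filling it can be sketched with plain `str.format` (assuming the instruction contains no literal braces):

```python
# Gemma-2 chat template as added in the prompt file above
PROMPT_TEMPLATE = (
    "<bos><start_of_turn>user\n"
    "{instruction}<end_of_turn>\n"
    "<start_of_turn>model\n"
)

def build_prompt(instruction: str) -> str:
    """Substitute one instruction into the Gemma-2 chat template."""
    return PROMPT_TEMPLATE.format(instruction=instruction)

prompt = build_prompt("Name three primary colors.")
print(prompt)
```

The model's reply is generated after `<start_of_turn>model`, and decoding stops at the configured stop tokens (`<end_of_turn>` is token id 107 in the Gemma-2 vocabulary, matching `stop_token_ids` in the configs).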
src/alpaca_eval/models_configs/gemma-2-9b-it-SimPO/configs.yaml: 16 additions, 0 deletions
@@ -0,0 +1,16 @@
gemma-2-9b-it-SimPO:
  completions_kwargs:
    batch_size: 900
    do_sample: true
    max_new_tokens: 4096
    model_kwargs:
      torch_dtype: bfloat16
    model_name: princeton-nlp/gemma-2-9b-it-SimPO
    stop_token_ids:
      - 1
      - 107
    temperature: 0.5
    top_p: 1.0
  fn_completions: vllm_local_completions
  pretty_name: gemma-2-9b-it-SimPO
  prompt_template: gemma-2-9b-it-DPO/prompt.txt
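The SimPO config is identical to the DPO config except for the model identifiers; notably, both point at the same `gemma-2-9b-it-DPO/prompt.txt` template, since the two models share the Gemma-2 chat format. A small sketch over hand-copied fields from the two configs above makes the difference explicit:

```python
# Key fields copied from the two configs above, to highlight what differs
dpo = {
    "model_name": "princeton-nlp/gemma-2-9b-it-DPO",
    "pretty_name": "gemma-2-9b-it-DPO",
    "prompt_template": "gemma-2-9b-it-DPO/prompt.txt",
    "temperature": 0.5,
    "max_new_tokens": 4096,
}
simpo = {
    "model_name": "princeton-nlp/gemma-2-9b-it-SimPO",
    "pretty_name": "gemma-2-9b-it-SimPO",
    "prompt_template": "gemma-2-9b-it-DPO/prompt.txt",  # shared with DPO
    "temperature": 0.5,
    "max_new_tokens": 4096,
}

differing = sorted(k for k in dpo if dpo[k] != simpo[k])
print(differing)  # ['model_name', 'pretty_name']
```

Keeping decoding parameters identical across the two configs means any leaderboard gap between the models reflects the training objective (DPO vs. SimPO), not the sampling setup.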