tweak config to work #196

Merged 1 commit on Jun 13, 2023
examples/openllama-3b/config.yml (11 changes: 6 additions & 5 deletions)
@@ -26,17 +26,18 @@ wandb_watch:
 wandb_run_id:
 wandb_log_model:
 output_dir: ./openllama-out
-batch_size: 16
-micro_batch_size: 4
+gradient_accumulation_steps: 1
+micro_batch_size: 1
 num_epochs: 3
 optimizer: adamw_bnb_8bit
 torchdistx_path:
 lr_scheduler: cosine
-learning_rate: 0.0002
+learning_rate: 0.00001
 train_on_inputs: false
 group_by_length: false
 float16: true
-fp16: true
+bf16: false
+fp16: false
 tf32: false
 gradient_checkpointing: true
 early_stopping_patience:
@@ -52,7 +53,7 @@ eval_steps: 50
 save_steps:
 debug:
 deepspeed:
-weight_decay: 0.0
+weight_decay: 0.1
 fsdp:
 fsdp_config:
 special_tokens:
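
For context on the batching change: the old config set batch_size: 16 alongside micro_batch_size: 4, while the new one pins both micro_batch_size and gradient_accumulation_steps to 1. Below is a minimal Python sketch of the conventional relationship between these keys; the helper name and the num_gpus parameter are illustrative assumptions, not axolotl API.

```python
# Hedged sketch of how these config keys usually combine.
# Nothing here quotes axolotl's internals.

def effective_batch_size(micro_batch_size: int,
                         gradient_accumulation_steps: int,
                         num_gpus: int = 1) -> int:
    # Samples that contribute to each optimizer step.
    return micro_batch_size * gradient_accumulation_steps * num_gpus

# Old config: batch_size: 16 with micro_batch_size: 4, which (assuming the
# common batch_size // micro_batch_size convention) implied 4 accumulation steps.
assert effective_batch_size(4, 4) == 16
# New config: one sample per optimizer step on a single GPU.
assert effective_batch_size(1, 1) == 1
```

Under that reading, the PR shrinks the per-step batch from 16 to 1 and cuts the learning rate from 2e-4 to 1e-5, which is the usual direction of adjustment when the effective batch gets much smaller.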
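The remaining keys map fairly directly onto Hugging Face TrainingArguments, which axolotl builds on. A hedged sketch of that mapping, using real transformers parameter names; the exact wiring inside axolotl may differ.

```python
from transformers import TrainingArguments

# Sketch only: each keyword mirrors a key from the config as it stands
# after this PR.
args = TrainingArguments(
    output_dir="./openllama-out",
    per_device_train_batch_size=1,   # micro_batch_size: 1
    gradient_accumulation_steps=1,   # gradient_accumulation_steps: 1
    num_train_epochs=3,              # num_epochs: 3
    optim="adamw_bnb_8bit",          # optimizer: adamw_bnb_8bit
    lr_scheduler_type="cosine",      # lr_scheduler: cosine
    learning_rate=1e-5,              # learning_rate: 0.00001
    fp16=False,                      # fp16: false -> no float16 AMP
    bf16=False,                      # bf16: false -> no bfloat16 AMP
    tf32=False,                      # tf32: false -> no TF32 matmuls
    gradient_checkpointing=True,     # gradient_checkpointing: true
    weight_decay=0.1,                # weight_decay: 0.1
)
```

Note that fp16/bf16 here control automatic mixed precision, whereas the config's separate float16 key (left untouched by this PR) governs the dtype the model itself is loaded in, so turning fp16 and bf16 off while keeping float16: true reads as "load in half precision, train without AMP" under that interpretation of the options.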