### Expected Behavior

Fine-tuning phi with `qlora.yaml` should work.
### Current behaviour

Getting this stack trace:
```
[2023-10-02 05:40:42,261] [INFO] [axolotl.normalize_config:120] [PID:213973] [RANK:0] GPU memory usage baseline: 0.000GB (+0.610GB misc)
[2023-10-02 05:40:45,125] [DEBUG] [axolotl.load_tokenizer:75] [PID:213973] [RANK:0] EOS: 50256 / <|endoftext|>
[2023-10-02 05:40:45,125] [DEBUG] [axolotl.load_tokenizer:76] [PID:213973] [RANK:0] BOS: 50256 / <|endoftext|>
[2023-10-02 05:40:45,126] [DEBUG] [axolotl.load_tokenizer:77] [PID:213973] [RANK:0] PAD: None / None
[2023-10-02 05:40:45,126] [DEBUG] [axolotl.load_tokenizer:78] [PID:213973] [RANK:0] UNK: 50256 / <|endoftext|>
[2023-10-02 05:40:45,840] [INFO] [axolotl.load_tokenized_prepared_datasets:126] [PID:213973] [RANK:0] Loading prepared dataset from disk at last_run_prepared/adc6b39be9226d31a1783d546e83a96e...
[2023-10-02 05:40:46,217] [INFO] [axolotl.load_tokenized_prepared_datasets:128] [PID:213973] [RANK:0] Prepared dataset loaded from disk...
[2023-10-02 05:40:46,520] [INFO] [axolotl.calculate_total_num_steps:526] [PID:213973] [RANK:0] total_num_steps: 3644
[2023-10-02 05:40:46,528] [INFO] [axolotl.train.train:48] [PID:213973] [RANK:0] loading tokenizer... microsoft/phi-1_5
[2023-10-02 05:40:48,649] [DEBUG] [axolotl.load_tokenizer:75] [PID:213973] [RANK:0] EOS: 50256 / <|endoftext|>
[2023-10-02 05:40:48,650] [DEBUG] [axolotl.load_tokenizer:76] [PID:213973] [RANK:0] BOS: 50256 / <|endoftext|>
[2023-10-02 05:40:48,650] [DEBUG] [axolotl.load_tokenizer:77] [PID:213973] [RANK:0] PAD: None / None
[2023-10-02 05:40:48,650] [DEBUG] [axolotl.load_tokenizer:78] [PID:213973] [RANK:0] UNK: 50256 / <|endoftext|>
[2023-10-02 05:40:48,650] [INFO] [axolotl.train.train:56] [PID:213973] [RANK:0] loading model and (optionally) peft_config...
Traceback (most recent call last):
  File "/opt/conda/lib/python3.10/runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/opt/conda/lib/python3.10/runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "/home/gcpuser/axolotl/src/axolotl/cli/train.py", line 38, in <module>
    fire.Fire(do_cli)
  File "/opt/conda/lib/python3.10/site-packages/fire/core.py", line 141, in Fire
    component_trace = _Fire(component, args, parsed_flag_args, context, name)
  File "/opt/conda/lib/python3.10/site-packages/fire/core.py", line 475, in _Fire
    component, remaining_args = _CallAndUpdateTrace(
  File "/opt/conda/lib/python3.10/site-packages/fire/core.py", line 691, in _CallAndUpdateTrace
    component = fn(*varargs, **kwargs)
  File "/home/gcpuser/axolotl/src/axolotl/cli/train.py", line 34, in do_cli
    train(cfg=parsed_cfg, cli_args=parsed_cli_args, dataset_meta=dataset_meta)
  File "/home/gcpuser/axolotl/src/axolotl/train.py", line 57, in train
    model, peft_config = load_model(cfg, tokenizer, inference=cli_args.inference)
  File "/home/gcpuser/axolotl/src/axolotl/utils/models.py", line 187, in load_model
    modeling_phi = importlib.import_module(module_name)
  File "/opt/conda/lib/python3.10/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1004, in _find_and_load_unlocked
ModuleNotFoundError: No module named 'transformers_modules.microsoft.phi-1_5.b6a7e2fe15c21f5847279f23e280cc5a0e7049ef.modeling_mixformer_sequential'
```
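The failure happens inside axolotl's `load_model`, which constructs the remote-code module name (`modeling_mixformer_sequential`) itself and imports it with `importlib`. A quick way to check whether the cached phi-1_5 snapshot still ships a module by that name is to list the Hugging Face dynamic-module cache. This is an editor-added diagnostic sketch, not part of the original report; the cache path assumes the default location and no `HF_HOME` override:

```python
# Diagnostic sketch: list the remote-code files cached for microsoft/phi-1_5.
# Assumes the default Hugging Face module cache under ~/.cache/huggingface;
# adjust the root if HF_HOME or HF_MODULES_CACHE is set in your environment.
from pathlib import Path

cache_root = Path.home() / ".cache" / "huggingface" / "modules" / "transformers_modules"
snapshot_dir = cache_root / "microsoft" / "phi-1_5"

for py_file in sorted(snapshot_dir.rglob("*.py")):
    # If no modeling_mixformer_sequential.py appears here, the upstream repo
    # has changed its remote code and the module name axolotl hard-codes
    # no longer matches the cached snapshot.
    print(py_file.relative_to(cache_root))
```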
### Steps to reproduce

Config file used:
config_phi = """ base_model: microsoft/phi-1_5 base_model_config: microsoft/phi-1_5 model_type: AutoModelForCausalLM tokenizer_type: AutoTokenizer is_llama_derived_model: false trust_remote_code: true load_in_8bit: false load_in_4bit: true strict: false datasets: - path: manishiitg/aditi-gpt4-instruct-v2 type: completion dataset_prepared_path: last_run_prepared val_set_size: 0.05 output_dir: ./phi-sft-out sequence_len: 1024 sample_packing: false # not CURRENTLY compatible with LoRAs pad_to_sequence_len: adapter: qlora lora_model_dir: lora_r: 64 lora_alpha: 32 lora_dropout: 0.05 lora_target_linear: true lora_fan_in_fan_out: wandb_project: wandb_entity: wandb_watch: wandb_run_id: wandb_log_model: gradient_accumulation_steps: 1 micro_batch_size: 20 num_epochs: 4 optimizer: adamw_torch adam_beta2: 0.95 adam_epsilon: 0.00001 max_grad_norm: 1.0 lr_scheduler: cosine learning_rate: 0.000003 train_on_inputs: false group_by_length: true bf16: true fp16: false tf32: true gradient_checkpointing: early_stopping_patience: resume_from_checkpoint: local_rank: logging_steps: 1 xformers_attention: flash_attention: warmup_steps: 100 eval_steps: 0.05 save_steps: debug: deepspeed: weight_decay: 0.1 fsdp: fsdp_config: resize_token_embeddings_to_32x: true special_tokens: bos_token: "<|endoftext|>" eos_token: "<|endoftext|>" unk_token: "<|endoftext|>" pad_token: "<|endoftext|>" """
And the command:
```
!python -m axolotl.cli.train /sky-notebook/lora.yaml #phi
```
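As a cross-check outside axolotl (an editor-added sketch, not part of the original report), loading the same base model directly with `transformers` shows whether the snapshot's remote code is importable at all, and which modeling module it actually provides:

```python
# Cross-check sketch: load microsoft/phi-1_5 directly with transformers.
# If this succeeds, the remote code itself is fine and the ModuleNotFoundError
# is specific to the module name axolotl constructs in load_model.
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "microsoft/phi-1_5",
    trust_remote_code=True,  # phi-1_5 ships custom modeling code on the hub
)
# Prints the transformers_modules.* module that actually loaded, which can be
# compared against the name in the traceback above.
print(type(model).__module__)
```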
### Possible solution

No response
### Which Operating Systems are you using?

### Python Version

3.10
### axolotl branch-commit

main