remove old logging, update readme
winglian committed Nov 15, 2023
1 parent 8dbd7cc commit 1defaf2
Showing 2 changed files with 8 additions and 7 deletions.
README.md (12 changes: 8 additions & 4 deletions)
@@ -460,6 +460,14 @@ is_llama_derived_model:
 # Please note that if you set this to true, `padding_side` will be set to "left" by default
 is_mistral_derived_model:
 
+# optional overrides to the base model configuration
+model_config:
+  # RoPE Scaling https://github.com/huggingface/transformers/pull/24653
+  rope_scaling:
+    type: # linear | dynamic
+    factor: # float
+
+
 # Whether you are training a 4-bit GPTQ quantized model
 gptq: true
 gptq_groupsize: 128 # group size
@@ -726,10 +734,6 @@ landmark_attention:
 # xpos RoPE see https://github.com/kaiokendev/cutoff-len-is-context-len/blob/main/util/xpos_rope_llama_monkey_patch.py
 # LLaMA only
 xpos_rope:
-# RoPE Scaling https://github.com/huggingface/transformers/pull/24653
-rope_scaling:
-  type: # linear | dynamic
-  factor: # float
 
 # Resume from a specific checkpoint dir
 resume_from_checkpoint:
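For illustration, a minimal sketch of how the relocated `model_config.rope_scaling` override might be filled in; the `linear` type and `2.0` factor are assumed example values, not part of this commit:

```yaml
# Illustrative values only (not from this commit): apply linear RoPE scaling
# with a 2x factor to the base model's rotary position embeddings.
model_config:
  rope_scaling:
    type: linear   # one of: linear | dynamic
    factor: 2.0    # float multiplier applied to the rotary embedding positions
```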
src/axolotl/utils/models.py (3 changes: 0 additions & 3 deletions)
@@ -350,9 +350,6 @@ def load_model(
             **model_kwargs,
         )
     except Exception as err:  # pylint: disable=broad-exception-caught
-        LOG.error(
-            "Exception raised attempting to load model, retrying with AutoModelForCausalLM"
-        )
         LOG.exception(err)
         raise err

