[Required for MPT models] Expose trust_remote_code flag for HF-transformers #1630
MPT models are not yet integrated into HF `transformers`, so to load them for ONNX export we need to explicitly set the `trust_remote_code` flag when loading the config and model with the `Auto` feature from `transformers`.

Without this patch, running

```
sparseml.transformers.export_onnx ....
```

on MPT models will crash with:

```
ValueError: Loading 68e1a8e0ebb9b30f3c45c1ef6195980f29063ae2 requires you to execute the configuration file in that repo on your local machine. Make sure you have read the code there to avoid malicious use, then set the option `trust_remote_code=True` to remove this error.
```
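The shape of the change can be sketched as follows. This is a minimal illustration, not the actual patch: `hf_style_loader` is a hypothetical stand-in for `AutoConfig.from_pretrained` / `AutoModel.from_pretrained`, and `load_for_export` stands in for the export entrypoint; the real PR wires the flag through sparseml's export path.

```python
# Sketch (hypothetical names) of exposing a trust_remote_code flag on an
# export entrypoint and forwarding it to HF-style from_pretrained loaders.

def hf_style_loader(model_path, trust_remote_code=False):
    """Stand-in for AutoConfig/AutoModel.from_pretrained.

    Models whose architecture code lives only in the Hub repo (like MPT)
    refuse to load unless trust_remote_code=True is passed.
    """
    if not trust_remote_code:
        raise ValueError(
            f"Loading {model_path} requires you to execute the configuration "
            "file in that repo on your local machine. Make sure you have read "
            "the code there to avoid malicious use, then set the option "
            "trust_remote_code=True to remove this error."
        )
    return {"path": model_path, "remote_code": True}


def load_for_export(model_path, trust_remote_code=False):
    """Export entrypoint: expose the flag and forward it to the loader
    instead of relying on the hard-coded default (False)."""
    return load_config_and_model(model_path, trust_remote_code)


def load_config_and_model(model_path, trust_remote_code):
    # Both the config and the model load must receive the flag.
    return hf_style_loader(model_path, trust_remote_code=trust_remote_code)
```

With the flag left at its default, loading an MPT-style checkpoint raises the `ValueError` above; passing `trust_remote_code=True` through the entrypoint lets the load proceed.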