
KeyError: 'last_hidden_state' #1673

Closed
2 of 4 tasks
ZTurboX opened this issue Jan 30, 2024 · 2 comments
Labels
bug Something isn't working

Comments

ZTurboX commented Jan 30, 2024

System Info

optimum-1.16.2

Who can help?

@michaelbenayoun
@JingyaHuang
@echarlaix

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
  • My own task or dataset (give details below)

Reproduction (minimal, reproducible, runnable)

Export model and prediction code:

```python
from transformers import AutoTokenizer
from optimum.onnxruntime import ORTModelForFeatureExtraction, ORTOptimizer
from optimum.onnxruntime.configuration import OptimizationConfig

pretrained_model_path = './checkpoints/bge-base-zh'
export_model_path = './checkpoints/onnx'

# model = ORTModelForFeatureExtraction.from_pretrained(pretrained_model_path, from_transformers=True)
tokenizer = AutoTokenizer.from_pretrained(pretrained_model_path)

# optimizer = ORTOptimizer.from_pretrained(model)
# optimization_config = OptimizationConfig(optimization_level=99, optimize_for_gpu=True)  # enable all optimizations
#
# # apply the optimization configuration to the model
# optimizer.optimize(
#     save_dir=export_model_path,
#     optimization_config=optimization_config
# )
# provider = "CUDAExecutionProvider"

# load the previously optimized ONNX model and run a prediction
model = ORTModelForFeatureExtraction.from_pretrained(export_model_path, file_name="model_optimized.onnx")
vanilla_emb = SentenceEmbeddingPipeline(model=model, tokenizer=tokenizer)
pred = vanilla_emb("你好")
print(pred[0][:5])
```
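
Note: `SentenceEmbeddingPipeline` is not defined in the snippet above. A minimal sketch of such a pipeline, assuming the usual custom transformers `Pipeline` with mean pooling over `last_hidden_state`, could look like this (the exact class used in the report may differ):

```python
# Hypothetical definition of the SentenceEmbeddingPipeline referenced above
# (not part of the original report): a custom transformers Pipeline that
# mean-pools last_hidden_state over non-padding tokens.
import torch
from transformers import Pipeline


class SentenceEmbeddingPipeline(Pipeline):
    def _sanitize_parameters(self, **kwargs):
        # no extra preprocess/forward/postprocess parameters
        return {}, {}, {}

    def preprocess(self, inputs):
        # tokenize raw text into model inputs
        return self.tokenizer(inputs, padding=True, truncation=True, return_tensors="pt")

    def _forward(self, model_inputs):
        # run the (ONNX) model and keep the attention mask for pooling
        outputs = self.model(**model_inputs)
        return {"last_hidden_state": outputs.last_hidden_state,
                "attention_mask": model_inputs["attention_mask"]}

    def postprocess(self, model_outputs):
        # mean pooling over non-padding tokens -> one embedding per sentence
        hidden = model_outputs["last_hidden_state"]
        mask = model_outputs["attention_mask"].unsqueeze(-1).to(hidden.dtype)
        return (hidden * mask).sum(dim=1) / mask.sum(dim=1)
```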

Expected behavior

Using the bge-base-zh model, I get two errors:

1. onnxruntime.capi.onnxruntime_pybind11_state.InvalidArgument: [ONNXRuntimeError] : 2 : INVALID_ARGUMENT : Invalid Feed Input Name:token_type_ids

2. File "/mnt/data/work/.conda/envs/torch_env/lib/python3.8/site-packages/optimum/modeling_base.py", line 90, in __call__
       return self.forward(*args, **kwargs)
   File "/mnt/data/work/.conda/envs/torch_env/lib/python3.8/site-packages/optimum/onnxruntime/modeling_ort.py", line 960, in forward
       last_hidden_state = outputs[self.output_names["last_hidden_state"]]
   KeyError: 'last_hidden_state'
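
For the first error, one possible check (a sketch using plain onnxruntime, with the paths taken from the reproduction above; not part of the original report) is to inspect which inputs the optimized graph actually declares and feed only those:

```python
# Diagnostic sketch (an assumption, not the reporter's code): list the inputs the
# exported ONNX graph declares and only feed those, so a missing token_type_ids
# input does not raise INVALID_ARGUMENT.
import onnxruntime as ort
from transformers import AutoTokenizer

session = ort.InferenceSession("./checkpoints/onnx/model_optimized.onnx",
                               providers=["CPUExecutionProvider"])
onnx_input_names = {i.name for i in session.get_inputs()}
print(onnx_input_names)  # may lack 'token_type_ids' after optimization/export

tokenizer = AutoTokenizer.from_pretrained("./checkpoints/bge-base-zh")
encoded = tokenizer("你好", return_tensors="np")
feed = {name: array for name, array in encoded.items() if name in onnx_input_names}
print(session.run(None, feed)[0].shape)
```

If `token_type_ids` is absent from the printed input names, the exported/optimized graph dropped it, which would match error 1.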
ZTurboX added the bug label on Jan 30, 2024
satishsilveri commented Jan 31, 2024

I fixed the bug and created a PR.

PR: #1674

I tested the changes locally with your model and it works.

Full example notebook:
https://github.com/satishsilveri/Semantic-Search/blob/main/Optimize_SBERT/BAAI_bge_base_zh.ipynb

loretoparisi commented

@fxmarty

If I update to optimum==1.21.2 from optimum==1.16.2, I get back the old error:

    from optimum.onnxruntime import ORTModelForFeatureExtraction
  File "<frozen importlib._bootstrap>", line 1039, in _handle_fromlist
  File "/home/coder/.local/lib/python3.8/site-packages/transformers/utils/import_utils.py", line 1550, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "/home/coder/.local/lib/python3.8/site-packages/transformers/utils/import_utils.py", line 1562, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import optimum.onnxruntime.modeling_ort because of the following error (look up to see its traceback):
Failed to import optimum.exporters.onnx.__main__ because of the following error (look up to see its traceback):
cannot import name 'is_torch_less_than_1_11' from 'transformers.pytorch_utils' (/home/coder/.local/lib/python3.8/site-packages/transformers/pytorch_utils.py)
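
For context, this import failure usually points to a transformers/optimum version mismatch rather than the original bug: recent transformers releases removed `is_torch_less_than_1_11` from `transformers.pytorch_utils`, so an installed optimum that still imports it fails at import time. A quick environment check (a sketch, not from the thread) could be:

```python
# Environment check sketch (assumption: the RuntimeError is caused by a
# transformers/optimum version mismatch, since newer transformers releases
# no longer provide this helper).
import importlib.metadata as md

print("optimum      :", md.version("optimum"))
print("transformers :", md.version("transformers"))

from transformers import pytorch_utils
# False on recent transformers releases, which is what makes the import above fail.
print(hasattr(pytorch_utils, "is_torch_less_than_1_11"))
```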
