TypeError: _build_causal_attention_mask() missing 1 required positional argument: 'dtype' #2

Open
bigcash opened this issue Sep 14, 2022 · 0 comments


bigcash commented Sep 14, 2022

When I run the command `python export_df_onnx.py`, I get this error:
```
Traceback (most recent call last):
  File "export_df_onnx.py", line 136, in
    width=512,
  File "export_df_onnx.py", line 115, in convert_to_onnx
    text_encoder, check_inputs[0], check_inputs=[check_inputs[1]], strict=False
  File "/home/lingbao/.local/lib/python3.7/site-packages/torch/jit/_trace.py", line 750, in trace
    _module_class,
  File "/home/lingbao/.local/lib/python3.7/site-packages/torch/jit/_trace.py", line 965, in trace_module
    argument_names,
  File "/home/lingbao/.local/lib/python3.7/site-packages/torch/nn/modules/module.py", line 1102, in _call_impl
    return forward_call(*input, **kwargs)
  File "/home/lingbao/.local/lib/python3.7/site-packages/torch/nn/modules/module.py", line 1090, in _slow_forward
    result = self.forward(*input, **kwargs)
  File "/home/lingbao/.local/lib/python3.7/site-packages/transformers/models/clip/modeling_clip.py", line 728, in forward
    return_dict=return_dict,
  File "/home/lingbao/.local/lib/python3.7/site-packages/torch/nn/modules/module.py", line 1102, in _call_impl
    return forward_call(*input, **kwargs)
  File "/home/lingbao/.local/lib/python3.7/site-packages/torch/nn/modules/module.py", line 1090, in _slow_forward
    result = self.forward(*input, **kwargs)
  File "/home/lingbao/.local/lib/python3.7/site-packages/transformers/models/clip/modeling_clip.py", line 637, in forward
    causal_attention_mask = self._build_causal_attention_mask(bsz, seq_len).to(hidden_states.device)
TypeError: _build_causal_attention_mask() missing 1 required positional argument: 'dtype'
```

If I add the argument at line 115 like this:

```python
text_encoder, check_inputs[0], check_inputs=[check_inputs[1]], dtype=torch.int32, strict=False
```

then another error appears: `TypeError: trace() got an unexpected keyword argument 'dtype'`. (That is expected: at this position `dtype` is passed to `torch.jit.trace`, which does not accept it, rather than to `_build_causal_attention_mask`.)

This may be a version conflict in transformers, since `_build_causal_attention_mask` gained a `dtype` parameter in later releases. My environment: transformers==4.19.2.
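One version-agnostic workaround could be to dispatch on the method's actual signature instead of hard-coding either calling convention. Below is a minimal sketch of that idea; the `EncoderOld`/`EncoderNew` classes are stand-ins for the two transformers variants, not the real library code, and `build_causal_mask` is a hypothetical helper name:

```python
import inspect

# Stand-ins for two library versions whose
# _build_causal_attention_mask signatures differ.
class EncoderOld:
    def _build_causal_attention_mask(self, bsz, seq_len):
        return ("mask", bsz, seq_len)

class EncoderNew:
    def _build_causal_attention_mask(self, bsz, seq_len, dtype):
        return ("mask", bsz, seq_len, dtype)

def build_causal_mask(encoder, bsz, seq_len, dtype):
    """Call _build_causal_attention_mask with or without dtype,
    depending on which signature the installed version exposes."""
    params = inspect.signature(
        encoder._build_causal_attention_mask
    ).parameters
    if "dtype" in params:
        return encoder._build_causal_attention_mask(bsz, seq_len, dtype)
    return encoder._build_causal_attention_mask(bsz, seq_len)

# Both versions are handled by the same call site.
print(build_causal_mask(EncoderOld(), 1, 77, "float32"))
print(build_causal_mask(EncoderNew(), 1, 77, "float32"))
```

Alternatively, pinning transformers to the release the export script was written against avoids the mismatch entirely, if that release is known.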
