
AttributeError: 'NoneType' object has no attribute 'shape' #35

Open
thistleknot opened this issue Apr 12, 2024 · 7 comments

Comments

@thistleknot

(textgen) [root@pve-m7330 sparsegpt]# python llama.py ../text-generation-webui/models/TinyLlama-1.1B-Chat-v1.0/ wikitext2 --nsamples 10
Token indices sequence length is longer than the specified maximum sequence length for this model (2824491 > 2048). Running this sequence through the model will result in indexing errors
Token indices sequence length is longer than the specified maximum sequence length for this model (2824491 > 2048). Running this sequence through the model will result in indexing errors
Dataset: wikitext2
Evaluating ...
0
Traceback (most recent call last):
  File "/home/user/sparsegpt/llama.py", line 335, in <module>
    llama_eval(model, testloader, DEV, dataset, args.log_wandb)
  File "/home/user/miniconda3/envs/textgen/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/home/user/sparsegpt/llama.py", line 211, in llama_eval
    outs[j] = layer(inps[j].unsqueeze(0), attention_mask=attention_mask)[0]
  File "/home/user/miniconda3/envs/textgen/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1511, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/home/user/miniconda3/envs/textgen/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1520, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/user/miniconda3/envs/textgen/lib/python3.10/site-packages/transformers/models/llama/modeling_llama.py", line 739, in forward
    hidden_states, self_attn_weights, present_key_value = self.self_attn(
  File "/home/user/miniconda3/envs/textgen/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1511, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/home/user/miniconda3/envs/textgen/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1520, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/user/miniconda3/envs/textgen/lib/python3.10/site-packages/transformers/models/llama/modeling_llama.py", line 644, in forward
    cos, sin = self.rotary_emb(value_states, position_ids)
  File "/home/user/miniconda3/envs/textgen/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1511, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/home/user/miniconda3/envs/textgen/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1520, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/user/miniconda3/envs/textgen/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/home/user/miniconda3/envs/textgen/lib/python3.10/site-packages/transformers/models/llama/modeling_llama.py", line 134, in forward
    inv_freq_expanded = self.inv_freq[None, :, None].float().expand(position_ids.shape[0], -1, 1)
AttributeError: 'NoneType' object has no attribute 'shape'
(textgen) [root@pve-m7330 sparsegpt]#

@phind-justin

same

@phind-justin

Wondering if this is a transformers version issue.

@algorithmexplorer

The line
outs[j] = layer(inps[j].unsqueeze(0), attention_mask=attention_mask)[0]
does not pass position encodings. After the change it looks like this:
outs[j] = layer(inps[j].unsqueeze(0), attention_mask=attention_mask, position_ids=cache_position)[0]
@phind-justin @thistleknot
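To make the suggestion above concrete, here is a minimal hedged sketch of constructing explicit position_ids for the per-layer calls in llama_eval. Newer transformers versions no longer compute position_ids inside the rotary embedding when a decoder layer is called directly, so they must be supplied by the caller. The name seqlen stands in for model.seqlen in llama.py; this is an assumption about the surrounding loop, not the repository's exact code.

```python
import torch

# seqlen stands in for model.seqlen (2048 for the LLaMA models in this thread);
# this value is an assumption for illustration.
seqlen = 2048

# Positions 0..seqlen-1 with a leading batch dimension, shape (1, seqlen).
position_ids = torch.arange(seqlen).unsqueeze(0)

# The call in llama_eval would then become (sketch, names from the traceback):
# outs[j] = layer(inps[j].unsqueeze(0),
#                 attention_mask=attention_mask,
#                 position_ids=position_ids)[0]
print(position_ids.shape)  # torch.Size([1, 2048])
```

Moving position_ids to the same device as the layer inputs (e.g. `.to(DEV)`) would also be needed in the actual script.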

@hktk07

hktk07 commented May 21, 2024

I have the same problem. I just use the code:
model = AutoModelForCausalLM.from_pretrained('llama2')
c_loss, m_loss = model(examples, labels, observations)
and the error occurs inside library functions. Only "c_loss, m_loss = model(examples, labels, observations)" is code I wrote myself; all the other frames in the traceback are library functions.

@SHUSHENGQIGUI

wondering if this is a transformers version issue

Hi, you can refer to the code of wanda: https://github.com/locuslab/wanda/blob/main/lib/prune.py and modify the function in llama.py following prune_sparsegpt().
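The relevant pattern in wanda's prune.py is a "Catcher" module that intercepts the first decoder layer's forward call and records position_ids alongside the hidden states and attention mask, so the later per-layer loop can replay them. A minimal self-contained sketch of that idea (the class and cache-key names mirror wanda's style but are reconstructed here, not copied from the repository):

```python
import torch
import torch.nn as nn

class Catcher(nn.Module):
    """Wraps the first decoder layer, records its inputs, then aborts."""
    def __init__(self, module, cache):
        super().__init__()
        self.module = module
        self.cache = cache

    def forward(self, hidden_states, **kwargs):
        self.cache["inp"] = hidden_states
        self.cache["attention_mask"] = kwargs.get("attention_mask")
        self.cache["position_ids"] = kwargs.get("position_ids")
        raise ValueError  # stop the full forward pass once inputs are captured

cache = {}
layer = Catcher(nn.Identity(), cache)  # nn.Identity() stands in for a real decoder layer
try:
    layer(torch.zeros(1, 4, 8), position_ids=torch.arange(4).unsqueeze(0))
except ValueError:
    pass

print(cache["position_ids"])  # tensor([[0, 1, 2, 3]])
```

With the captured cache["position_ids"], each subsequent layer call can pass position_ids=position_ids explicitly, which avoids the NoneType error in the traceback.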


@time-less-ness

time-less-ness commented Jun 17, 2024

I get this same error trying locally.

(venv)$ python llama.py  /data/Llama3-ChatQA-1.5-8B   c4 --sparsity 0.5
...
  File "sparsegpt/venv/lib/python3.11/site-packages/transformers/models/llama/modeling_llama.py", line 110, in forward
    inv_freq_expanded = self.inv_freq[None, :, None].float().expand(position_ids.shape[0], -1, 1)
                                                                    ^^^^^^^^^^^^^^^^^^
AttributeError: 'NoneType' object has no attribute 'shape'
