
Mistral Support #26

Open
fakerybakery opened this issue Dec 31, 2023 · 2 comments

fakerybakery commented Dec 31, 2023
Hi,
Does this project support Mistral?
Thanks!


pprp commented Jan 13, 2024

Not at the moment. I got this error:

Traceback (most recent call last):
  File "main.py", line 118, in <module>
    main()
  File "main.py", line 90, in main
    prune_sparsegpt(args, model, tokenizer, device, prune_n=prune_n, prune_m=prune_m, layer_no=idx)
  File "/home/user/miniconda3/envs/py38/lib/python3.8/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/data2/user/workspace/PRUNE/GBLM-Pruner/lib/prune.py", line 513, in prune_sparsegpt
    gpts[name].fasterprune(args.sparsity_ratio, prune_n=prune_n, prune_m=prune_m, percdamp=0.01, blocksize=128)
  File "/data2/user/workspace/PRUNE/GBLM-Pruner/lib/sparsegpt.py", line 66, in fasterprune
    H = torch.linalg.cholesky(H)
torch._C._LinAlgError: linalg.cholesky: The factorization could not be completed because the input is not positive-definite (the leading minor of order 1 is not positive-definite).

bhavyashahh commented:
Try increasing the dampening factor (percdamp) to avoid the non-positive-definite Hessian issue.
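To illustrate the suggestion: SparseGPT-style pruners add percdamp times the mean of the Hessian diagonal to that diagonal before factorizing (the percdamp=0.01 seen in the traceback's fasterprune call). A minimal NumPy sketch, with a hypothetical dampen helper standing in for the dampening step inside sparsegpt.py, showing how this makes an otherwise singular Hessian estimate factorizable:

```python
import numpy as np

def dampen(H, percdamp=0.01):
    # Hypothetical helper: add percdamp * mean(diag(H)) to the diagonal,
    # as SparseGPT-style pruners do before the Cholesky factorization.
    damp = percdamp * np.mean(np.diag(H))
    H = H.copy()
    H[np.diag_indices_from(H)] += damp
    return H

# A rank-deficient Hessian estimate: Cholesky fails on it directly,
# mirroring the torch._C._LinAlgError above.
H = np.array([[1.0, 2.0],
              [2.0, 4.0]])  # singular, hence not positive-definite
try:
    np.linalg.cholesky(H)
except np.linalg.LinAlgError:
    print("Cholesky failed on the raw Hessian")

# With a larger dampening factor the factorization goes through.
np.linalg.cholesky(dampen(H, percdamp=0.1))
print("Cholesky succeeded after dampening")
```

The same idea applies to the real code path: raising percdamp in the fasterprune call shifts the Hessian's eigenvalues upward, at the cost of a slightly less faithful curvature estimate.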
