
AssertionError: _call_model must return a list #760

Closed
brunomcuesta opened this issue Jun 28, 2024 · 5 comments
Labels
bug Something isn't working

Comments

brunomcuesta commented Jun 28, 2024

Hello!
I'm trying to scan an LLM running in Ollama, but garak is unable to deserialize the JSON returned by the API.

Testing the connection to the Ollama endpoint with the llama3 model:

curl -v https://<IP>/api/generate -H 'Content-Type: application/json' -d '{"model": "llama3","prompt": "Hello!","stream": false}'

The API returns the JSON data successfully:

{"model":"llama3","created_at":"2024-06-27T19:33:28.276246646Z","response":"Hello! It's nice to meet you. Is there something I can help you with, or would you like to chat?","done":true,"done_reason":"stop","context":[128006,882,128007,271,9906,0,128009,128006,78191,128007,271,9906,0,1102,596,6555,311,3449,499,13,2209,1070,2555,358,649,1520,499,449,11,477,1053,499,1093,311,6369,30,128009],"total_duration":16211714695,"load_duration":10832429646,"prompt_eval_count":12,"prompt_eval_duration":1136599000,"eval_count":26,"eval_duration":4196455000}

Running a scan with garak on the same endpoint.

config.json file:

{
  "rest.RestGenerator": {
    "name": "llama3",
    "uri": "https://<IP>/api/generate",
    "method": "post",
    "headers": {
      "Content-Type": "application/json"
    },
    "req_template_json_object": {
      "model": "llama3",
      "stream": false,
      "prompt": "$INPUT"
    },
    "response_json": true,
    "response_json_field": "response"
  }
}
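
For context, `response_json_field` names the key that should be pulled out of the endpoint's JSON reply. Conceptually (a minimal illustrative sketch, not garak's actual implementation — `extract_field` is a hypothetical helper), the extraction looks like this:

```python
import json

def extract_field(body: str, field: str = "response") -> list[str]:
    """Parse a JSON reply body and return the configured field,
    wrapped in a list, since garak generators must return a list."""
    data = json.loads(body)
    return [data[field]]

# A non-streaming Ollama-style reply, trimmed for brevity
reply = '{"model": "llama3", "response": "Hello!", "done": true}'
print(extract_field(reply))  # ['Hello!']
```

This works only when the whole body is a single JSON object, which is why the streaming reply below breaks it.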

Command:

garak --model_type rest -G config.json --probes xss

Traceback:

Traceback (most recent call last):
  File "/home/bruno/Researches/LLMAttacks/tools/garak-pip/garak_env/bin/garak", line 8, in <module>
    sys.exit(main())
             ^^^^^^
  File "/home/bruno/Researches/LLMAttacks/tools/garak-pip/garak_env/lib/python3.11/site-packages/garak/__main__.py", line 9, in main
    cli.main(sys.argv[1:])
  File "/home/bruno/Researches/LLMAttacks/tools/garak-pip/garak_env/lib/python3.11/site-packages/garak/cli.py", line 486, in main
    command.probewise_run(generator, probe_names, evaluator, buff_names)
  File "/home/bruno/Researches/LLMAttacks/tools/garak-pip/garak_env/lib/python3.11/site-packages/garak/command.py", line 212, in probewise_run
    probewise_h.run(generator, probe_names, evaluator, buffs)
  File "/home/bruno/Researches/LLMAttacks/tools/garak-pip/garak_env/lib/python3.11/site-packages/garak/harnesses/probewise.py", line 106, in run
    h.run(model, [probe], detectors, evaluator, announce_probe=False)
  File "/home/bruno/Researches/LLMAttacks/tools/garak-pip/garak_env/lib/python3.11/site-packages/garak/harnesses/base.py", line 93, in run
    attempt_results = probe.probe(model)
                      ^^^^^^^^^^^^^^^^^^
  File "/home/bruno/Researches/LLMAttacks/tools/garak-pip/garak_env/lib/python3.11/site-packages/garak/probes/base.py", line 204, in probe
    attempts_completed = self._execute_all(attempts_todo)
                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/bruno/Researches/LLMAttacks/tools/garak-pip/garak_env/lib/python3.11/site-packages/garak/probes/base.py", line 182, in _execute_all
    result = self._execute_attempt(this_attempt)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/bruno/Researches/LLMAttacks/tools/garak-pip/garak_env/lib/python3.11/site-packages/garak/probes/base.py", line 145, in _execute_attempt
    this_attempt.outputs = self.generator.generate(this_attempt.prompt)
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/bruno/Researches/LLMAttacks/tools/garak-pip/garak_env/lib/python3.11/site-packages/garak/generators/base.py", line 130, in generate
    assert isinstance(
           ^^^^^^^^^^^
AssertionError: _call_model must return a list

I tried with stream set to true, but that didn't work either.

According to the log file, garak is unable to deserialize the JSON. Via curl the JSON looks normal, but through garak the endpoint appears to return the body as a byte string containing multiple newline-delimited JSON objects.

2024-06-27 15:49:36,977  WARNING  REST endpoint didn't return good JSON Extra data: line 2 column 1 (char 92): got |b'{"model":"llama3","created_at":"2024-06-27T18:49:41.94885093Z","respon
se":"I","done":false}\n{"model":"llama3","created_at":"2024-06-27T18:49:42.299192867Z","response":" cannot","done":false}\n{"model":"llama3","created_at":"2024-06-27T18:49:42.450789532Z","
response":" provide","done":false}\n{"model":"llama3","created_at":"2024-06-27T18:49:42.589473052Z","response":" information","done":false}\n{"model":"llama3","created_at":"2024-06-27T18:4
9:42.779138117Z","response":" that","done":false}\n{"model":"llama3","created_at":"2024-06-27T18:49:42.968582691Z","response":" could","done":false}\n{"model":"llama3","created_at":"2024-0
6-27T18:49:43.150194757Z","response":" be","done":false}\n{"model":"llama3","created_at":"2024-06-27T18:49:43.302253291Z","response":" used","done":false}\n{"model":"llama3","created_at":"
2024-06-27T18:49:43.47957356Z","response":" to","done":false}\n{"model":"llama3","created_at":"2024-06-27T18:49:43.648461074Z","response":" gain","done":false}\n{"model":"llama3","created_
at":"2024-06-27T18:49:43.789658523Z","response":" unauthorized","done":false}\n{"model":"llama3","created_at":"2024-06-27T18:49:43.93333191Z","response":" access","done":false}\n{"model":"
llama3","created_at":"2024-06-27T18:49:44.153752295Z","response":" to","done":false}\n{"model":"llama3","created_at":"2024-06-27T18:49:44.350067787Z","response":" a","done":false}\n{"model
":"llama3","created_at":"2024-06-27T18:49:44.527463717Z","response":" system","done":false}\n{"model":"llama3","created_at":"2024-06-27T18:49:44.703176113Z","response":".","done":false}\n{
"model":"llama3","created_at":"2024-06-27T18:49:44.888343457Z","response":" Is","done":false}\n{"model":"llama3","created_at":"2024-06-27T18:49:45.048142075Z","response":" there","done":fa
lse}\n{"model":"llama3","created_at":"2024-06-27T18:49:45.19069661Z","response":" something","done":false}\n{"model":"llama3","created_at":"2024-06-27T18:49:45.335927355Z","response":" els
e","done":false}\n{"model":"llama3","created_at":"2024-06-27T18:49:45.479472193Z","response":" I","done":false}\n{"model":"llama3","created_at":"2024-06-27T18:49:45.642975828Z","response":
" can","done":false}\n{"model":"llama3","created_at":"2024-06-27T18:49:45.795531626Z","response":" help","done":false}\n{"model":"llama3","created_at":"2024-06-27T18:49:45.946282496Z","res
ponse":" you","done":false}\n{"model":"llama3","created_at":"2024-06-27T18:49:46.211920386Z","response":" with","done":false}\n{"model":"llama3","created_at":"2024-06-27T18:49:46.384800101
Z","response":"?","done":false}\n{"model":"llama3","created_at":"2024-06-27T18:49:46.520303102Z","response":"","done":true,"done_reason":"stop","context":[128006,882,128007,1432,262,510,91
25,9725,2,2196,8,1084,58,9125,9725,2,78191,340,12,8377,364,15836,54911,26399,3238,3131,198,12,5112,11,1212,264,502,1584,323,1194,330,1061,506,85846,367,304,5208,1131,3343,720,12487,3350,27
9,2768,1495,3094,555,3094,11,323,8911,1855,961,449,912,12908,304,1990,25,720,7,16,8,758,720,7,17,8,510,720,7,18,8,24748,198,7,19,8,2331,720,7,20,8,320,720,7,21,8,3788,1129,86,79571,5181,29
647,3592,44882,28,720,7,22,8,1243,8911,264,10015,220,1049,3752,12399,315,279,2199,323,1057,10652,304,2385,1227,21136,1495,13,720,2746,1070,527,904,34816,477,24511,389,279,2199,8911,1124,11
01,13,720,7,23,8,883,128009,128006,78191,128007,271,40,4250,3493,2038,430,1436,387,1511,311,8895,45571,2680,311,264,1887,13,2209,1070,2555,775,358,649,1520,499,449,30,128009],"total_durati
on":19419994205,"load_duration":3591167367,"prompt_eval_count":153,"prompt_eval_duration":11255389000,"eval_count":27,"eval_duration":4571347000}\n'|
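
The `got |b'...'|` payload in that warning is newline-delimited JSON: with streaming enabled, Ollama sends one small object per token, so a single `json.loads` over the whole body fails with `Extra data`. A minimal sketch of how such a stream can be reassembled (an illustration of the failure mode, not garak's actual fix — `join_ndjson_stream` is a hypothetical helper):

```python
import json

def join_ndjson_stream(raw: bytes) -> str:
    """Decode a newline-delimited JSON stream and concatenate the
    per-token 'response' fragments into the full completion text."""
    parts = []
    for line in raw.decode("utf-8").splitlines():
        if not line.strip():
            continue
        obj = json.loads(line)  # each line is its own JSON object
        parts.append(obj.get("response", ""))
        if obj.get("done"):
            break
    return "".join(parts)

# First few fragments from the log above, trimmed for brevity
stream = (b'{"response":"I","done":false}\n'
          b'{"response":" cannot","done":false}\n'
          b'{"response":"","done":true}\n')
print(join_ndjson_stream(stream))  # I cannot
```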

Would anyone know how to resolve it?
Thanks!

@leondz leondz added the bug Something isn't working label Jun 28, 2024
leondz (Owner) commented Jul 5, 2024

Hello! Thank you for this. Can you say which version of garak you're running? Is it a pip install? garak --version should give the info.

brunomcuesta (Author) commented

Hello!
The version is v0.9.0.13.post1.
Thanks!

leondz (Owner) commented Jul 5, 2024

OK, cool, then there might be two bits of good news - it looks like a known bug, and the fix for it is already in dev.

You can switch to that branch with python -m pip install -U git+https://github.com/leondz/garak.git@main

Can you try that, and then re-run the command?

brunomcuesta (Author) commented

Hello!
I installed the latest version with the fix.
I managed to run garak. It finished the scan and generated the report. Apparently, the fix resolved the problem.
Thanks!

leondz (Owner) commented Jul 6, 2024

Fantastic. Thanks for letting us know!

@leondz leondz closed this as completed Jul 6, 2024