# Deploy YOLO-World

We provide several ways to deploy YOLO-World with ONNX or TensorRT.

## Preliminaries

```bash
pip install supervision onnx onnxruntime onnxsim
```

## Export ONNX from the Gradio Demo

Start `demo.py`; in the demo you can modify the text prompts and export the ONNX model.

```bash
python demo.py path/to/config path/to/weights
```

## Export YOLO-World to ONNX models

You can also use `export_onnx.py` to obtain the ONNX model. You may specify `--custom-text` with your own text JSON for custom prompts; the format of the text JSON is described in `docs/data`.

```bash
PYTHONPATH=./ python deploy/export_onnx.py path/to/config path/to/weights --custom-text path/to/customtexts --opset 11
```

## Export YOLO-World to TensorRT models

Coming soon.

## FAQ

**Q1.** `RuntimeError: Exporting the operator einsum to ONNX opset version 11 is not supported. Support for this operator was added in version 12, try exporting with this version.`

**A:** This error arises because YOLO-World uses `einsum` for matrix multiplication, which is not supported by opset 11. You can raise `--opset` from 11 to 12 if your device supports it, or replace the `einsum` with an equivalent permute/reshape/multiplication by setting `use_einsum=False` in `MaxSigmoidCSPLayerWithTwoConv` and `YOLOWorldHeadModule`. You can refer to the sample config without einsum.
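The exact einsum pattern depends on the layer, but the idea behind the `use_einsum=False` path can be sketched with NumPy, assuming a typical `bchw,bkc->bkhw` contraction between image features and text embeddings (the shapes below are illustrative, not YOLO-World's actual dimensions):

```python
import numpy as np

# Illustrative shapes: batch, channels, num text prompts, height, width.
b, c, k, h, w = 2, 8, 4, 5, 5
feats = np.random.rand(b, c, h, w).astype(np.float32)   # image features
guide = np.random.rand(b, k, c).astype(np.float32)      # text embeddings

# einsum version (requires opset >= 12 when exported to ONNX)
out_einsum = np.einsum("bchw,bkc->bkhw", feats, guide)

# Equivalent reshape/matmul version (exportable with opset 11):
# flatten spatial dims, contract over channels, then restore h and w.
feats_flat = feats.reshape(b, c, h * w)      # (b, c, h*w)
out = np.matmul(guide, feats_flat)           # (b, k, c) @ (b, c, h*w) -> (b, k, h*w)
out = out.reshape(b, k, h, w)                # (b, k, h, w)

assert np.allclose(out_einsum, out, atol=1e-5)
```

Both versions compute the same contraction; the second only uses `Reshape` and `MatMul`, which opset 11 supports.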