
Yet another export yolov5 models to ONNX and inference with TensorRT #1597

Closed
linghu8812 opened this issue Dec 3, 2020 · 27 comments
Labels
enhancement New feature or request Stale

Comments

@linghu8812

linghu8812 commented Dec 3, 2020

Hello everyone, here is a repo that can convert yolov5 models to ONNX and run inference with TensorRT. The code is here: https://github.com/linghu8812/tensorrt_inference/tree/master/project/yolov5. It supports all yolov5 models, including yolov5s, yolov5m, yolov5l and yolov5x. The onnxsim module is used to simplify the model structure. Before simplification, the yolov5 ONNX structure looked like this:

[screenshot: ONNX graph before simplification]

After simplification, the ONNX model becomes:

[screenshot: ONNX graph after simplification]

Some extra nodes have been removed.

In addition, TensorRT inference code has also been supplied; the inference result is shown below:

[screenshot: TensorRT inference result]

@linghu8812 linghu8812 added the enhancement New feature or request label Dec 3, 2020
@github-actions
Contributor

github-actions bot commented Dec 3, 2020

Hello @linghu8812, thank you for your interest in 🚀 YOLOv5! Please visit our ⭐️ Tutorials to get started, where you can find quickstart guides for simple tasks like Custom Data Training all the way to advanced concepts like Hyperparameter Evolution.

If this is a 🐛 Bug Report, please provide screenshots and minimum viable code to reproduce your issue, otherwise we cannot help you.

If this is a custom training ❓ Question, please provide as much information as possible, including dataset images, training logs, screenshots, and a public link to online W&B logging if available.

For business inquiries or professional support requests please visit https://www.ultralytics.com or email Glenn Jocher at glenn.jocher@ultralytics.com.

Requirements

Python 3.8 or later with all requirements.txt dependencies installed, including torch>=1.7. To install run:

$ pip install -r requirements.txt

Environments

YOLOv5 may be run in any of the following up-to-date verified environments (with all dependencies including CUDA/CUDNN, Python and PyTorch preinstalled):

Status

CI CPU testing

If this badge is green, all YOLOv5 GitHub Actions Continuous Integration (CI) tests are currently passing. CI tests verify correct operation of YOLOv5 training (train.py), testing (test.py), inference (detect.py) and export (export.py) on MacOS, Windows, and Ubuntu every 24 hours and on every commit.

@glenn-jocher
Member

@linghu8812 very nice! Did you have to configure onnxsim especially to achieve those simplifications or did it do them on its own?

@linghu8812
Author

@glenn-jocher hello,

first of all, onnx-simplifier needs to be installed with pip install onnx-simplifier;
then the simplification code is:

    # ONNX export
    try:
        import onnx
        from onnxsim import simplify

        print('\nStarting ONNX export with onnx %s...' % onnx.__version__)
        f = opt.weights.replace('.pt', '.onnx')  # filename
        torch.onnx.export(model, img, f, verbose=False, opset_version=12, input_names=['images'],
                          output_names=['output'] if y is None else ['output'])

        # Checks
        onnx_model = onnx.load(f)  # load onnx model
        model_simp, check = simplify(onnx_model)
        assert check, "Simplified ONNX model could not be validated"
        onnx.save(model_simp, f)
        # print(onnx.helper.printable_graph(onnx_model.graph))  # print a human readable model
        print('ONNX export success, saved as %s' % f)
    except Exception as e:
        print('ONNX export failure: %s' % e)

@al03

al03 commented Dec 22, 2020

> first of all, onnx-simplifier needs to be installed with pip install onnx-simplifier; then run the simplification code (quoted from the comment above).

I tried this export script, but I don't get the simplified layers you showed.

My output layer info is:

[screenshot: output layer info]

@glenn-jocher
Member

@al03 did the model simplify at all? I'm interested to see if it works, I'll try myself.

@glenn-jocher
Member

glenn-jocher commented Dec 22, 2020

@al03 @linghu8812 I get an error on import onnxsim, so I'm not able to evaluate it. I used pip install onnx-simplifier with Python 3.8.0: https://pypi.org/project/onnx-simplifier/

[screenshot: onnxsim import error]

Will raise an issue on the onnxsim repo. EDIT: issue raised daquexian/onnx-simplifier#109

@linghu8812
Author

https://github.com/linghu8812/yolov5/blob/bc2874fe025430e6f710b7e26054646f88c4e86e/models/yolo.py#L43-L63

@al03 the yolo.py file has been changed slightly; it makes it easier for C++ code to decode boxes from the output tensors.

@austingg

austingg commented Jan 5, 2021

@glenn-jocher you may need to install the onnxruntime library first.

@glenn-jocher
Member

glenn-jocher commented Jan 5, 2021

@austingg yes it seems so according to daquexian/onnx-simplifier#109 (comment). I was interested in using onnxsim as part of the default code, but I looked at the install instructions (https://www.onnxruntime.ai/docs/get-started/install.html) and the prerequisites and OS-specific instructions appear too burdensome to include as part of the default repo. I think it would cause users more confusion/problems than it would solve.

If this is a common use case though (and it seems it may be) it might make sense to place these instructions within a Tutorial that we could add to https://docs.ultralytics.com/yolov5. That way expert users could still benefit.

@daquexian
Contributor

daquexian commented Feb 1, 2021

> I was interested in using onnxsim as part of the default code, but the prerequisites and OS-specific instructions appear too burdensome to include as part of the default repo.

@glenn-jocher I have updated onnx-simplifier to v0.2.26 so that it depends on onnxruntime-noopenmp instead of onnxruntime, per microsoft/onnxruntime#6511. I believe the instructions in https://www.onnxruntime.ai/docs/get-started/install.html are not needed if we don't depend on OpenMP, and onnx-simplifier will work like a charm without any additional instructions. Could you please give onnx-simplifier a try? :D

@glenn-jocher
Member

@daquexian oh really? I actually gave up on the process before after seeing the complicated dependency requirements. So what exactly are the pip installs required now to use onnx-simplifier? It would be nice to integrate it into export.py if we can get simple dependencies and the installs all pass the CI checks on the 3 main OS's.

@glenn-jocher
Member

@daquexian the current CI checks do an ONNX export BTW here:
https://github.com/ultralytics/yolov5/runs/1803124986?check_suite_focus=true
[screenshot: CI export log]

The export tests are defined here:

python models/export.py --img 128 --batch 1 --weights weights/${{ matrix.model }}.pt # export

Failures in an export won't fail the CI since they are in try/except clauses, but it provides nice real-time insight into whether they are working, since the tests run every day.

@daquexian
Contributor

So what exactly are the pip installs required now to use onnx-simplifier?

Just update https://github.com/ultralytics/yolov5/blob/master/.github/workflows/ci-testing.yml#L51 with pip install -q onnx onnx-simplifier>=0.2.26 and update models/export.py like #1597 (comment). No extra instructions are needed :D

@glenn-jocher
Member

@daquexian I was able to install in Colab, but not locally on macos for some reason. Python 3.9 appears incompatible, so I tried with a 3.8.0 environment, but got this. Do you know what the issue might be?

[screenshot: install error on macOS]

@daquexian
Contributor

daquexian commented Feb 2, 2021

@glenn-jocher Oh, it's my fault... onnxruntime-noopenmp doesn't have a macOS version. That's strange. I'll update this issue when onnxruntime has better macOS support.

@al03

al03 commented Feb 3, 2021

> @al03 did the model simplify at all? I'm interested to see if it works, I'll try myself.

It did work after converting to another model format, but I needed to implement the detection layer (grid, sigmoid, NMS) myself.
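A minimal numpy sketch of that decode step (sigmoid, grid offset, anchor scaling), mirroring the math shown later in this thread's Detect.forward() excerpt; the function name, array shapes, and sample numbers here are illustrative only:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def decode(raw, grid, anchor_wh, stride):
    """Decode raw head outputs (N, 85) for one scale into pixel-space boxes."""
    y = sigmoid(raw)
    y[:, 0:2] = (y[:, 0:2] * 2.0 - 0.5 + grid) * stride   # xy, as in yolo.py
    y[:, 2:4] = (y[:, 2:4] * 2.0) ** 2 * anchor_wh        # wh, as in yolo.py
    return y

# one dummy prediction at grid cell (0, 0), stride 8, anchor (10, 13):
raw = np.zeros((1, 85))
out = decode(raw, np.array([[0.0, 0.0]]), np.array([[10.0, 13.0]]), 8)
# sigmoid(0) = 0.5, so xy = (0.5 + 0) * 8 = (4, 4) and wh = 1.0 * anchor = (10, 13)
```

Confidence thresholding and NMS would still have to follow this step, as noted above.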

@linghu8812
Author

linghu8812 commented Feb 8, 2021

linghu8812/tensorrt_inference#42

YOLOv5 4.0 models, including yolov5s6.pt, are now supported for TensorRT inference!

For yolov5s6, just run:

./yolov5_trt ../config6.yaml ../samples

@github-actions
Contributor

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

@PiyalGeorge

PiyalGeorge commented Apr 21, 2021

@linghu8812, I converted the yolov5s model to ONNX following this.
Then I simplified the ONNX model using this. I tried this with yolov5s release 3.1.

I would like to know the output nodes before the 5D Reshape. How can I get these output nodes? Kindly help me find them.

@glenn-jocher
Member

@PiyalGeorge I'm not sure exactly what you mean by the output nodes before 5D reshape, though the 5D reshape is in the Detect layer, so what you are looking for is probably there.

yolov5/models/yolo.py

Lines 24 to 58 in 5f7d39f

class Detect(nn.Module):
    stride = None  # strides computed during build
    export = False  # onnx export

    def __init__(self, nc=80, anchors=(), ch=()):  # detection layer
        super(Detect, self).__init__()
        self.nc = nc  # number of classes
        self.no = nc + 5  # number of outputs per anchor
        self.nl = len(anchors)  # number of detection layers
        self.na = len(anchors[0]) // 2  # number of anchors
        self.grid = [torch.zeros(1)] * self.nl  # init grid
        a = torch.tensor(anchors).float().view(self.nl, -1, 2)
        self.register_buffer('anchors', a)  # shape(nl,na,2)
        self.register_buffer('anchor_grid', a.clone().view(self.nl, 1, -1, 1, 1, 2))  # shape(nl,1,na,1,1,2)
        self.m = nn.ModuleList(nn.Conv2d(x, self.no * self.na, 1) for x in ch)  # output conv

    def forward(self, x):
        # x = x.copy()  # for profiling
        z = []  # inference output
        self.training |= self.export
        for i in range(self.nl):
            x[i] = self.m[i](x[i])  # conv
            bs, _, ny, nx = x[i].shape  # x(bs,255,20,20) to x(bs,3,20,20,85)
            x[i] = x[i].view(bs, self.na, self.no, ny, nx).permute(0, 1, 3, 4, 2).contiguous()

            if not self.training:  # inference
                if self.grid[i].shape[2:4] != x[i].shape[2:4]:
                    self.grid[i] = self._make_grid(nx, ny).to(x[i].device)

                y = x[i].sigmoid()
                y[..., 0:2] = (y[..., 0:2] * 2. - 0.5 + self.grid[i]) * self.stride[i]  # xy
                y[..., 2:4] = (y[..., 2:4] * 2) ** 2 * self.anchor_grid[i]  # wh
                z.append(y.view(bs, -1, self.no))

        return x if self.training else (torch.cat(z, 1), x)
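As an illustration of that 5D reshape, here is a minimal numpy sketch of the shape change in the view/permute line (shapes only; not the actual model code):

```python
import numpy as np

bs, na, no, ny, nx = 1, 3, 85, 20, 20
x = np.zeros((bs, na * no, ny, nx))          # raw conv output: (1, 255, 20, 20)
# (bs, na*no, ny, nx) -> (bs, na, no, ny, nx) -> (bs, na, ny, nx, no)
x5 = x.reshape(bs, na, no, ny, nx).transpose(0, 1, 3, 4, 2)
print(x5.shape)                              # (1, 3, 20, 20, 85)
```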

On a side note, onnx-simplifier is now integrated with YOLOv5 export via PR #2815, you can access it like this in the latest code:

python export.py --simplify

@bertinma

@glenn-jocher can the Detect layer still not be exported to ONNX?

@glenn-jocher
Member

@bertinma yes, the Detect() layer exports to ONNX with export.py.

@ithmz

ithmz commented Apr 23, 2021

Hi, may I ask how you get the last output layer (1, 25200, 85)?
Thanks

@glenn-jocher
Member

glenn-jocher commented Apr 23, 2021

@tsangz189 YOLOv5 output grids are flattened and concatenated to form a single output.

python models/export.py --grid --simplify

[screenshot: concatenated ONNX output]
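A quick way to see where the 25200 comes from for a 640x640 input (a numpy sketch; the numbers assume the default three strides and three anchors per grid cell):

```python
import numpy as np

img_size, strides, na, no = 640, (8, 16, 32), 3, 85
outputs = []
for s in strides:
    ny = nx = img_size // s                  # grid sizes: 80, 40, 20
    head = np.zeros((1, na, ny, nx, no))     # dummy head output at this scale
    outputs.append(head.reshape(1, -1, no))  # flatten grid cells and anchors
pred = np.concatenate(outputs, axis=1)
print(pred.shape)                            # (1, 25200, 85)
# 3 * (80*80 + 40*40 + 20*20) = 25200 predictions
```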

@ithmz

ithmz commented Apr 26, 2021

@glenn-jocher
Hi Jocher, thanks for your reply, but when I run the script
python models/export.py --simplify
the outputs are not concatenated to form a single output

@glenn-jocher
Member

@tsangz189 --grid forms the single output:

python models/export.py --grid --simplify
