Fail to transfer .pt into .onnx #401

Open
ziqi-jin opened this issue Jul 22, 2022 · 0 comments

If you cannot generate the .onnx file with models/export.py, maybe you could try my method.

1. Use the Docker container supplied by the author in the README.md.

2. Modify models/export.py in two steps:

  • Step 1: change model.model[-1].export = True to model.model[-1].export = False.
  • Step 2: add the following code:
    for k, m in model.named_modules():
        m._non_persistent_buffers_set = set()  # pytorch 1.6.0 compatibility
        if isinstance(m, models.common.Conv) and isinstance(m.act, models.common.Mish):
            m.act = Mish()  # replace the repo's Mish with an ONNX-exportable Mish
        if isinstance(m, (models.common.BottleneckCSP, models.common.BottleneckCSP2,
                          models.common.SPPCSP)):
            if isinstance(m.bn, nn.SyncBatchNorm):
                # convert SyncBatchNorm back to a plain BatchNorm2d for export
                bn = nn.BatchNorm2d(m.bn.num_features, eps=m.bn.eps, momentum=m.bn.momentum)
                bn.training = False
                bn._buffers = m.bn._buffers
                bn._non_persistent_buffers_set = set()
                m.bn = bn
            if isinstance(m.act, models.common.Mish):
                m.act = Mish()  # replace the activation here as well

Insert this block just above the line model.model[-1].export = False (a fuller sketch of the patched region is shown after this list).

  • You may also need to import some additional packages, such as torch.nn.

Enjoy it!
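
For reference, here is a minimal sketch of how the patched region of models/export.py might end up looking. The Mish class, the names img and f, and the torch.onnx.export arguments are illustrative assumptions based on a typical yolov5-style export script, not part of the original instructions; only the module-rewriting loop and its placement come from the steps above.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    import models  # repo package providing models.common.Conv, Mish, BottleneckCSP, ...

    class Mish(nn.Module):
        # Plain PyTorch Mish used as the ONNX-exportable replacement activation (assumed definition).
        def forward(self, x):
            return x * torch.tanh(F.softplus(x))

    # ... export.py loads the checkpoint into `model` and builds a dummy input `img` above this point ...

    # Step 2: the module-rewriting loop shown above goes here, i.e. immediately
    # before the export flag is set.
    for k, m in model.named_modules():
        ...  # (see the loop in Step 2)

    # Step 1: keep the detection head in its regular (non-export) forward path.
    model.model[-1].export = False

    # export.py then traces and saves the model, roughly:
    f = 'exported_model.onnx'  # output path (any filename works)
    torch.onnx.export(model, img, f, verbose=False, opset_version=12,
                      input_names=['images'])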