
How to Use ONNX

1. pip install

pip install onnx

pip install onnx_tf

pip install onnxruntime

Pay attention to the versions of onnx and tensorflow: the latest onnx_tf requires tensorflow >= 2.2.0.
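
If the install succeeded, a quick sanity check is to print the versions that were actually resolved (a minimal sketch using only the packages' standard __version__ attributes):

import onnx
import onnxruntime
import tensorflow as tf

# onnx_tf compatibility is tied to the tensorflow version, so confirm
# tensorflow is >= 2.2.0 if you use a recent onnx_tf release.
print('onnx       :', onnx.__version__)
print('onnxruntime:', onnxruntime.__version__)
print('tensorflow :', tf.__version__)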

2. PyTorch to ONNX

import torch

# model: your torch.nn.Module with weights loaded from the .pth file
# dummy_input: an example tensor with the model's expected input shape
torch.onnx.export(model=model,
                  args=dummy_input,
                  f='XXX.onnx',
                  input_names=['input'],
                  output_names=['output'])
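
For a concrete end-to-end sketch, the example below exports torchvision's resnet18; the network, the 'model.onnx' file name and the (1, 3, 224, 224) input shape are assumptions standing in for your own .pth checkpoint:

import torch
import torchvision
import onnx

# Assumption: resnet18 stands in for your own network; in practice you
# would build your model class and load the .pth weights instead.
model = torchvision.models.resnet18()
model.eval()

# Assumption: a single 3x224x224 image as input.
dummy_input = torch.randn(1, 3, 224, 224)

torch.onnx.export(model=model,
                  args=dummy_input,
                  f='model.onnx',
                  input_names=['input'],
                  output_names=['output'])

# Optional: verify that the exported graph is well formed.
onnx.checker.check_model(onnx.load('model.onnx'))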

3. ONNX to TensorFlow

import onnx
from onnx_tf.backend import prepare

model = onnx.load(file_path)      # file_path: the .onnx file from step 2
tf_export = prepare(model)        # wrap the ONNX graph in a TensorFlow backend rep
tf_export.export_graph('XXX.pb')  # write the TensorFlow graph to disk
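
Before exporting, the prepared backend can also be run directly as a sanity check on the conversion. A minimal sketch, assuming a single-input 'model.onnx' that takes a (1, 3, 224, 224) float32 tensor:

import numpy as np
import onnx
from onnx_tf.backend import prepare

# Assumptions: 'model.onnx' comes from step 2 and accepts one
# 1x3x224x224 float32 input.
tf_export = prepare(onnx.load('model.onnx'))

dummy = np.random.randn(1, 3, 224, 224).astype(np.float32)
print(tf_export.run(dummy))        # run the converted graph once

tf_export.export_graph('model.pb')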

4. Use onnxruntime to infer

import onnxruntime

sess = onnxruntime.InferenceSession('XXX.onnx')
inputs_name = sess.get_inputs()[0].name    # name of the first graph input
outputs_name = sess.get_outputs()[0].name  # name of the first graph output

# input_array: a numpy array matching the model's input shape and dtype
results = sess.run([outputs_name], {inputs_name: input_array})
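
Put together, a minimal runnable sketch (the 'model.onnx' file name and the (1, 3, 224, 224) float32 input are assumptions):

import numpy as np
import onnxruntime

# providers is required on builds that ship more than one execution
# provider (e.g. the GPU package); CPUExecutionProvider is always available.
sess = onnxruntime.InferenceSession('model.onnx',
                                    providers=['CPUExecutionProvider'])

inputs_name = sess.get_inputs()[0].name
outputs_name = sess.get_outputs()[0].name

input_array = np.random.randn(1, 3, 224, 224).astype(np.float32)
results = sess.run([outputs_name], {inputs_name: input_array})
print(results[0].shape)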

About

Use an ONNX inference architecture in intelligent construction (.pth to .onnx to .pb).