[TensorRT] Caching to a dedicated ONNX file does not work #21307

Closed
gedoensmax opened this issue Jul 10, 2024 · 2 comments

Labels
ep:TensorRT issues related to TensorRT execution provider

Comments

@gedoensmax
Contributor
Describe the issue

The command below produces an embedded TRT engine at ./test/model_ctx.onnx:

.\onnxruntime_perf_test.exe -I -e tensorrt -r 10 -i "trt_timing_cache_enable|1 trt_engine_cache_enable|1 trt_dump_ep_context_model|1 trt_ep_context_file_path|ltest" model.onnx

With the following command, on the other hand, I do not get an embedded ONNX at test_ctx.onnx, which is what I would expect:

.\onnxruntime_perf_test.exe -I -e tensorrt -r 10 -i "trt_timing_cache_enable|1 trt_engine_cache_enable|1 trt_dump_ep_context_model|1 trt_ep_context_file_path|ltest_ctx.onnx" model.onnx

@chilo-ms Have you noticed this before?

To reproduce

Run onnxruntime_perf_test with the command lines above (a rough C++ API equivalent is sketched below).
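For reference, a minimal sketch of setting the same TensorRT EP provider options through the C++ API instead of onnxruntime_perf_test; the model path and the output file name are placeholders mirroring the commands above, not the exact repro setup:

#include <vector>
#include <onnxruntime_cxx_api.h>

int main() {
  Ort::Env env{ORT_LOGGING_LEVEL_VERBOSE, "trt_ep_context_repro"};
  Ort::SessionOptions session_options;

  // Same provider options as on the perf_test command line.
  const OrtApi& api = Ort::GetApi();
  OrtTensorRTProviderOptionsV2* trt_options = nullptr;
  Ort::ThrowOnError(api.CreateTensorRTProviderOptions(&trt_options));

  std::vector<const char*> keys{"trt_timing_cache_enable", "trt_engine_cache_enable",
                                "trt_dump_ep_context_model", "trt_ep_context_file_path"};
  // "test_ctx.onnx" is a placeholder; the issue is about how this path is handled.
  std::vector<const char*> values{"1", "1", "1", "test_ctx.onnx"};
  Ort::ThrowOnError(api.UpdateTensorRTProviderOptions(trt_options, keys.data(), values.data(), keys.size()));

  session_options.AppendExecutionProvider_TensorRT_V2(*trt_options);

  // Creating the session builds the TRT engine and should dump the EP context model.
  Ort::Session session{env, L"model.onnx", session_options};

  api.ReleaseTensorRTProviderOptions(trt_options);
  return 0;
}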

Urgency

No response

Platform

Windows

OS Version

11

ONNX Runtime Installation

Built from Source

ONNX Runtime Version or Commit ID

1.18.1

ONNX Runtime API

C++

Architecture

X64

Execution Provider

TensorRT

Execution Provider Library Version

10.2

The github-actions bot added the ep:TensorRT (issues related to TensorRT execution provider) and platform:windows (issues related to the Windows platform) labels on Jul 10, 2024
@chilo-ms
Contributor
That's strange, I can't repro.

It seems there is a typo in the command line argument, ltest_ctx.onnx -> test_ctx.onnx.

Are you checking the folder where onnxruntime_perf_test.exe is located to find the embedded ONNX? I assume the disk isn't full.
How about turning on verbose logging to check? Or we might need to use a debugger to step through this line:

...
2024-07-10 12:25:56.0580589 [V:onnxruntime:Default, onnx_ctx_model_helper.cc:213 onnxruntime::DumpCtxModel] [TensorRT EP] Dumped ltest_ctx.onnx
...

@gedoensmax
Contributor Author
OK, I see my path definitions were slightly different. If I provide an absolute path for the context node file it does not work, but it also does not error out or give me a warning; I searched the full verbose log.
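For illustration, a variant of the command above with an absolute path (the C:\cache prefix is only a hypothetical example) is the case that silently produces no embedded ONNX:

.\onnxruntime_perf_test.exe -I -e tensorrt -r 10 -i "trt_timing_cache_enable|1 trt_engine_cache_enable|1 trt_dump_ep_context_model|1 trt_ep_context_file_path|C:\cache\test_ctx.onnx" model.onnx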

@sophies927 removed the platform:windows (issues related to the Windows platform) label on Jul 11, 2024