
Custom Release Branch OVEP1.13

Pre-release
@sfatimar released this 03 Apr 14:06

We are releasing a custom build of 1.13.1 with specific changes for model caching and improved first inference latency.
This release is based on a custom OpenVINO™ build; the dependent OpenVINO™ libraries are included in the zip file.

  • Added additional ONNX op support coverage.
  • Improved first inference latency (FIL) with a custom OpenVINO™ API for model loading.
  • Enabled model caching along with kernel caching.
  • Fallback at session creation time is handled at the application level.
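
The application-level fallback noted above can be sketched as follows: the application first tries to create a session on the OpenVINO™ execution provider (with caching options set), and if session creation fails it retries on the default CPU provider. This is a minimal illustration, not code from this release; the `cache_dir` and `device_type` option values are assumptions that should be checked against the OpenVINO™ EP documentation for your build.

```python
def build_providers(cache_dir="ov_cache", device_type="CPU_FP32"):
    """Provider list: OpenVINO EP first (with illustrative caching
    options), default CPU EP as the fallback entry."""
    return [
        ("OpenVINOExecutionProvider",
         {"device_type": device_type, "cache_dir": cache_dir}),
        "CPUExecutionProvider",
    ]

def create_session(model_path, cache_dir="ov_cache"):
    """Create an InferenceSession, falling back to the CPU EP at
    session creation time if the OpenVINO EP cannot be initialized."""
    import onnxruntime as ort  # requires an onnxruntime build with the OpenVINO EP

    try:
        return ort.InferenceSession(model_path,
                                    providers=build_providers(cache_dir))
    except Exception:
        # Application-level fallback: retry on the default CPU provider
        return ort.InferenceSession(model_path,
                                    providers=["CPUExecutionProvider"])
```

With this pattern, a failed OpenVINO™ initialization does not abort the application; inference simply proceeds on the CPU provider instead.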

For the latest information, refer to our official documentation:
https://onnxruntime.ai/docs/execution-providers/OpenVINO-ExecutionProvider.html