🛠 A lite C++ toolkit of awesome AI models, supporting ONNXRuntime, MNN, TNN, NCNN, and TensorRT.
An OBS plugin for removing background in portrait images (video), making it easy to replace the background when recording or streaming.
A simple library to speed up CLIP inference by up to 3x (on a K80 GPU).
A Python ONNX inference sample for "PyTorch Implementation of AnimeGANv2".
Python scripts for performing 6D pose estimation and shape reconstruction using the CenterSnap model in ONNX
Jetson Nano setup without a monitor for the JetBot build. Includes installation of JupyterLab, ROS2 Dashing, Torch, torch2trt, ONNX, ONNXRuntime-GPU, and TensorFlow. JupyterLab doesn't require a Docker container.
Text Detection and Recognition using ONNX
Contains ROS2 packages to run a robot car that collects annotated camera images while controlled by a gamepad, plus packages to drive the car autonomously with a trained neural network.
Tennis match analysis via computer vision techniques.
A demo that overwrites the original image with a face image generated using "PyTorch Implementation of AnimeGANv2".
Tools for simple ONNX inference testing using the TensorRT, CUDA, OpenVINO (CPU/GPU), and CPU execution providers.
Drop-in replacement for onnxruntime-node with GPU support using CUDA or DirectML
A simple tool to profile ONNX inference using the C++ APIs.
YOLO model benchmark on onnxruntime-web, supporting WebGPU and WASM (CPU).
Python scripts for performing semantic segmentation using the TopFormer model in ONNX.
State of the art image upscaling, directly in your browser.
YOLO object detection with onnxruntime-web, supporting WebGPU and WASM (CPU).
This project makes it easier to run detections with ONNX models, accelerated by CUDA.
A guide to installing the InsightFace package in a Windows environment, including the necessary dependencies and configuration.