OmniTrax Version 0.2.1

Released by @FabianPlum · 28 Nov

The latest release includes a few minor usability updates and improved exception handling for trained YOLO and DLC-live networks. You can now configure the detection (YOLO) network input resolution directly within OmniTrax. OmniTrax will also automatically create .data and .names files for YOLO networks based on the supplied .cfg (config) files, in case they are missing or formatted incorrectly.
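
For reference on the auto-generated files: Darknet-style YOLO networks expect a .names file (one class label per line) and a .data file pointing to it. The sketch below shows one way such companion files could be written from a supplied .cfg; it is an illustration only, and the file name `yolo.cfg` and the placeholder class labels are assumptions, not OmniTrax's actual implementation.

```python
# Hypothetical sketch: generate .data / .names companions for a YOLO .cfg file.
# "yolo.cfg" and the placeholder class labels are assumptions for illustration.
from pathlib import Path

cfg_path = Path("yolo.cfg")   # supplied Darknet config
classes = 1                   # fallback if no [yolo] section is found

# Darknet .cfg files list the class count as "classes=<n>" in each [yolo] section.
for line in cfg_path.read_text().splitlines():
    line = line.strip().replace(" ", "")
    if line.startswith("classes="):
        classes = int(line.split("=", 1)[1])
        break

names_path = cfg_path.with_suffix(".names")
data_path = cfg_path.with_suffix(".data")

# One placeholder label per class; replace these with your real class names.
names_path.write_text("\n".join(f"class_{i}" for i in range(classes)) + "\n")

# Minimal .data file: only "classes" and "names" are needed for inference.
data_path.write_text(f"classes = {classes}\nnames = {names_path.name}\n")
```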

GPU inference compatibility with the latest release of Blender LTS 3.3!

OmniTrax is built for Windows 10 / 11.

Simply install the omni_trax.zip file from within Blender in the Add-ons tab, as described in the README. There is no need to unpack the file, but make sure to run Blender in Administrator mode during the installation of this addon, as additional Python packages will be installed automatically.
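
If you prefer to script the installation, Blender's Python API offers the same steps. The sketch below assumes the addon module is named `omni_trax` (matching the zip file) and uses a placeholder download path.

```python
# Optional scripted install from Blender's Python console (run Blender as
# Administrator). The filepath below is a placeholder - adjust it to wherever
# you saved omni_trax.zip.
import bpy

bpy.ops.preferences.addon_install(filepath=r"C:\Downloads\omni_trax.zip")
bpy.ops.preferences.addon_enable(module="omni_trax")  # module name assumed to be "omni_trax"
bpy.ops.wm.save_userpref()  # keep the addon enabled across restarts
```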

Notes on GPU version [recommended]

Install the correct CUDA and cuDNN versions (11.2 and 8.1, respectively) mentioned in the README for GPU support. The DLLs have been built specifically to run on machines with dedicated NVIDIA GPUs with a compute capability of 6.1 or higher.
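
To confirm that the GPU (and the CUDA 11.2 / cuDNN 8.1 libraries) are actually visible before starting a tracking job, a quick check like the one below can help; it assumes a GPU-enabled TensorFlow build (as used by DLC-live) is importable from your Python environment.

```python
# Minimal GPU sanity check: lists visible GPUs and their compute capability
# so you can confirm it meets the 6.1+ requirement. Assumes a GPU-enabled
# TensorFlow build is installed in the active Python environment.
import tensorflow as tf

gpus = tf.config.list_physical_devices("GPU")
if not gpus:
    print("No GPU visible - re-check the CUDA 11.2 / cuDNN 8.1 installation.")
for gpu in gpus:
    details = tf.config.experimental.get_device_details(gpu)
    print(gpu.name, "- compute capability:", details.get("compute_capability"))
```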

Notes on CPU-only version

If you wish to run inference on CPU-only systems, you must use Blender version 2.92 instead to match dependencies!