
Upgrade presets for OpenCV 4.8.1, DNNL 3.3, Leptonica 1.83.1, Tesseract 5.3.3, TensorFlow Lite 2.14.0, ONNX Runtime 1.16.1
saudet committed Oct 13, 2023
1 parent 275be0b commit 6693a42
Showing 308 changed files with 1,245 additions and 244 deletions.
2 changes: 1 addition & 1 deletion CHANGELOG.md
@@ -3,7 +3,7 @@
* Refactor and improve presets for PyTorch ([pull #1360](https://github.com/bytedeco/javacpp-presets/pull/1360))
* Include `mkl_lapack.h` header file in presets for MKL ([issue #1388](https://github.com/bytedeco/javacpp-presets/issues/1388))
* Map new higher-level C++ API of Triton Inference Server ([pull #1361](https://github.com/bytedeco/javacpp-presets/pull/1361))
- * Upgrade presets for OpenCV 4.8.0, DNNL 3.2.1, OpenBLAS 0.3.24, CPython 3.11.5, NumPy 1.25.2, SciPy 1.11.2, LLVM 17.0.1, TensorFlow Lite 2.14.0, Triton Inference Server 2.38.0, ONNX 1.14.1, ONNX Runtime 1.16.0, TVM 0.13.0, and their dependencies
+ * Upgrade presets for OpenCV 4.8.1, DNNL 3.3, OpenBLAS 0.3.24, CPython 3.11.5, NumPy 1.25.2, SciPy 1.11.2, LLVM 17.0.1, Leptonica 1.83.1, Tesseract 5.3.3, TensorFlow Lite 2.14.0, Triton Inference Server 2.38.0, ONNX 1.14.1, ONNX Runtime 1.16.1, TVM 0.13.0, and their dependencies

### June 6, 2023 version 1.5.9
* Virtualize `nvinfer1::IGpuAllocator` from TensorRT to allow customization ([pull #1367](https://github.com/bytedeco/javacpp-presets/pull/1367))
6 changes: 3 additions & 3 deletions README.md
@@ -198,7 +198,7 @@ Each child module in turn relies by default on the included [`cppbuild.sh` scrip
* LZ4 1.9.x https://github.com/lz4/lz4
* MKL 2023.x https://software.intel.com/mkl
* MKL-DNN 0.21.x https://github.com/oneapi-src/oneDNN
- * DNNL 3.2.x https://github.com/oneapi-src/oneDNN
+ * DNNL 3.3.x https://github.com/oneapi-src/oneDNN
* OpenBLAS 0.3.24 http://www.openblas.net/
* ARPACK-NG 3.9.0 https://github.com/opencollab/arpack-ng
* CMINPACK 1.3.8 https://github.com/devernay/cminpack
@@ -227,12 +227,12 @@ Each child module in turn relies by default on the included [`cppbuild.sh` scrip
* TensorFlow 1.15.x https://github.com/tensorflow/tensorflow
* TensorFlow Lite 2.14.x https://github.com/tensorflow/tensorflow
* TensorRT 8.6.x https://developer.nvidia.com/tensorrt
- * Triton Inference Server 2.34.x https://developer.nvidia.com/nvidia-triton-inference-server
+ * Triton Inference Server 2.38.x https://developer.nvidia.com/nvidia-triton-inference-server
* The Arcade Learning Environment 0.8.x https://github.com/mgbellemare/Arcade-Learning-Environment
* DepthAI 2.21.x https://github.com/luxonis/depthai-core
* ONNX 1.14.x https://github.com/onnx/onnx
* nGraph 0.26.0 https://github.com/NervanaSystems/ngraph
- * ONNX Runtime 1.15.x https://github.com/microsoft/onnxruntime
+ * ONNX Runtime 1.16.x https://github.com/microsoft/onnxruntime
* TVM 0.13.x https://github.com/apache/tvm
* Bullet Physics SDK 3.25 https://pybullet.org
* LiquidFun http://google.github.io/liquidfun/
6 changes: 3 additions & 3 deletions dnnl/README.md
@@ -9,7 +9,7 @@ Introduction
------------
This directory contains the JavaCPP Presets module for:

- * DNNL 3.2.1 https://01.org/dnnl
+ * DNNL 3.3 https://01.org/dnnl

Please refer to the parent README.md file for more detailed information about the JavaCPP Presets.

@@ -25,7 +25,7 @@ Sample Usage
------------
Here is a simple example of DNNL ported to Java from this C++ source file:

- * https://github.com/oneapi-src/oneDNN/blob/v3.2.1/examples/cnn_inference_int8.cpp
+ * https://github.com/oneapi-src/oneDNN/blob/v3.3/examples/cnn_inference_int8.cpp

We can use [Maven 3](http://maven.apache.org/) to download and install automatically all the class files as well as the native binaries. To run this sample code, after creating the `pom.xml` and `CpuCnnInferenceInt8.java` source files below, simply execute on the command line:
```bash
@@ -46,7 +46,7 @@ We can use [Maven 3](http://maven.apache.org/) to download and install automatic
<dependency>
<groupId>org.bytedeco</groupId>
<artifactId>dnnl-platform</artifactId>
- <version>3.2.1-1.5.10-SNAPSHOT</version>
+ <version>3.3-1.5.10-SNAPSHOT</version>
</dependency>
</dependencies>
<build>
2 changes: 1 addition & 1 deletion dnnl/cppbuild.sh
@@ -11,7 +11,7 @@ export DNNL_CPU_RUNTIME="OMP" # or TBB
export DNNL_GPU_RUNTIME="OCL"

TBB_VERSION=2020.3
- MKLDNN_VERSION=3.2.1
+ MKLDNN_VERSION=3.3
download https://github.com/oneapi-src/oneTBB/archive/v$TBB_VERSION.tar.gz oneTBB-$TBB_VERSION.tar.bz2
download https://github.com/oneapi-src/oneDNN/archive/v$MKLDNN_VERSION.tar.gz oneDNN-$MKLDNN_VERSION.tar.bz2

2 changes: 1 addition & 1 deletion dnnl/platform/pom.xml
@@ -12,7 +12,7 @@

<groupId>org.bytedeco</groupId>
<artifactId>dnnl-platform</artifactId>
- <version>3.2.1-${project.parent.version}</version>
+ <version>3.3-${project.parent.version}</version>
<name>JavaCPP Presets Platform for DNNL</name>

<properties>
2 changes: 1 addition & 1 deletion dnnl/pom.xml
@@ -11,7 +11,7 @@

<groupId>org.bytedeco</groupId>
<artifactId>dnnl</artifactId>
- <version>3.2.1-${project.parent.version}</version>
+ <version>3.3-${project.parent.version}</version>
<name>JavaCPP Presets for DNNL</name>

<dependencies>
2 changes: 1 addition & 1 deletion dnnl/samples/pom.xml
@@ -12,7 +12,7 @@
<dependency>
<groupId>org.bytedeco</groupId>
<artifactId>dnnl-platform</artifactId>
- <version>3.2.1-1.5.10-SNAPSHOT</version>
+ <version>3.3-1.5.10-SNAPSHOT</version>
</dependency>
</dependencies>
<build>
24 changes: 18 additions & 6 deletions dnnl/src/gen/java/org/bytedeco/dnnl/concat.java
@@ -71,16 +71,22 @@ public static class primitive_desc extends primitive_desc_base {
* not depend on memory format.
* @param srcs Vector of source memory descriptors.
* @param attr Primitive attributes to use. Attributes are optional
- * and default to empty attributes. */
+ * and default to empty attributes.
+ * @param allow_empty A flag signifying whether construction is
+ * allowed to fail without throwing an exception. In this case an
+ * empty object will be produced. This flag is optional and
+ * defaults to false. */

///
///
public primitive_desc(@Const @ByRef engine aengine, @Const @ByRef org.bytedeco.dnnl.memory.desc dst,
int concat_dimension, @Cast("const std::vector<dnnl::memory::desc>*") @ByRef memory_desc_vector srcs,
- @Const @ByRef(nullValue = "dnnl::primitive_attr()") primitive_attr attr) { super((Pointer)null); allocate(aengine, dst, concat_dimension, srcs, attr); }
+ @Const @ByRef(nullValue = "dnnl::primitive_attr()") primitive_attr attr,
+ @Cast("bool") boolean allow_empty/*=false*/) { super((Pointer)null); allocate(aengine, dst, concat_dimension, srcs, attr, allow_empty); }
private native void allocate(@Const @ByRef engine aengine, @Const @ByRef org.bytedeco.dnnl.memory.desc dst,
int concat_dimension, @Cast("const std::vector<dnnl::memory::desc>*") @ByRef memory_desc_vector srcs,
- @Const @ByRef(nullValue = "dnnl::primitive_attr()") primitive_attr attr);
+ @Const @ByRef(nullValue = "dnnl::primitive_attr()") primitive_attr attr,
+ @Cast("bool") boolean allow_empty/*=false*/);
public primitive_desc(@Const @ByRef engine aengine, @Const @ByRef org.bytedeco.dnnl.memory.desc dst,
int concat_dimension, @Cast("const std::vector<dnnl::memory::desc>*") @ByRef memory_desc_vector srcs) { super((Pointer)null); allocate(aengine, dst, concat_dimension, srcs); }
private native void allocate(@Const @ByRef engine aengine, @Const @ByRef org.bytedeco.dnnl.memory.desc dst,
@@ -98,15 +104,21 @@ private native void allocate(@Const @ByRef engine aengine, @Const @ByRef org.byt
* not depend on memory format.
* @param srcs Vector of source memory descriptors.
* @param attr Primitive attributes to use. Attributes are optional
- * and default to empty attributes. */
+ * and default to empty attributes.
+ * @param allow_empty A flag signifying whether construction is
+ * allowed to fail without throwing an exception. In this case an
+ * empty object will be produced. This flag is optional and
+ * defaults to false. */

///
public primitive_desc(@Const @ByRef engine aengine, int concat_dimension,
@Cast("const std::vector<dnnl::memory::desc>*") @ByRef memory_desc_vector srcs,
- @Const @ByRef(nullValue = "dnnl::primitive_attr()") primitive_attr attr) { super((Pointer)null); allocate(aengine, concat_dimension, srcs, attr); }
+ @Const @ByRef(nullValue = "dnnl::primitive_attr()") primitive_attr attr,
+ @Cast("bool") boolean allow_empty/*=false*/) { super((Pointer)null); allocate(aengine, concat_dimension, srcs, attr, allow_empty); }
private native void allocate(@Const @ByRef engine aengine, int concat_dimension,
@Cast("const std::vector<dnnl::memory::desc>*") @ByRef memory_desc_vector srcs,
- @Const @ByRef(nullValue = "dnnl::primitive_attr()") primitive_attr attr);
+ @Const @ByRef(nullValue = "dnnl::primitive_attr()") primitive_attr attr,
+ @Cast("bool") boolean allow_empty/*=false*/);
public primitive_desc(@Const @ByRef engine aengine, int concat_dimension,
@Cast("const std::vector<dnnl::memory::desc>*") @ByRef memory_desc_vector srcs) { super((Pointer)null); allocate(aengine, concat_dimension, srcs); }
private native void allocate(@Const @ByRef engine aengine, int concat_dimension,
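The net effect of these hunks is a new optional `allow_empty` argument on both `primitive_desc` constructors of `concat`. A minimal, hypothetical sketch of the new overload in use follows; `eng`, `dstDesc`, and `srcDescs` are assumed to be pre-built `engine`, `memory.desc`, and `memory_desc_vector` objects and are not part of this commit:

```java
// Hypothetical sketch, not from this commit: probe concat support with the
// new allow_empty flag instead of catching an exception.
import org.bytedeco.dnnl.*;

concat.primitive_desc pd = new concat.primitive_desc(
        eng, dstDesc, /*concat_dimension=*/1, srcDescs,
        new primitive_attr(), /*allow_empty=*/true);
// With allow_empty=true, an unsupported configuration yields an empty
// primitive descriptor rather than a thrown exception, so callers can
// probe support and fall back gracefully.
```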
113 changes: 107 additions & 6 deletions dnnl/src/gen/java/org/bytedeco/dnnl/global/dnnl.java
@@ -1208,10 +1208,15 @@
dnnl_ABc16b48a = 759,
dnnl_ABcd16b48a = 760,
dnnl_ABcde16b48a = 761,
+ dnnl_ABc16a4b = 762,
+ dnnl_ABcd16a4b = 763,
+ dnnl_ABcde16a4b = 764,
+ dnnl_defcbA16a = 765,
+ dnnl_defcbA8a = 766,

/** Just a sentinel, not real memory format tag. Must be changed after new
* format tag is added. */
- dnnl_format_tag_last = 762,
+ dnnl_format_tag_last = 767,

// Aliases

@@ -2016,6 +2021,8 @@
dnnl_gIdhwO16o64i4o = dnnl_aCdefB16b64c4b,
dnnl_hwioG16g = dnnl_decbA16a,
dnnl_hwioG8g = dnnl_decbA8a,
+ dnnl_dhwioG16g = dnnl_defcbA16a,
+ dnnl_dhwioG8g = dnnl_defcbA8a,
dnnl_NCdhw40n16c = dnnl_ABcde40a16b,
dnnl_NCw40n16c = dnnl_ABc40a16b,
dnnl_NChw40n16c = dnnl_ABcd40a16b,
@@ -2138,6 +2145,8 @@
dnnl_softmax = 19,
/** A layer normalization primitive. */
dnnl_layer_normalization = 20,
+ /** A group normalization primitive. */
+ dnnl_group_normalization = 21,

/** Parameter to allow internal only primitives without undefined behavior.
* This parameter is chosen to be valid for so long as sizeof(int) >= 2. */
@@ -3025,7 +3034,7 @@
// Parsed from oneapi/dnnl/dnnl_common.h

/*******************************************************************************
- * Copyright 2022 Intel Corporation
+ * Copyright 2022-2023 Intel Corporation
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
@@ -3205,8 +3214,8 @@
*
* @param level Verbosity level:
* - 0: no verbose output (default),
- * - 1: primitive information at execution,
- * - 2: primitive information at creation and execution.
+ * - 1: primitive and graph information at execution,
+ * - 2: primitive and graph information at creation/compilation and execution.
* @return #dnnl_invalid_arguments/#dnnl::status::invalid_arguments if the
* \p level value is invalid, and #dnnl_success/#dnnl::status::success on
* success. */
@@ -3399,6 +3408,7 @@
public static final int BUILD_CONVOLUTION = 0;
public static final int BUILD_DECONVOLUTION = 0;
public static final int BUILD_ELTWISE = 0;
+ public static final int BUILD_GROUP_NORMALIZATION = 0;
public static final int BUILD_INNER_PRODUCT = 0;
public static final int BUILD_LAYER_NORMALIZATION = 0;
public static final int BUILD_LRN = 0;
@@ -3426,6 +3436,12 @@
public static final int BUILD_XEHP = 0;
public static final int BUILD_XEHPG = 0;
public static final int BUILD_XEHPC = 0;
+ // GeMM kernels ISA controls
+ public static final int BUILD_GEMM_KERNELS_ALL = 1;
+ public static final int BUILD_GEMM_KERNELS_NONE = 0;
+ public static final int BUILD_GEMM_SSE41 = 0;
+ public static final int BUILD_GEMM_AVX2 = 0;
+ public static final int BUILD_GEMM_AVX512 = 0;
// #endif


@@ -3456,10 +3472,10 @@
public static final int DNNL_VERSION_MAJOR = 3;

/** Minor version */
- public static final int DNNL_VERSION_MINOR = 2;
+ public static final int DNNL_VERSION_MINOR = 3;

/** Patch version */
- public static final int DNNL_VERSION_PATCH = 1;
+ public static final int DNNL_VERSION_PATCH = 0;

/** Git commit hash */
public static native @MemberGetter String DNNL_VERSION_HASH();
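Because these macros are mapped to plain Java constants, a quick runtime check that the bundled library is now 3.3.0 could look like this sketch (not part of the commit):

```java
// Minimal sketch: print the DNNL version constants and commit hash mapped above.
import static org.bytedeco.dnnl.global.dnnl.*;

public class DnnlVersion {
    public static void main(String[] args) {
        System.out.printf("DNNL %d.%d.%d (%s)%n", DNNL_VERSION_MAJOR,
                DNNL_VERSION_MINOR, DNNL_VERSION_PATCH, DNNL_VERSION_HASH());
    }
}
```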
@@ -6172,6 +6188,85 @@ public static native int dnnl_memory_desc_equal(
@Const dnnl_primitive_attr attr);

/** \} dnnl_api_batch_normalization
+ <p>
+ * \addtogroup dnnl_api_group_normalization
+ * \{
+ <p>
+ * Creates a primitive descriptor for a group normalization forward propagation
+ * primitive.
+ *
+ * \note
+ * In-place operation is supported: the dst can refer to the same memory
+ * as the src.
+ *
+ * @param primitive_desc Output primitive_descriptor.
+ * @param engine Engine to use.
+ * @param prop_kind Propagation kind. Possible values are
+ * #dnnl_forward_training and #dnnl_forward_inference.
+ * @param src_desc Source memory descriptor.
+ * @param dst_desc Destination memory descriptor.
+ * @param groups Group normalization groups parameter.
+ * @param epsilon Group normalization epsilon parameter.
+ * @param flags Group normalization flags (\ref dnnl_normalization_flags_t).
+ * @param attr Primitive attributes (can be NULL).
+ * @return #dnnl_success on success and a status describing the error
+ * otherwise. */
+
+ ///
+ ///
+ public static native @Cast("dnnl_status_t") int dnnl_group_normalization_forward_primitive_desc_create(
+ @ByPtrPtr dnnl_primitive_desc primitive_desc, dnnl_engine engine,
+ @Cast("dnnl_prop_kind_t") int prop_kind, @Const dnnl_memory_desc src_desc,
+ @Const dnnl_memory_desc dst_desc, @Cast("dnnl_dim_t") long groups, float epsilon,
+ @Cast("unsigned") int flags, @Const dnnl_primitive_attr attr);
+ public static native @Cast("dnnl_status_t") int dnnl_group_normalization_forward_primitive_desc_create(
+ @Cast("dnnl_primitive_desc_t*") PointerPointer primitive_desc, dnnl_engine engine,
+ @Cast("dnnl_prop_kind_t") int prop_kind, @Const dnnl_memory_desc src_desc,
+ @Const dnnl_memory_desc dst_desc, @Cast("dnnl_dim_t") long groups, float epsilon,
+ @Cast("unsigned") int flags, @Const dnnl_primitive_attr attr);
+
+ /** Creates a primitive descriptor for a group normalization backward
+ * propagation primitive.
+ *
+ * \note
+ * In-place operation is supported: the diff_dst can refer to the same
+ * memory as the diff_src.
+ *
+ * @param primitive_desc Output primitive_descriptor.
+ * @param engine Engine to use.
+ * @param prop_kind Propagation kind. Possible values are
+ * #dnnl_backward_data and #dnnl_backward (diffs for all parameters are
+ * computed in this case).
+ * @param diff_src_desc Diff source memory descriptor.
+ * @param diff_dst_desc Diff destination memory descriptor.
+ * @param src_desc Source memory descriptor.
+ * @param groups Group normalization groups parameter.
+ * @param epsilon Group normalization epsilon parameter.
+ * @param flags Group normalization flags (\ref dnnl_normalization_flags_t).
+ * @param hint_fwd_pd Primitive descriptor for a respective forward propagation
+ * primitive.
+ * @param attr Primitive attributes (can be NULL).
+ * @return #dnnl_success on success and a status describing the error
+ * otherwise. */
+
+ ///
+ ///
+ public static native @Cast("dnnl_status_t") int dnnl_group_normalization_backward_primitive_desc_create(
+ @ByPtrPtr dnnl_primitive_desc primitive_desc, dnnl_engine engine,
+ @Cast("dnnl_prop_kind_t") int prop_kind, @Const dnnl_memory_desc diff_src_desc,
+ @Const dnnl_memory_desc diff_dst_desc,
+ @Const dnnl_memory_desc src_desc, @Cast("dnnl_dim_t") long groups, float epsilon,
+ @Cast("unsigned") int flags, @Const dnnl_primitive_desc hint_fwd_pd,
+ @Const dnnl_primitive_attr attr);
+ public static native @Cast("dnnl_status_t") int dnnl_group_normalization_backward_primitive_desc_create(
+ @Cast("dnnl_primitive_desc_t*") PointerPointer primitive_desc, dnnl_engine engine,
+ @Cast("dnnl_prop_kind_t") int prop_kind, @Const dnnl_memory_desc diff_src_desc,
+ @Const dnnl_memory_desc diff_dst_desc,
+ @Const dnnl_memory_desc src_desc, @Cast("dnnl_dim_t") long groups, float epsilon,
+ @Cast("unsigned") int flags, @Const dnnl_primitive_desc hint_fwd_pd,
+ @Const dnnl_primitive_attr attr);
+
+ /** \} dnnl_api_group_normalization
<p>
* \addtogroup dnnl_api_layer_normalization
* \{
@@ -9192,6 +9287,12 @@ public static native int dnnl_memory_desc_equal(
// Targeting ../batch_normalization_backward.java


+ // Targeting ../group_normalization_forward.java
+
+
+ // Targeting ../group_normalization_backward.java
+
+
// Targeting ../layer_normalization_forward.java


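To put the new C-level group normalization bindings in context, here is a hedged sketch of driving the forward variant from Java. The helper name is invented, `srcDesc` and `dstDesc` are assumed to be pre-built `dnnl_memory_desc` handles, and `eng` an already-created `dnnl_engine`; `dnnl_forward_inference` and `dnnl_success` appear in the documentation above, while `dnnl_normalization_flags_none` is assumed from the oneDNN C API:

```java
// Hypothetical sketch: create a forward group normalization primitive
// descriptor via the new binding. The @ByPtrPtr parameter acts as an
// output argument that receives the created handle.
import org.bytedeco.dnnl.*;
import static org.bytedeco.dnnl.global.dnnl.*;

public class GroupNormSketch {
    static dnnl_primitive_desc createForward(dnnl_engine eng,
            dnnl_memory_desc srcDesc, dnnl_memory_desc dstDesc) {
        dnnl_primitive_desc pd = new dnnl_primitive_desc(); // filled in by the call
        int status = dnnl_group_normalization_forward_primitive_desc_create(
                pd, eng,
                dnnl_forward_inference,        // prop_kind
                srcDesc, dstDesc,
                32L,                           // groups
                1e-5f,                         // epsilon
                dnnl_normalization_flags_none, // flags
                null);                         // attr can be NULL per the docs
        if (status != dnnl_success) {
            throw new IllegalStateException("dnnl status " + status);
        }
        return pd;
    }
}
```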