[pytorch] Update PyTorch engine README for version 2.2.2 #3165

Merged 1 commit on May 8, 2024

74 changes: 37 additions & 37 deletions engines/pytorch/pytorch-engine/README.md
@@ -24,13 +24,13 @@ The javadocs output is built in the `build/doc/javadoc` folder.
## Installation
You can pull the PyTorch engine from the central Maven repository by including the following dependency:

-- ai.djl.pytorch:pytorch-engine:0.27.0
+- ai.djl.pytorch:pytorch-engine:0.28.0

```xml
<dependency>
<groupId>ai.djl.pytorch</groupId>
<artifactId>pytorch-engine</artifactId>
-<version>0.27.0</version>
+<version>0.28.0</version>
<scope>runtime</scope>
</dependency>
```
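
As a quick sanity check after adding the dependency, something along these lines can confirm that the engine resolves at runtime. This is a minimal sketch: it assumes the `ai.djl.engine.Engine` API from the `ai.djl:api` dependency is on the classpath, and the `EngineCheck` class name is only illustrative.

```java
import ai.djl.engine.Engine;

public class EngineCheck {

    public static void main(String[] args) {
        // Looks up the PyTorch engine registered on the classpath and prints
        // the engine name plus the version string it reports.
        Engine engine = Engine.getEngine("PyTorch");
        System.out.println(engine.getEngineName() + " " + engine.getVersion());
    }
}
```
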
@@ -46,7 +46,7 @@ The following table illustrates which PyTorch versions DJL supports:

| PyTorch engine version | PyTorch native library version |
|------------------------|-------------------------------------------|
-| pytorch-engine:0.28.0 | 1.13.1, **2.1.2** |
+| pytorch-engine:0.28.0 | 1.13.1, 2.1.2, **2.2.2** |
| pytorch-engine:0.27.0 | 1.13.1, **2.1.1** |
| pytorch-engine:0.26.0 | 1.13.1, 2.0.1, **2.1.1** |
| pytorch-engine:0.25.0 | 1.11.0, 1.12.1, **1.13.1**, 2.0.1 |
@@ -115,21 +115,21 @@ export PYTORCH_FLAVOR=cpu
### macOS
For macOS, you can use the following libraries:

-- ai.djl.pytorch:pytorch-jni:2.1.1-0.27.0
+- ai.djl.pytorch:pytorch-jni:2.2.2-0.28.0
- ai.djl.pytorch:pytorch-native-cpu:2.1.1:osx-x86_64

```xml
<dependency>
<groupId>ai.djl.pytorch</groupId>
<artifactId>pytorch-native-cpu</artifactId>
<classifier>osx-x86_64</classifier>
-<version>2.1.1</version>
+<version>2.2.2</version>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>ai.djl.pytorch</groupId>
<artifactId>pytorch-jni</artifactId>
-<version>2.1.1-0.27.0</version>
+<version>2.2.2-0.28.0</version>
<scope>runtime</scope>
</dependency>
```
@@ -139,21 +139,21 @@ For macOS, you can use the following library:
### macOS M1
For macOS M1, you can use the following libraries:

-- ai.djl.pytorch:pytorch-jni:2.1.1-0.27.0
-- ai.djl.pytorch:pytorch-native-cpu:2.1.1:osx-aarch64
+- ai.djl.pytorch:pytorch-jni:2.2.2-0.28.0
+- ai.djl.pytorch:pytorch-native-cpu:2.2.2:osx-aarch64

```xml
<dependency>
<groupId>ai.djl.pytorch</groupId>
<artifactId>pytorch-native-cpu</artifactId>
<classifier>osx-aarch64</classifier>
-<version>2.1.1</version>
+<version>2.2.2</version>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>ai.djl.pytorch</groupId>
<artifactId>pytorch-jni</artifactId>
-<version>2.1.1-0.27.0</version>
+<version>2.2.2-0.28.0</version>
<scope>runtime</scope>
</dependency>
```
@@ -164,63 +164,63 @@ installed on your GPU machine, you can use one of the following library:

#### Linux GPU

-- ai.djl.pytorch:pytorch-jni:2.1.1-0.27.0
-- ai.djl.pytorch:pytorch-native-cu121:2.1.1:linux-x86_64 - CUDA 12.1
+- ai.djl.pytorch:pytorch-jni:2.2.2-0.28.0
+- ai.djl.pytorch:pytorch-native-cu121:2.2.2:linux-x86_64 - CUDA 12.1

```xml
<dependency>
<groupId>ai.djl.pytorch</groupId>
<artifactId>pytorch-native-cu121</artifactId>
<classifier>linux-x86_64</classifier>
-<version>2.1.1</version>
+<version>2.2.2</version>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>ai.djl.pytorch</groupId>
<artifactId>pytorch-jni</artifactId>
-<version>2.1.1-0.27.0</version>
+<version>2.2.2-0.28.0</version>
<scope>runtime</scope>
</dependency>
```

### Linux CPU

-- ai.djl.pytorch:pytorch-jni:2.1.1-0.27.0
-- ai.djl.pytorch:pytorch-native-cpu:2.1.1:linux-x86_64
+- ai.djl.pytorch:pytorch-jni:2.2.2-0.28.0
+- ai.djl.pytorch:pytorch-native-cpu:2.2.2:linux-x86_64

```xml
<dependency>
<groupId>ai.djl.pytorch</groupId>
<artifactId>pytorch-native-cpu</artifactId>
<classifier>linux-x86_64</classifier>
<scope>runtime</scope>
-<version>2.1.1</version>
+<version>2.2.2</version>
</dependency>
<dependency>
<groupId>ai.djl.pytorch</groupId>
<artifactId>pytorch-jni</artifactId>
-<version>2.1.1-0.27.0</version>
+<version>2.2.2-0.28.0</version>
<scope>runtime</scope>
</dependency>
```

### For aarch64 build

-- ai.djl.pytorch:pytorch-jni:2.1.1-0.27.0
-- ai.djl.pytorch:pytorch-native-cpu-precxx11:2.1.1:linux-aarch64
+- ai.djl.pytorch:pytorch-jni:2.2.2-0.28.0
+- ai.djl.pytorch:pytorch-native-cpu-precxx11:2.2.2:linux-aarch64

```xml
<dependency>
<groupId>ai.djl.pytorch</groupId>
<artifactId>pytorch-native-cpu-precxx11</artifactId>
<classifier>linux-aarch64</classifier>
<scope>runtime</scope>
-<version>2.1.1</version>
+<version>2.2.2</version>
</dependency>
<dependency>
<groupId>ai.djl.pytorch</groupId>
<artifactId>pytorch-jni</artifactId>
-<version>2.1.1-0.27.0</version>
+<version>2.2.2-0.28.0</version>
<scope>runtime</scope>
</dependency>
```
@@ -230,22 +230,22 @@ installed on your GPU machine, you can use one of the following library:
We also provide packages for systems like CentOS 7/Ubuntu 14.04 with GLIBC >= 2.17.
All the packages were built with GCC 7; we provide a newer `libstdc++.so.6.24` in the package, which contains `CXXABI_1.3.9`, so the package can be used successfully.

-- ai.djl.pytorch:pytorch-jni:2.1.1-0.27.0
-- ai.djl.pytorch:pytorch-native-cu121-precxx11:2.1.1:linux-x86_64 - CUDA 12.1
-- ai.djl.pytorch:pytorch-native-cpu-precxx11:2.1.1:linux-x86_64 - CPU
+- ai.djl.pytorch:pytorch-jni:2.2.2-0.28.0
+- ai.djl.pytorch:pytorch-native-cu121-precxx11:2.2.2:linux-x86_64 - CUDA 12.1
+- ai.djl.pytorch:pytorch-native-cpu-precxx11:2.2.2:linux-x86_64 - CPU

```xml
<dependency>
<groupId>ai.djl.pytorch</groupId>
<artifactId>pytorch-native-cu121-precxx11</artifactId>
<classifier>linux-x86_64</classifier>
-<version>2.1.1</version>
+<version>2.2.2</version>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>ai.djl.pytorch</groupId>
<artifactId>pytorch-jni</artifactId>
-<version>2.1.1-0.27.0</version>
+<version>2.2.2-0.28.0</version>
<scope>runtime</scope>
</dependency>
```
@@ -255,13 +255,13 @@ All the package were built with GCC 7, we provided a newer `libstdc++.so.6.24` i
<groupId>ai.djl.pytorch</groupId>
<artifactId>pytorch-native-cpu-precxx11</artifactId>
<classifier>linux-x86_64</classifier>
-<version>2.1.1</version>
+<version>2.2.2</version>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>ai.djl.pytorch</groupId>
<artifactId>pytorch-jni</artifactId>
-<version>2.1.1-0.27.0</version>
+<version>2.2.2-0.28.0</version>
<scope>runtime</scope>
</dependency>
```
@@ -276,42 +276,42 @@ For the Windows platform, you can choose between CPU and GPU.

#### Windows GPU

-- ai.djl.pytorch:pytorch-jni:2.1.1-0.27.0
-- ai.djl.pytorch:pytorch-native-cu121:2.1.1:win-x86_64 - CUDA 12.1
+- ai.djl.pytorch:pytorch-jni:2.2.2-0.28.0
+- ai.djl.pytorch:pytorch-native-cu121:2.2.2:win-x86_64 - CUDA 12.1

```xml
<dependency>
<groupId>ai.djl.pytorch</groupId>
<artifactId>pytorch-native-cu121</artifactId>
<classifier>win-x86_64</classifier>
-<version>2.1.1</version>
+<version>2.2.2</version>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>ai.djl.pytorch</groupId>
<artifactId>pytorch-jni</artifactId>
-<version>2.1.1-0.27.0</version>
+<version>2.2.2-0.28.0</version>
<scope>runtime</scope>
</dependency>
```

### Windows CPU

-- ai.djl.pytorch:pytorch-jni:2.1.1-0.27.0
-- ai.djl.pytorch:pytorch-native-cpu:2.1.1:win-x86_64
+- ai.djl.pytorch:pytorch-jni:2.2.2-0.28.0
+- ai.djl.pytorch:pytorch-native-cpu:2.2.2:win-x86_64

```xml
<dependency>
<groupId>ai.djl.pytorch</groupId>
<artifactId>pytorch-native-cpu</artifactId>
<classifier>win-x86_64</classifier>
<scope>runtime</scope>
-<version>2.1.1</version>
+<version>2.2.2</version>
</dependency>
<dependency>
<groupId>ai.djl.pytorch</groupId>
<artifactId>pytorch-jni</artifactId>
-<version>2.1.1-0.27.0</version>
+<version>2.2.2-0.28.0</version>
<scope>runtime</scope>
</dependency>
```
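
Whichever platform-specific libraries you pick, a small smoke test along these lines can confirm that the native library actually loads. This is a rough sketch that assumes the NDArray API from the `ai.djl:api` dependency; the `NativeSmokeTest` class name is only illustrative.

```java
import ai.djl.ndarray.NDArray;
import ai.djl.ndarray.NDManager;
import ai.djl.ndarray.types.Shape;

public class NativeSmokeTest {

    public static void main(String[] args) {
        // Creating a base NDManager forces the default engine (PyTorch, if it is
        // the only engine on the classpath) to load its native library, so a
        // broken native setup fails here rather than deep inside inference code.
        try (NDManager manager = NDManager.newBaseManager()) {
            NDArray ones = manager.ones(new Shape(2, 2));
            System.out.println(ones);
        }
    }
}
```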