From e72901cc3937d6bb647e8a63648368da9b560759 Mon Sep 17 00:00:00 2001
From: regisss <15324346+regisss@users.noreply.github.com>
Date: Fri, 21 Jul 2023 11:44:22 +0200
Subject: [PATCH] Add documentation for Optimum Furiosa (#1165)

* Add documentation for Optimum Furiosa

* Add furiosa-libnux installation

* Update main doc build

* Refinement

* Refinements

* Revert section rename

* Fix
---
 .github/workflows/build_main_documentation.yml | 30 ++++++++++++++++++-
 .github/workflows/build_pr_documentation.yml   | 30 ++++++++++++++++++-
 docs/combine_docs.py                           |  5 ++--
 docs/source/index.mdx                          |  8 ++---
 4 files changed, 64 insertions(+), 9 deletions(-)

diff --git a/.github/workflows/build_main_documentation.yml b/.github/workflows/build_main_documentation.yml
index b35a7a74a6..e82043be98 100644
--- a/.github/workflows/build_main_documentation.yml
+++ b/.github/workflows/build_main_documentation.yml
@@ -44,6 +44,11 @@ jobs:
           repository: 'huggingface/optimum-intel'
           path: optimum-intel
 
+      - uses: actions/checkout@v2
+        with:
+          repository: 'huggingface/optimum-furiosa'
+          path: optimum-furiosa
+
       - name: Set environment variables
         run: |
           cd optimum
@@ -76,6 +81,7 @@ jobs:
 
       - name: Make Habana documentation
         run: |
+          sudo docker system prune -a -f
           cd optimum-habana
           make doc BUILD_DIR=habana-doc-build VERSION=${{ env.VERSION }}
           sudo mv habana-doc-build ../optimum
@@ -83,11 +89,33 @@
 
       - name: Make Intel documentation
         run: |
+          sudo docker system prune -a -f
           cd optimum-intel
           make doc BUILD_DIR=intel-doc-build VERSION=${{ env.VERSION }}
           sudo mv intel-doc-build ../optimum
           cd ..
 
+      - name: Make Furiosa documentation
+        run: |
+          cd optimum-furiosa
+          pip install .
+          sudo apt update
+          sudo apt install -y ca-certificates apt-transport-https gnupg
+          sudo apt-key adv --keyserver keyserver.ubuntu.com --recv-key 5F03AFA423A751913F249259814F888B20B09A7E
+          sudo tee -a /etc/apt/auth.conf.d/furiosa.conf > /dev/null < /dev/null <
     AWS Trainium/Inferentia
     Accelerate your training and inference workflows with AWS Trainium and AWS Inferentia
+    FuriosaAI
+    Fast and efficient inference on FuriosaAI WARBOY
     ONNX Runtime
     Apply quantization and graph optimization to accelerate Transformers models training and inference with ONNX Runtime
-    Exporters
-    Export your PyTorch or TensorFlow model to different formats such as ONNX and TFLite
-    BetterTransformer
     A one-liner integration to use PyTorch's BetterTransformer with Transformers models