[Cherry-Pick] Doc fixes (#1583)
* YOLOv5 doc fixes (#1574)

* IC doc fixes (#1577)

* Add explicit opencv dep (#1575)

* Transformers doc fixes (#1578)

* Fix yolov8 export alias (#1579)
KSGulin committed May 26, 2023
1 parent 4a6044b commit 8d83570
Showing 18 changed files with 31 additions and 32 deletions.
14 changes: 6 additions & 8 deletions integrations/huggingface-transformers/README.md
@@ -29,11 +29,9 @@ Once trained, SparseML enables you to export models to the ONNX format, such tha
Install with `pip`:

```bash
-pip install sparseml[torch]
+pip install sparseml[transformers]
```

-**Note**: Transformers will not immediately install with this command. Instead, a sparsification-compatible version of Transformers will install on the first invocation of the Transformers code in SparseML.

## **Tutorials**

- [Sparse Transfer Learning with the Python API](tutorials/sparse-transfer-learning-bert-python.md) [**RECOMMENDED**]
@@ -50,13 +48,13 @@ pip install sparseml[torch]
### **Use Case Examples - Python**

- [Sparse Transfer with GLUE Datasets (SST2) for sentiment analysis](tutorials/sentiment-analysis/docs-sentiment-analysis-python-sst2.ipynb)
-- [Sparse Transfer with Custom Datasets (RottenTomatoes) and Custom Teacher from HF Hub for sentiment analysis](tutorials/sentiment-analysis/docs-sentiment-analysis-python-custom-teacher-rottentomatoes)
+- [Sparse Transfer with Custom Datasets (RottenTomatoes) and Custom Teacher from HF Hub for sentiment analysis](tutorials/sentiment-analysis/docs-sentiment-analysis-python-custom-teacher-rottentomatoes.ipynb)
- [Sparse Transfer with GLUE Datasets (QQP) for multi-input text classification](tutorials/text-classification/docs-text-classification-python-qqp.ipynb)
- [Sparse Transfer with Custom Datasets (SICK) for multi-input text classification](tutorials/text-classification/docs-text-classification-python-sick.ipynb)
- [Sparse Transfer with Custom Datasets (TweetEval) and Custom Teacher for single input text classification](tutorials/text-classification/docs-text-classification-python-custom-teacher-tweeteval.ipynb)
- [Sparse Transfer with Custom Datasets (GoEmotions) for multi-label text classification](tutorials/text-classification/docs-text-classification-python-multi-label-go_emotions.ipynb)
- [Sparse Transfer with Conll2003 for named entity recognition](tutorials/token-classification/docs-token-classification-python-conll2003.ipynb)
-- [Sparse Transfer with Custom Datasets (WNUT) and Custom Teacher for named entity recognition](tutorials/token-classification/docs-token-classification-custom-teacher-wnut.ipynb)
+- [Sparse Transfer with Custom Datasets (WNUT) and Custom Teacher for named entity recognition](tutorials/token-classification/docs-token-classification-python-custom-teacher-wnut.ipynb)
- Sparse Transfer with SQuAD (example coming soon!)
- Sparse Transfer with Squadshifts Amazon (example coming soon!)

@@ -66,7 +64,7 @@ pip install sparseml[torch]

SparseZoo is an open-source repository of pre-sparsified models, including BERT-base, BERT-large, RoBERTa-base, RoBERTa-large, and DistilBERT. With SparseML, you can fine-tune these pre-sparsified checkpoints onto custom datasets (while maintaining sparsity) via sparse transfer learning. This makes training inference-optimized sparse models almost identical to your typical training workflows!

-[Check out the available models](https://sparsezoo.neuralmagic.com/?repo=huggingface&page=1)
+[Check out the available models](https://sparsezoo.neuralmagic.com/?repos=huggingface)

### **Recipes**

@@ -140,15 +138,15 @@ Currently supported tasks include:

Sparse Transfer is very similar to the typical transfer learning process used to train NLP models, where we fine-tune a checkpoint pretrained on a large upstream dataset using masked language modeling onto a smaller downstream dataset. With Sparse Transfer Learning, however, we simply start the fine-tuning process from a pre-sparsified checkpoint and maintain sparsity while the training process occurs.

-Here, we will fine-tune a [90% pruned version of BERT](https://sparsezoo.neuralmagic.com/models/nlp%2Fmasked_language_modeling%2Fobert-base%2Fpytorch%2Fhuggingface%2Fwikipedia_bookcorpus%2Fpruned90-none) from the SparseZoo onto SST2.
+Here, we will fine-tune a [90% pruned version of BERT](https://sparsezoo.neuralmagic.com/models/obert-base-wikipedia_bookcorpus-pruned90?comparison=obert-base-wikipedia_bookcorpus-base) from the SparseZoo onto SST2.

### **Kick off Training**

We will use SparseML's `sparseml.transformers.text_classification` training script.

To run sparse transfer learning, we first need to create/select a sparsification recipe. For sparse transfer, we need a recipe that instructs SparseML to maintain sparsity during training and to quantize the model.

-For the SST2 dataset, there is a [transfer learning recipe available in SparseZoo](https://sparsezoo.neuralmagic.com/models/nlp%2Fsentiment_analysis%2Fobert-base%2Fpytorch%2Fhuggingface%2Fsst2%2Fpruned90_quant-none), identified by the following SparseZoo stub:
+For the SST2 dataset, there is a [transfer learning recipe available in SparseZoo](https://sparsezoo.neuralmagic.com/models/obert-base-sst2_wikipedia_bookcorpus-pruned90_quantized?comparison=obert-base-sst2_wikipedia_bookcorpus-base&tab=0), identified by the following SparseZoo stub:
```
zoo:nlp/sentiment_analysis/obert-base/pytorch/huggingface/sst2/pruned90_quant-none
```
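
To make the workflow concrete, here is a minimal sketch of the kick-off command. The model and recipe stubs are the ones referenced above; the output directory name and the `--do_train`/`--do_eval` pairing are illustrative assumptions, so confirm the full argument list with `sparseml.transformers.text_classification --help`.

```bash
# Sketch: sparse transfer onto SST2 from the 90% pruned BERT checkpoint.
# The two zoo: stubs come from the links above; the remaining flags are
# placeholders to adjust for your environment.
sparseml.transformers.text_classification \
  --task_name sst2 \
  --model_name_or_path "zoo:nlp/masked_language_modeling/obert-base/pytorch/huggingface/wikipedia_bookcorpus/pruned90-none" \
  --recipe "zoo:nlp/sentiment_analysis/obert-base/pytorch/huggingface/sst2/pruned90_quant-none" \
  --do_train \
  --do_eval \
  --output_dir obert_pruned90_quant_sst2
```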
@@ -41,7 +41,7 @@ In this tutorial, you will learn how to:
Install SparseML via `pip`:

```bash
-pip install sparseml[torch]
+pip install sparseml[transformers]
```

## **Sparse Transfer Learning onto SQuAD**
@@ -33,7 +33,7 @@
},
"outputs": [],
"source": [
"!pip install sparseml[torch]"
"!pip install sparseml[transformers]"
]
},
{
@@ -45,7 +45,7 @@
},
"outputs": [],
"source": [
"!pip install sparseml[torch]"
"!pip install sparseml[transformers]"
]
},
{
@@ -41,7 +41,7 @@ In this tutorial, you will learn how to:
Install SparseML via `pip`:

```bash
-pip install sparseml[torch]
+pip install sparseml[transformers]
```

## Sparse Transfer Learning onto SST2 (GLUE Task)
@@ -31,7 +31,7 @@ SparseZoo contains pre-sparsified checkpoints of common NLP models like BERT and
Install via `pip`:

```
-pip install sparseml[torch]
+pip install sparseml[transformers]
```

## **Sparse Transfer Learning onto SST2**
@@ -31,7 +31,7 @@ SparseZoo contains pre-sparsified checkpoints of common NLP models like BERT and
Install via `pip`:

```
-pip install sparseml[torch]
+pip install sparseml[transformers]
```

## **Example: Sparse Transfer Learning onto SST2**
@@ -45,7 +45,7 @@
},
"outputs": [],
"source": [
"!pip install sparseml[torch]"
"!pip install sparseml[transformers]"
]
},
{
@@ -54,7 +54,7 @@
},
"outputs": [],
"source": [
"!pip install sparseml[torch]"
"!pip install sparseml[transformers]"
]
},
{
@@ -45,7 +45,7 @@
},
"outputs": [],
"source": [
"!pip install sparseml[torch]"
"!pip install sparseml[transformers]"
]
},
{
@@ -44,7 +44,7 @@
},
"outputs": [],
"source": [
"!pip install sparseml[torch]"
"!pip install sparseml[transformers]"
]
},
{
@@ -44,7 +44,7 @@ In this tutorial, you will learn how to:
Install SparseML via `pip`:

```bash
-pip install sparseml[torch]
+pip install sparseml[transformers]
```

## SparseML CLI

Large diffs are not rendered by default.

@@ -45,7 +45,7 @@
},
"outputs": [],
"source": [
"!pip install sparseml[torch]"
"!pip install sparseml[transformers]"
]
},
{
@@ -41,7 +41,7 @@ In this tutorial, you will learn how to:
Install SparseML via `pip`:

```bash
-pip install sparseml[torch]
+pip install sparseml[transformers]
```

## Sparse Transfer Learning onto Conll2003
6 changes: 3 additions & 3 deletions integrations/torchvision/README.md
@@ -46,7 +46,7 @@ pip install sparseml[torchvision]

Neural Magic has pre-sparsified versions of common Torchvision models such as ResNet-50. These models can be deployed directly or can be fine-tuned onto a custom dataset via sparse transfer learning. This makes it easy to create a sparse image classification model trained on your dataset.

-[Check out the available models](https://sparsezoo.neuralmagic.com/?domain=cv&sub_domain=classification&page=1)
+[Check out the available models](https://sparsezoo.neuralmagic.com/?useCase=classification)

### Recipes

@@ -104,7 +104,7 @@ sparseml.image_classification.train \

For full usage, run:
```bash
-sparseml.image_classification --help
+sparseml.image_classification.train --help
```

## Quick Start: Sparse Transfer Learning with the CLI
@@ -113,7 +113,7 @@ sparseml.image_classification --help

Sparse Transfer is quite similar to the typical transfer learning process used to train image classification models, where we fine-tune a checkpoint pretrained on ImageNet onto a smaller downstream dataset. With Sparse Transfer Learning, we simply start the fine-tuning process from a pre-sparsified checkpoint and maintain sparsity while the training process occurs.

-In this example, we will fine-tune a 95% pruned version of ResNet-50 ([available in SparseZoo](https://sparsezoo.neuralmagic.com/models/cv%2Fclassification%2Fresnet_v1-50%2Fpytorch%2Fsparseml%2Fimagenet%2Fpruned95_quant-none)) onto ImageNette.
+In this example, we will fine-tune a 95% pruned version of ResNet-50 ([available in SparseZoo](https://sparsezoo.neuralmagic.com/models/resnet_v1-50-imagenet-pruned95_quantized?comparison=resnet_v1-50-imagenet-base)) onto ImageNette.
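
As a rough sketch of that run (the flag names below are assumptions about the image-classification CLI rather than verified arguments; confirm them with `sparseml.image_classification.train --help` before use):

```bash
# Hedged sketch: transfer the pruned-quantized ResNet-50 onto a local
# ImageNette copy. The checkpoint stub mirrors the SparseZoo link above;
# the dataset path and batch size are placeholder assumptions. Append the
# appropriate ?recipe_type= suffix from the model's SparseZoo recipes tab.
sparseml.image_classification.train \
  --checkpoint-path "zoo:cv/classification/resnet_v1-50/pytorch/sparseml/imagenet/pruned95_quant-none" \
  --recipe "zoo:cv/classification/resnet_v1-50/pytorch/sparseml/imagenet/pruned95_quant-none" \
  --dataset-path ./data \
  --batch-size 32
```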

### Kick off Training

8 changes: 3 additions & 5 deletions integrations/ultralytics-yolov5/README.md
@@ -29,11 +29,9 @@ Once trained, SparseML enables you to export models to the ONNX format, such tha
Install with `pip`:

```bash
-pip install sparseml[torchvision]
+pip install sparseml[yolov5]
```

-**Note**: YOLOv5 will not immediately install with this command. Instead, a sparsification-compatible version of YOLOv5 will install on the first invocation of the YOLOv5 code in SparseML.

## Tutorials

- [Sparse Transfer Learning with the CLI](tutorials/sparse-transfer-learning.md) **[HIGHLY RECOMMENDED]**
@@ -91,15 +89,15 @@ SparseML inherits most arguments from the Ultralytics repository. [Check out the

Sparse Transfer is very similar to the typical transfer learning process used to train YOLOv5 models, where we fine-tune a checkpoint pretrained on COCO onto a smaller downstream dataset. With Sparse Transfer Learning, however, we simply start the fine-tuning process from a pre-sparsified checkpoint and maintain sparsity while the training process occurs.

-Here, we will fine-tune a [75% pruned-quantized version of YOLOv5s](https://sparsezoo.neuralmagic.com/models/cv%2Fdetection%2Fyolov5-s%2Fpytorch%2Fultralytics%2Fcoco%2Fpruned75_quant-none) onto VOC.
+Here, we will fine-tune a [75% pruned-quantized version of YOLOv5s](https://sparsezoo.neuralmagic.com/models/yolov5-s-coco-pruned75_quantized?comparison=yolov5-s-coco-base&tab=0) onto VOC.

### Kick off Training

We will use SparseML's `sparseml.yolov5.train` training script.

To run sparse transfer learning, we first need to create/select a sparsification recipe. For sparse transfer, we need a recipe that instructs SparseML to maintain sparsity during training and to quantize the model over the final epochs.

-For the VOC dataset, there is a [transfer learning recipe available in SparseZoo](https://sparsezoo.neuralmagic.com/models/cv%2Fdetection%2Fyolov5-s%2Fpytorch%2Fultralytics%2Fcoco%2Fpruned75_quant-none), identified by the following SparseZoo stub:
+For the VOC dataset, there is a [transfer learning recipe available in SparseZoo](https://sparsezoo.neuralmagic.com/models/yolov5-s-coco-pruned75_quantized?comparison=yolov5-s-coco-base&tab=0), found under the recipes tab and identified by the following SparseZoo stub:
```bash
zoo:cv/detection/yolov5-s/pytorch/ultralytics/coco/pruned75_quant-none?recipe_type=transfer_learn
```
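
To tie the stub to an invocation, a minimal sketch follows. The `--weights` and `--recipe` stubs are exactly the ones above; `--data VOC.yaml` and `--patience 0` are standard Ultralytics arguments used here as illustrative assumptions.

```bash
# Sketch: sparse transfer of the pruned-quantized YOLOv5s onto VOC.
# --weights pulls the pre-sparsified checkpoint from SparseZoo, while
# --recipe maintains sparsity and schedules quantization during training.
sparseml.yolov5.train \
  --weights "zoo:cv/detection/yolov5-s/pytorch/ultralytics/coco/pruned75_quant-none?recipe_type=transfer_learn" \
  --recipe "zoo:cv/detection/yolov5-s/pytorch/ultralytics/coco/pruned75_quant-none?recipe_type=transfer_learn" \
  --data VOC.yaml \
  --patience 0
```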
7 changes: 5 additions & 2 deletions setup.py
@@ -70,7 +70,10 @@
"torchvision>=0.3.0,<0.15",
"torchaudio<=0.13",
]
-_pytorch_vision_deps = _pytorch_deps + ["torchvision>=0.3.0,<0.15"]
+_pytorch_vision_deps = _pytorch_deps + [
+    "torchvision>=0.3.0,<0.15",
+    "opencv-python<=4.6.0.66",
+]
_transformers_deps = _pytorch_deps + [
f"{'nm-transformers' if is_release else 'nm-transformers-nightly'}"
f"~={version_nm_deps}",
@@ -251,7 +254,7 @@ def _setup_entry_points() -> Dict:
[
"sparseml.ultralytics.train=sparseml.yolov8.train:main",
"sparseml.ultralytics.val=sparseml.yolov8.val:main",
"sparseml.ultralytics.export=sparseml.yolov8.export:main",
"sparseml.ultralytics.export_onnx=sparseml.yolov8.export:main",
]
)
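
After this rename, the YOLOv8 ONNX export is invoked as `sparseml.ultralytics.export_onnx`, consistent with the other `sparseml.ultralytics.*` entry points. A quick smoke test (only `--help` is assumed here, since the export flags vary by version):

```bash
# Confirm the renamed export entry point resolves after installation;
# the printed usage lists the actual export options.
sparseml.ultralytics.export_onnx --help
```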

