[Docs] Fix grammar and spelling mistakes in docs (open-mmlab#16)
* fix docs

* fix template
AllentDan authored Dec 24, 2021
1 parent 41b855b commit facac60
Showing 8 changed files with 16 additions and 16 deletions.
2 changes: 1 addition & 1 deletion .github/pull_request_template.md
@@ -23,7 +23,7 @@ If this PR introduces a new feature, it is better to list some use cases here an

- [ ] Pre-commit or other linting tools are used to fix the potential lint issues.
- [ ] Bug fixes are fully covered by unit tests, the case that causes the bug should be added in the unit tests.
- - [ ] The modification is covered by complete unit tests. If not, please add more unit test to ensure the correctness.
+ - [ ] The modification is covered by complete unit tests. If not, please add more unit tests to ensure the correctness.
- [ ] The documentation has been modified accordingly, like docstring or example tutorials.

**After PR**:
2 changes: 1 addition & 1 deletion README.md
@@ -30,7 +30,7 @@ It is a part of the [OpenMMLab](https://openmmlab.com/) project.
Major features:
- **Compatibility**

- MMRazor can be easily applied to various projects in OpenMMLab, due to similar architecture design of OpenMMLab as well as the decoupling of slimming algorithms and vision tasks.
+ MMRazor can be easily applied to various projects in OpenMMLab, due to the similar architecture design of OpenMMLab as well as the decoupling of slimming algorithms and vision tasks.

- **Flexibility**

6 changes: 3 additions & 3 deletions docs/en/test.md
@@ -15,9 +15,9 @@ The usage of optional arguments are the same as corresponding tasks like mmclass
## Pruning

### Split Checkpoint(Optional)
- If you train a slimmable model during retrain, checkpoints of different subnets are
+ If you train a slimmable model during retraining, checkpoints of different subnets are
actually fused in only one checkpoint. You can split this checkpoint to
- multiple independent checkpoints by using following command
+ multiple independent checkpoints by using the following command

```bash
python tools/model_converters/split_checkpoint.py ${CONFIG_FILE} ${CHECKPOINT_PATH} --channel-cfgs ${CHANNEL_CFG_PATH} [optional arguments]
```

@@ -39,7 +39,7 @@ python tools/${task}/test_${task}.py ${CONFIG_FILE} ${CHECKPOINT_PATH} --cfg-opt

## Distillation

- To test distillation method, you can use following command
+ To test distillation method, you can use the following command

```bash
python tools/${task}/test_${task}.py ${CONFIG_FILE} ${CHECKPOINT_PATH} [optional arguments]
```
2 changes: 1 addition & 1 deletion docs/en/train.md
@@ -35,7 +35,7 @@ python tools/${task}/train_${task}.py ${CONFIG_FILE} --cfg-options algorithm.mut

## Pruning

- Pruning has the four steps, including **supernet pre-training**, **search for subnet on the trained supernet**, **subnet retraining** and **split checkpoint**. The command of first two steps are similar to NAS, except here we need to use `CONFIG_FILE` of Pruning. Commands of two other steps are as follows.
+ Pruning has four steps, including **supernet pre-training**, **search for subnet on the trained supernet**, **subnet retraining** and **split checkpoint**. The commands of the first two steps are similar to NAS, except that we need to use `CONFIG_FILE` of Pruning here. The commands of the two other steps are as follows.

### Subnet Retraining

10 changes: 5 additions & 5 deletions docs/en/tutorials/Tutorial_1_overview.md
@@ -12,7 +12,7 @@ It is a part of the [OpenMMLab](https://openmmlab.com/) project.
## Major features:
- **Compatibility**

- MMRazor can be easily applied to various projects in OpenMMLab, due to similar architecture design of OpenMMLab as well as the decoupling of slimming algorithms and vision tasks.
+ MMRazor can be easily applied to various projects in OpenMMLab, due to the similar architecture design of OpenMMLab as well as the decoupling of slimming algorithms and vision tasks.

- **Flexibility**

@@ -26,11 +26,11 @@ It is a part of the [OpenMMLab](https://openmmlab.com/) project.

![design_and_implement](../imgs/tutorials/overview/design_and_implement.png)

- In terms of overall design, MMRazor mainly includes Component and Algorithm.
+ In terms of the overall design, MMRazor mainly includes Component and Algorithm.

- Component can be divided into basic component and algorithm component. Basic
+ Component can be divided into basic component and algorithm component. The basic
component consists of searcher, OP, Mutables and other modules in the figure,
- which provide basic function support for algorithm component. Algorithm component
+ which provides basic function support for algorithm component. Algorithm component
consists of Mutator, Pruner, Distiller and other modules in the figure. They
provide core functionality for implementing various lightweight algorithms.
The combination of Algorithm and Application can realize the purpose of slimming
@@ -63,7 +63,7 @@ MMRazor consists of 4 main parts: 1) apis, 2) core, 3) models, 4) datasets. mode

- **Mutator**: Core functions provider of different types of NAS, mainly include some functions of changing the structure of architecture.
- **Pruner**: Core functions provider of different types of pruning, mainly includes some functions of changing the structure of architecture and getting channel group.
- **Distiller**: Core functions provider of different types of KD, mainly includes functions of registering some forward hooks, calculate the kd-loss and so on.
- **Quantizer**: Core functions provider of different types of quantization. It will come soon.

- **Base components**: Core components of architecture and some algorithm components
2 changes: 1 addition & 1 deletion docs/en/tutorials/Tutorial_2_learn_about_configs.md
@@ -28,7 +28,7 @@ We follow the below convention to name config files. Contributors are advised to

Same as MMDetection, we incorporate modular and inheritance design into our config system, which is convenient to conduct various experiments.

- To help the users have a basic idea of a complete config and the modules in a generation system, we make brief comments on the configs of some examples as the following. For more detailed usage and the corresponding alternative for each modules, please refer to the API documentation.
+ To help the users have a basic idea of a complete config and the modules in a generation system, we make brief comments on the configs of some examples as the following. For more detailed usage and the corresponding alternative for each module, please refer to the API documentation.
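
To make the inheritance mechanism concrete, here is a schematic sketch of such a config. It is illustrative only: the `_base_` file names are hypothetical and the `algorithm` fields are abbreviated, so it should not be read as a shipped MMRazor config.

```python
# Schematic OpenMMLab-style config (hypothetical file names, abbreviated
# fields; not a shipped MMRazor config). Configs are plain Python files;
# `_base_` lists parent configs whose settings are merged into this one,
# so an experiment only restates the fields it changes.
_base_ = [
    './datasets/imagenet_example.py',   # hypothetical dataset settings
    './schedules/schedule_example.py',  # hypothetical schedule settings
]

# An MMRazor config wraps the task model (the "architecture") together
# with the algorithm components the method needs, e.g. a mutator for NAS.
algorithm = dict(
    type='SPOS',
    architecture=dict(
        type='MMClsArchitecture',
        model={{_base_.model}}),  # reference a field defined in a base config
    mutator=dict(type='OneShotMutator'))
```

Values inherited this way can still be overridden at launch time with `--cfg-options`, as the train/test commands earlier in this commit show.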

### An example of NAS - spos

2 changes: 1 addition & 1 deletion docs/en/tutorials/Tutorial_3_customize_architectures.md
@@ -8,7 +8,7 @@ Here we show how to add a new searchable backbone with an example of searchable_

1. Define a new backbone

- Create a new file `mmrazor/models/architectures/components/backbones/searchable_shufflenet_v2.py`, class `SearchableShuffleNetV2` inherits from `BaseBackBone` of mmcls, which is the codebase that you will to build model.
+ Create a new file `mmrazor/models/architectures/components/backbones/searchable_shufflenet_v2.py`, class `SearchableShuffleNetV2` inherits from `BaseBackBone` of mmcls, which is the codebase that you will to build the model.

```python
import torch.nn as nn
```
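
The body of this file is collapsed in the diff view above. For orientation only, a minimal skeleton of such a backbone might look like the following sketch; it assumes mmcls's `BACKBONES` registry and `BaseBackbone` base class, and it is not the actual contents of `searchable_shufflenet_v2.py`.

```python
# Hedged sketch of a searchable-backbone skeleton (not the real file, whose
# body is collapsed above). Assumes the mmcls registry and base class.
import torch.nn as nn
from mmcls.models.builder import BACKBONES
from mmcls.models.backbones.base_backbone import BaseBackbone


@BACKBONES.register_module()
class SearchableShuffleNetV2(BaseBackbone):

    def __init__(self, widen_factor=1.0, init_cfg=None):
        super().__init__(init_cfg)
        # A real implementation declares searchable blocks here, e.g. as
        # placeholder modules that a mutator later fills with concrete ops.
        self.stem = nn.Conv2d(3, int(24 * widen_factor), 3, stride=2, padding=1)

    def forward(self, x):
        # A real backbone returns a tuple of multi-stage feature maps.
        return (self.stem(x), )
```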
@@ -1,10 +1,10 @@
# Tutorial 7: Customize mixed algorithms with our algorithm components

Here we show how to customize mixed algorithms with our algorithm components. We take the slimmable training in autoslim as an example.

- The sandwich rule and inplace distillation was introduced to enhance training process. The sandwich rule means that we train the model at smallest width, largest width and (n − 2) random widths, instead of n random widths. By inplace distillation, we use the predicted label of the model at the largest width as the training label for other widths, while for the largest width we use ground truth. So both the KD algorithm and the pruning algorithm are used in slimmable training.
+ The sandwich rule and inplace distillation were introduced to enhance the training process. The sandwich rule means that we train the model at smallest width, largest width and (n − 2) random widths, instead of n random widths. By inplace distillation, we use the predicted label of the model at the largest width as the training label for other widths, while for the largest width we use ground truth. So both the KD algorithm and the pruning algorithm are used in slimmable training.
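
Because this paragraph is the crux of the scheme, here is a self-contained toy sketch of the sandwich rule with inplace distillation. Everything in it (the model, names, numbers) is invented for illustration; it is not MMRazor's AutoSlim code, which the tutorial goes on to rewrite in `train_step`.

```python
# Toy sketch of the sandwich rule + inplace distillation (illustrative
# only; not MMRazor's AutoSlim implementation).
import random
import torch
import torch.nn as nn
import torch.nn.functional as F

class SlimmableMLP(nn.Module):
    """Toy slimmable model: the hidden layer runs at a fraction of its
    full width by masking the trailing channels."""
    def __init__(self, in_dim=16, hidden=32, num_classes=4):
        super().__init__()
        self.fc1 = nn.Linear(in_dim, hidden)
        self.fc2 = nn.Linear(hidden, num_classes)
        self.width_mult = 1.0  # fraction of hidden channels kept active

    def forward(self, x):
        h = F.relu(self.fc1(x))
        keep = max(1, int(h.size(1) * self.width_mult))
        mask = torch.zeros_like(h)
        mask[:, :keep] = 1.0
        return self.fc2(h * mask)

model = SlimmableMLP()
opt = torch.optim.SGD(model.parameters(), lr=0.1)
x, y = torch.randn(8, 16), torch.randint(0, 4, (8,))

n = 4  # widths per step: smallest, largest and (n - 2) random ones
opt.zero_grad()

# Largest width: trained on the ground-truth labels; its detached
# predictions become the soft labels for every other width.
model.width_mult = 1.0
logits_max = model(x)
F.cross_entropy(logits_max, y).backward()
soft = logits_max.detach().softmax(dim=1)

# Smallest width and (n - 2) random widths: trained against the largest
# width's predictions instead of the labels (inplace distillation).
for w in [0.25] + [random.uniform(0.25, 1.0) for _ in range(n - 2)]:
    model.width_mult = w
    log_p = model(x).log_softmax(dim=1)
    F.kl_div(log_p, soft, reduction='batchmean').backward()

opt.step()  # one step applies the gradients accumulated over all widths
```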

- 1. In distillation part, we can directly use SelfDistiller in `mmrazor/models/distillers/self_distiller.py`. If distillers provided in MMRazor don't meet your needs, you can develop new algorithm components for your algorithm as step2 in Tutorial 6.
+ 1. In the distillation part, we can directly use SelfDistiller in `mmrazor/models/distillers/self_distiller.py`. If distillers provided in MMRazor don't meet your needs, you can develop new algorithm components for your algorithm as step2 in Tutorial 6.

2. As the slimmable training is the first step of `Autoslim`, we do not need to register a new algorithm, but rewrite the `train_step` function in AutoSlim as follows:

