
Update the docs to reflect the variable_setup #1424

Merged · 6 commits · Aug 28, 2023
36 changes: 35 additions & 1 deletion documentation/source/configuration_files.md
@@ -132,14 +132,15 @@ In the experiment directory a `.hydra` subdirectory will be created. The configu
Two Hydra features worth mentioning are _YAML Composition_ and _Command-Line Overrides_.

#### YAML Composition
If you brows the YAML files in the `recipes` directory you will see some file containing the saved-key `defaults:` at the beginning of the file.
If you browse the YAML files in the `recipes` directory, you will see that some files contain the reserved key `defaults:` at the beginning of the file.
```yaml
defaults:
- training_hyperparams: cifar10_resnet_train_params
- dataset_params: cifar10_dataset_params
- arch_params: resnet18_cifar_arch_params
- checkpoint_params: default_checkpoint_params
- _self_
- variable_setup

```
The YAML file containing this header will inherit the configuration of the above files. So when building a training recipe, one can structure
@@ -161,6 +162,39 @@ initial learning-rate. This feature is extremely useful when experimenting wit
Note that the arguments are referenced without the `--` prefix, and that each parameter is referenced by its full path in the
configuration tree, with the levels joined by `.`.
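For example (a sketch only; the `train_from_recipe` entry point and the `cifar10_resnet` recipe name are assumptions here, and the values are illustrative):

```bash
# Launch the cifar10_resnet recipe, overriding the initial learning rate
# and the maximum number of epochs from the command line:
python -m super_gradients.train_from_recipe --config-name=cifar10_resnet \
    training_hyperparams.initial_lr=0.02 \
    training_hyperparams.max_epochs=100
```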

##### Command-Line Override Shortcuts

Although you can override any parameter from the command line, writing out the full path of each parameter can be tedious.
For example, to change the learning rate one would have to write `training_hyperparams.initial_lr=0.02`, and to change the batch size one would have to write
`dataset_params.train_dataloader_params.batch_size=128 dataset_params.val_dataloader_params.batch_size=128`.

To make this easier, we have defined a few shortcuts for the most common parameters, which reduce the amount of typing required (see the usage example after the recipe snippet below):

* Learning rate: `lr=0.02` (same as `training_hyperparams.initial_lr=0.02`)
* Batch size: `bs=128` (same as `dataset_params.train_dataloader_params.batch_size=128 dataset_params.val_dataloader_params.batch_size=128`)
* Number of train epochs: `epochs=100` (same as `training_hyperparams.max_epochs=100`)
* Number of workers: `num_workers=4` (same as `dataset_params.train_dataloader_params.num_workers=4 dataset_params.val_dataloader_params.num_workers=4`)
* Resume training for a specific experiment: `resume=True` (same as `training_hyperparams.resume=True`)
* Enable or disable EMA: `ema=true` (same as `training_hyperparams.ema=true`)

To use these shortcuts, `variable_setup` must be part of the Hydra defaults list in the recipe file.
Please note that `variable_setup` **must be the last item** in the defaults list:

```yaml
defaults:
- training_hyperparams: cifar10_resnet_train_params
- dataset_params: cifar10_dataset_params
- arch_params: resnet18_cifar_arch_params
- checkpoint_params: default_checkpoint_params
- _self_
- variable_setup

...
```
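With `variable_setup` in place, shortcuts can be combined with regular full-path overrides in a single command. A sketch (same assumed entry point and recipe name as above, illustrative values):

```bash
# Equivalent to spelling out the full training_hyperparams.* and
# dataset_params.* paths, but much shorter:
python -m super_gradients.train_from_recipe --config-name=cifar10_resnet \
    lr=0.02 bs=128 epochs=100 num_workers=4 ema=true
```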

More information can be found in the corresponding YAML file in the `recipes` directory:
https://github.com/Deci-AI/super-gradients/blob/master/src/super_gradients/recipes/variable_setup.yaml

## Resolvers
Resolvers convert strings from the YAML file into Python objects or values. The most basic resolvers are the Hydra native resolvers.
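For instance (a minimal illustration; the keys are hypothetical, but `${...}` interpolation and the `oc.env` resolver are built into OmegaConf/Hydra):

```yaml
num_classes: 10
arch_params:
  num_classes: ${num_classes}       # interpolation: resolves to 10 at load time
ckpt_root_dir: ${oc.env:CKPT_DIR}   # reads the CKPT_DIR environment variable
```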