
New model.yaml activation: field #9371

Merged
merged 10 commits into master from update/yaml_activation on Sep 15, 2022

Conversation

glenn-jocher (Member) commented Sep 11, 2022

Add an optional model.yaml activation: field to define model-wide activations, e.g.:

```yaml
activation: nn.LeakyReLU(0.1)  # activation with arguments
activation: nn.SiLU()  # activation with no arguments
```

Signed-off-by: Glenn Jocher <glenn.jocher@ultralytics.com>
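
How this works under the hood: the activation string from model.yaml is evaluated once when the model is parsed and installed as the class-level default for every Conv block. Below is a minimal, self-contained sketch of that mechanism; `apply_model_activation` is an illustrative name rather than the actual parser hook, and the Conv signature is simplified:

```python
import torch.nn as nn

class Conv(nn.Module):
    """Standard convolution block: Conv2d + BatchNorm + activation."""

    # Class attribute: the model-wide default activation (SiLU unless overridden).
    default_act = nn.SiLU()

    def __init__(self, c1, c2, k=1, s=1, act=True):
        super().__init__()
        self.conv = nn.Conv2d(c1, c2, k, s, padding=k // 2, bias=False)
        self.bn = nn.BatchNorm2d(c2)
        # act=True -> shared default; an nn.Module -> used directly; else identity.
        self.act = self.default_act if act is True else act if isinstance(act, nn.Module) else nn.Identity()

    def forward(self, x):
        return self.act(self.bn(self.conv(x)))

def apply_model_activation(d):
    """Illustrative parse-time hook: d is the dict loaded from model.yaml."""
    if act := d.get("activation"):
        Conv.default_act = eval(act)  # e.g. "nn.SiLU()" or "nn.LeakyReLU(0.1)"

# Override once before the model is built; every new Conv picks it up.
apply_model_activation({"activation": "nn.LeakyReLU(0.1)"})
assert isinstance(Conv(32, 64, k=3).act, nn.LeakyReLU)
```

Because the default lives on the class rather than on each instance, a single assignment changes every layer constructed afterwards, which is what makes the one-line YAML field sufficient.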

πŸ› οΈ PR Summary

Made with ❀️ by Ultralytics Actions

🌟 Summary

Enhanced flexibility in activation functions and depth-wise convolutions in YOLOv5 models.

πŸ“Š Key Changes

  • Introduced a preset default activation function (SiLU) for the Conv class.
  • Modified the DWConv class to support dilation, making it more versatile (see the sketch after this list).
  • Added a new YOLOv5s configuration file utilizing the LeakyReLU activation function.
  • Implemented the ability to set a custom activation function globally for YOLOv5 models through the model configuration YAML.
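
A self-contained sketch of the DWConv dilation change; the gcd-based grouping and the "same"-padding helper follow the description above, but the exact signature here is an assumption:

```python
import math
import torch.nn as nn

def autopad(k, p=None, d=1):
    """'Same'-shape padding; a dilated k-kernel has effective size d*(k-1)+1."""
    if d > 1:
        k = d * (k - 1) + 1
    return k // 2 if p is None else p

class DWConv(nn.Module):
    """Depth-wise convolution (groups = gcd(c1, c2)), now accepting dilation d."""

    def __init__(self, c1, c2, k=3, s=1, d=1):
        super().__init__()
        self.conv = nn.Conv2d(c1, c2, k, s, autopad(k, None, d),
                              groups=math.gcd(c1, c2), dilation=d, bias=False)

    def forward(self, x):
        return self.conv(x)

# d=2 turns a 3x3 kernel into an effective 5x5 receptive field at the same cost.
```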

🎯 Purpose & Impact

  • πŸŽ› The default SiLU activation simplifies the construction of Conv layers by establishing a consistently used non-linearity.
  • βš™οΈ Depth-wise convolutions now have the added flexibility of specifying dilation, thereby enhancing the ability to control feature map resolution during the learning process.
  • πŸ”Œ The introduction of a new YOLOv5s model using LeakyReLU provides users with a pre-configured alternative if it better suits their use case.
  • πŸ“ With the update, developers can easily swap activation functions across the entire model, promoting experimentation and potentially leading to performance improvements.
  • πŸ‘©β€πŸ’» Users benefit from increased model customization abilities, potentially leading to better accuracy, training speed, or inference efficiency, depending on the task at hand.

glenn-jocher (Member Author)

Refer to ultralytics/hub#97

glenn-jocher (Member Author) commented Sep 15, 2022

@AyushExel tracking the YOLOv5m DetectionModel activation study at https://wandb.ai/glenn-jocher/YOLOv5m-study-activations

Note: yolov5m-SiLU is unfortunately lagging on CUDA device 5, running about half as fast as on the other devices.

glenn-jocher merged commit a1e5f9a into master on Sep 15, 2022
glenn-jocher deleted the update/yaml_activation branch on September 15, 2022 at 22:55