From 5cfb9952c9c037557f501bf59bdb997dba9ca1ef Mon Sep 17 00:00:00 2001
From: UltralyticsAssistant
Date: Wed, 10 Jan 2024 08:35:11 +0000
Subject: [PATCH 1/2] Auto-format by Ultralytics actions

---
 utils/flask_rest_api/README.md  |  7 ++-----
 utils/loggers/clearml/README.md | 23 ++++++++++-------------
 utils/loggers/comet/README.md   | 26 +++++++++-----------------
 3 files changed, 21 insertions(+), 35 deletions(-)

diff --git a/utils/flask_rest_api/README.md b/utils/flask_rest_api/README.md
index a726acbd9204..b18a3011cf32 100644
--- a/utils/flask_rest_api/README.md
+++ b/utils/flask_rest_api/README.md
@@ -1,8 +1,6 @@
 # Flask REST API
 
-[REST](https://en.wikipedia.org/wiki/Representational_state_transfer) [API](https://en.wikipedia.org/wiki/API)s are
-commonly used to expose Machine Learning (ML) models to other services. This folder contains an example REST API
-created using Flask to expose the YOLOv5s model from [PyTorch Hub](https://pytorch.org/hub/ultralytics_yolov5/).
+[REST](https://en.wikipedia.org/wiki/Representational_state_transfer) [API](https://en.wikipedia.org/wiki/API)s are commonly used to expose Machine Learning (ML) models to other services. This folder contains an example REST API created using Flask to expose the YOLOv5s model from [PyTorch Hub](https://pytorch.org/hub/ultralytics_yolov5/).
 
 ## Requirements
 
@@ -69,5 +67,4 @@ The model inference results are returned as a JSON response:
 ]
 ```
 
-An example python script to perform inference using [requests](https://docs.python-requests.org/en/master/) is given
-in `example_request.py`
+An example Python script to perform inference using [requests](https://docs.python-requests.org/en/master/) is given in `example_request.py`.
diff --git a/utils/loggers/clearml/README.md b/utils/loggers/clearml/README.md
index ca41c040193c..bc40919ab0ea 100644
--- a/utils/loggers/clearml/README.md
+++ b/utils/loggers/clearml/README.md
@@ -34,15 +34,15 @@ Either sign up for free to the [ClearML Hosted Service](https://cutt.ly/yolov5-t
 
 1. Install the `clearml` python package:
 
-   ```bash
-   pip install clearml
-   ```
+   ```bash
+   pip install clearml
+   ```
 
-1. Connect the ClearML SDK to the server by [creating credentials](https://app.clear.ml/settings/workspace-configuration) (go right top to Settings -> Workspace -> Create new credentials), then execute the command below and follow the instructions:
+2. Connect the ClearML SDK to the server by [creating credentials](https://app.clear.ml/settings/workspace-configuration) (go right top to Settings -> Workspace -> Create new credentials), then execute the command below and follow the instructions:
 
-   ```bash
-   clearml-init
-   ```
+   ```bash
+   clearml-init
+   ```
 
 That's it! You're done 😎
 
@@ -58,8 +58,7 @@ pip install clearml>=1.2.0
 
 This will enable integration with the YOLOv5 training script. Every training run from now on will be captured and stored by the ClearML experiment manager.
 
-If you want to change the `project_name` or `task_name`, use the `--project` and `--name` arguments of the `train.py` script, by default the project will be called `YOLOv5` and the task `Training`.
-PLEASE NOTE: ClearML uses `/` as a delimiter for subprojects, so be careful when using `/` in your project name!
+If you want to change the `project_name` or `task_name`, use the `--project` and `--name` arguments of the `train.py` script; by default, the project will be called `YOLOv5` and the task `Training`. PLEASE NOTE: ClearML uses `/` as a delimiter for subprojects, so be careful when using `/` in your project name!
 ```bash
 python train.py --img 640 --batch 16 --epochs 3 --data coco128.yaml --weights yolov5s.pt --cache
@@ -86,8 +85,7 @@ This will capture:
 - Validation images per epoch
 - ...
 
-That's a lot right? 🤯
-Now, we can visualize all of this information in the ClearML UI to get an overview of our training progress. Add custom columns to the table view (such as e.g. mAP_0.5) so you can easily sort on the best performing model. Or select multiple experiments and directly compare them!
+That's a lot, right? 🤯 Now, we can visualize all of this information in the ClearML UI to get an overview of our training progress. Add custom columns to the table view (such as mAP_0.5) so you can easily sort on the best-performing model. Or select multiple experiments and directly compare them!
 
 There's even more we can do with all of this information, like hyperparameter optimization and remote execution, so keep reading if you want to see how that works!
 
@@ -181,8 +179,7 @@ python utils/loggers/clearml/hpo.py
 
 ## 🤯 Remote Execution (advanced)
 
-Running HPO locally is really handy, but what if we want to run our experiments on a remote machine instead? Maybe you have access to a very powerful GPU machine on-site, or you have some budget to use cloud GPUs.
-This is where the ClearML Agent comes into play. Check out what the agent can do here:
+Running HPO locally is really handy, but what if we want to run our experiments on a remote machine instead? Maybe you have access to a very powerful GPU machine on-site, or you have some budget to use cloud GPUs. This is where the ClearML Agent comes into play. Check out what the agent can do here:
 
 - [YouTube video](https://youtu.be/MX3BrXnaULs)
 - [Documentation](https://clear.ml/docs/latest/docs/clearml_agent)

diff --git a/utils/loggers/comet/README.md b/utils/loggers/comet/README.md
index 3ad52b01b4e9..52f344dba684 100644
--- a/utils/loggers/comet/README.md
+++ b/utils/loggers/comet/README.md
@@ -8,8 +8,7 @@ This guide will cover how to use YOLOv5 with [Comet](https://bit.ly/yolov5-readm
 
 Comet builds tools that help data scientists, engineers, and team leaders accelerate and optimize machine learning and deep learning models.
 
-Track and visualize model metrics in real time, save your hyperparameters, datasets, and model checkpoints, and visualize your model predictions with [Comet Custom Panels](https://www.comet.com/docs/v2/guides/comet-dashboard/code-panels/about-panels/?utm_source=yolov5&utm_medium=partner&utm_campaign=partner_yolov5_2022&utm_content=github)!
-Comet makes sure you never lose track of your work and makes it easy to share results and collaborate across teams of all sizes!
+Track and visualize model metrics in real time, save your hyperparameters, datasets, and model checkpoints, and visualize your model predictions with [Comet Custom Panels](https://www.comet.com/docs/v2/guides/comet-dashboard/code-panels/about-panels/?utm_source=yolov5&utm_medium=partner&utm_campaign=partner_yolov5_2022&utm_content=github)! Comet makes sure you never lose track of your work and makes it easy to share results and collaborate across teams of all sizes!
 
 # Getting Started
 
@@ -84,8 +83,7 @@ By default, Comet will log the following items
 
 # Configure Comet Logging
 
-Comet can be configured to log additional data either through command line flags passed to the training script
-or through environment variables.
+Comet can be configured to log additional data either through command line flags passed to the training script or through environment variables.
 ```shell
 export COMET_MODE=online # Set whether to run Comet in 'online' or 'offline' mode. Defaults to online
@@ -100,8 +98,7 @@ export COMET_LOG_PREDICTIONS=true # Set this to false to disable logging model p
 
 ## Logging Checkpoints with Comet
 
-Logging Models to Comet is disabled by default. To enable it, pass the `save-period` argument to the training script. This will save the
-logged checkpoints to Comet based on the interval value provided by `save-period`
+Logging Models to Comet is disabled by default. To enable it, pass the `save-period` argument to the training script. This will save the logged checkpoints to Comet based on the interval value provided by `save-period`.
 
 ```shell
 python train.py \
@@ -176,14 +173,11 @@ python train.py \
   --upload_dataset
 ```
 
-You can find the uploaded dataset in the Artifacts tab in your Comet Workspace
-artifact-1
+You can find the uploaded dataset in the Artifacts tab in your Comet Workspace artifact-1
 
-You can preview the data directly in the Comet UI.
-artifact-2
+You can preview the data directly in the Comet UI. artifact-2
 
-Artifacts are versioned and also support adding metadata about the dataset. Comet will automatically log the metadata from your dataset `yaml` file
-artifact-3
+Artifacts are versioned and also support adding metadata about the dataset. Comet will automatically log the metadata from your dataset `yaml` file artifact-3
 
 ### Using a saved Artifact
 
@@ -205,8 +199,7 @@ python train.py \
   --weights yolov5s.pt
 ```
 
-Artifacts also allow you to track the lineage of data as it flows through your Experimentation workflow. Here you can see a graph that shows you all the experiments that have used your uploaded dataset.
-artifact-4
+Artifacts also allow you to track the lineage of data as it flows through your Experimentation workflow. Here you can see a graph that shows you all the experiments that have used your uploaded dataset. artifact-4
 
 ## Resuming a Training Run
 
 If your training run is interrupted for any reason, e.g. disrupted internet connection, you can resume the run using the `resume` flag and the Comet Run Path.
 
 The Run Path has the following format `comet:////`.
 
-This will restore the run to its state before the interruption, which includes restoring the model from a checkpoint, restoring all hyperparameters and training arguments and downloading Comet dataset Artifacts if they were used in the original run. The resumed run will continue logging to the existing Experiment in the Comet UI
+This will restore the run to its state before the interruption, which includes restoring the model from a checkpoint, restoring all hyperparameters and training arguments and downloading Comet dataset Artifacts if they were used in the original run. The resumed run will continue logging to the existing Experiment in the Comet UI.
 
 ```shell
 python train.py \
@@ -234,8 +227,7 @@ python utils/loggers/comet/hpo.py \
   --comet_optimizer_config "utils/loggers/comet/optimizer_config.json"
 ```
 
-The `hpo.py` script accepts the same arguments as `train.py`. If you wish to pass additional arguments to your sweep simply add them after
-the script.
+The `hpo.py` script accepts the same arguments as `train.py`. If you wish to pass additional arguments to your sweep, simply add them after the script.
 ```shell
 python utils/loggers/comet/hpo.py \

From 7f51632258308b96ce0aae77751125394c439a51 Mon Sep 17 00:00:00 2001
From: RizwanMunawar
Date: Wed, 10 Jan 2024 13:36:47 +0500
Subject: [PATCH 2/2] updated git banner

---
 README.md       | 4 ++--
 README.zh-CN.md | 4 ++--
 2 files changed, 4 insertions(+), 4 deletions(-)

diff --git a/README.md b/README.md
index c778a17258e9..c34668463697 100644
--- a/README.md
+++ b/README.md
@@ -1,7 +1,7 @@
 <div align="center">