From ec8c3ff8d1dfa9864fbde5fd6f909f1a44833a9f Mon Sep 17 00:00:00 2001 From: Glenn Jocher Date: Fri, 12 Jan 2024 18:33:25 +0100 Subject: [PATCH] Update HTTP to HTTPS (#7548) Signed-off-by: Glenn Jocher --- README.md | 6 ++--- README.zh-CN.md | 6 ++--- docs/ar/tasks/detect.md | 2 +- docs/ar/tasks/pose.md | 2 +- docs/ar/tasks/segment.md | 2 +- docs/de/tasks/detect.md | 2 +- docs/de/tasks/pose.md | 2 +- docs/de/tasks/segment.md | 2 +- docs/en/datasets/classify/imagenet.md | 2 +- docs/en/datasets/classify/imagenette.md | 2 +- docs/en/datasets/detect/globalwheat2020.md | 4 ++-- docs/en/robots.txt | 22 +++++++++---------- docs/en/tasks/detect.md | 2 +- docs/en/tasks/pose.md | 2 +- docs/en/tasks/segment.md | 2 +- .../tips_for_best_training_results.md | 2 +- docs/en/yolov5/tutorials/train_custom_data.md | 2 +- docs/es/tasks/detect.md | 2 +- docs/es/tasks/pose.md | 2 +- docs/es/tasks/segment.md | 2 +- docs/fr/tasks/detect.md | 2 +- docs/fr/tasks/pose.md | 2 +- docs/fr/tasks/segment.md | 2 +- docs/hi/tasks/detect.md | 2 +- docs/hi/tasks/pose.md | 2 +- docs/hi/tasks/segment.md | 2 +- docs/ja/tasks/detect.md | 2 +- docs/ja/tasks/pose.md | 2 +- docs/ja/tasks/segment.md | 2 +- docs/ko/tasks/detect.md | 2 +- docs/ko/tasks/pose.md | 2 +- docs/ko/tasks/segment.md | 2 +- docs/pt/tasks/detect.md | 2 +- docs/pt/tasks/pose.md | 2 +- docs/pt/tasks/segment.md | 2 +- docs/ru/tasks/detect.md | 2 +- docs/ru/tasks/pose.md | 2 +- docs/ru/tasks/segment.md | 2 +- docs/zh/tasks/detect.md | 2 +- docs/zh/tasks/pose.md | 2 +- docs/zh/tasks/segment.md | 2 +- tests/test_python.py | 2 +- ultralytics/cfg/datasets/Argoverse.yaml | 2 +- ultralytics/cfg/datasets/GlobalWheat2020.yaml | 2 +- ultralytics/cfg/datasets/coco-pose.yaml | 2 +- ultralytics/cfg/datasets/coco.yaml | 2 +- ultralytics/data/scripts/get_coco.sh | 2 +- 47 files changed, 62 insertions(+), 62 deletions(-) diff --git a/README.md b/README.md index 365113131b8..e20897ee10e 100644 --- a/README.md +++ b/README.md @@ -131,7 +131,7 @@ See [Detection Docs](https://docs.ultralytics.com/tasks/detect/) for usage examp | [YOLOv8l](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l.pt) | 640 | 52.9 | 375.2 | 2.39 | 43.7 | 165.2 | | [YOLOv8x](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x.pt) | 640 | 53.9 | 479.1 | 3.53 | 68.2 | 257.8 | -- **mAPval** values are for single-model single-scale on [COCO val2017](http://cocodataset.org) dataset.
Reproduce by `yolo val detect data=coco.yaml device=0` +- **mAPval** values are for single-model single-scale on [COCO val2017](https://cocodataset.org) dataset.
Reproduce by `yolo val detect data=coco.yaml device=0` - **Speed** averaged over COCO val images using an [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/) instance.
Reproduce by `yolo val detect data=coco.yaml batch=1 device=0|cpu` @@ -165,7 +165,7 @@ See [Segmentation Docs](https://docs.ultralytics.com/tasks/segment/) for usage e | [YOLOv8l-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l-seg.pt) | 640 | 52.3 | 42.6 | 572.4 | 2.79 | 46.0 | 220.5 | | [YOLOv8x-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-seg.pt) | 640 | 53.4 | 43.4 | 712.1 | 4.02 | 71.8 | 344.1 | -- **mAPval** values are for single-model single-scale on [COCO val2017](http://cocodataset.org) dataset.
Reproduce by `yolo val segment data=coco-seg.yaml device=0` +- **mAPval** values are for single-model single-scale on [COCO val2017](https://cocodataset.org) dataset.
Reproduce by `yolo val segment data=coco-seg.yaml device=0` - **Speed** averaged over COCO val images using an [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/) instance.
Reproduce by `yolo val segment data=coco-seg.yaml batch=1 device=0|cpu` @@ -183,7 +183,7 @@ See [Pose Docs](https://docs.ultralytics.com/tasks/pose/) for usage examples wit | [YOLOv8x-pose](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose.pt) | 640 | 69.2 | 90.2 | 1607.1 | 3.73 | 69.4 | 263.2 | | [YOLOv8x-pose-p6](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose-p6.pt) | 1280 | 71.6 | 91.2 | 4088.7 | 10.04 | 99.1 | 1066.4 | -- **mAPval** values are for single-model single-scale on [COCO Keypoints val2017](http://cocodataset.org) dataset.
Reproduce by `yolo val pose data=coco-pose.yaml device=0` +- **mAPval** values are for single-model single-scale on [COCO Keypoints val2017](https://cocodataset.org) dataset.
Reproduce by `yolo val pose data=coco-pose.yaml device=0` - **Speed** averaged over COCO val images using an [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/) instance.
Reproduce by `yolo val pose data=coco-pose.yaml batch=1 device=0|cpu` diff --git a/README.zh-CN.md b/README.zh-CN.md index eec0a468fd5..36a655175ff 100644 --- a/README.zh-CN.md +++ b/README.zh-CN.md @@ -133,7 +133,7 @@ Ultralytics 提供了 YOLOv8 的交互式笔记本,涵盖训练、验证、跟 | [YOLOv8l](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l.pt) | 640 | 52.9 | 375.2 | 2.39 | 43.7 | 165.2 | | [YOLOv8x](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x.pt) | 640 | 53.9 | 479.1 | 3.53 | 68.2 | 257.8 | -- **mAPval** 值是基于单模型单尺度在 [COCO val2017](http://cocodataset.org) 数据集上的结果。
通过 `yolo val detect data=coco.yaml device=0` 复现 +- **mAPval** 值是基于单模型单尺度在 [COCO val2017](https://cocodataset.org) 数据集上的结果。
通过 `yolo val detect data=coco.yaml device=0` 复现 - **速度** 是使用 [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/) 实例对 COCO val 图像进行平均计算的。
通过 `yolo val detect data=coco.yaml batch=1 device=0|cpu` 复现 @@ -167,7 +167,7 @@ Ultralytics 提供了 YOLOv8 的交互式笔记本,涵盖训练、验证、跟 | [YOLOv8l-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l-seg.pt) | 640 | 52.3 | 42.6 | 572.4 | 2.79 | 46.0 | 220.5 | | [YOLOv8x-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-seg.pt) | 640 | 53.4 | 43.4 | 712.1 | 4.02 | 71.8 | 344.1 | -- **mAPval** 值是基于单模型单尺度在 [COCO val2017](http://cocodataset.org) 数据集上的结果。
通过 `yolo val segment data=coco-seg.yaml device=0` 复现 +- **mAPval** 值是基于单模型单尺度在 [COCO val2017](https://cocodataset.org) 数据集上的结果。
通过 `yolo val segment data=coco-seg.yaml device=0` 复现 - **速度** 是使用 [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/) 实例对 COCO val 图像进行平均计算的。
通过 `yolo val segment data=coco-seg.yaml batch=1 device=0|cpu` 复现 @@ -185,7 +185,7 @@ Ultralytics 提供了 YOLOv8 的交互式笔记本,涵盖训练、验证、跟 | [YOLOv8x-pose](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose.pt) | 640 | 69.2 | 90.2 | 1607.1 | 3.73 | 69.4 | 263.2 | | [YOLOv8x-pose-p6](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose-p6.pt) | 1280 | 71.6 | 91.2 | 4088.7 | 10.04 | 99.1 | 1066.4 | -- **mAPval** 值是基于单模型单尺度在 [COCO Keypoints val2017](http://cocodataset.org) 数据集上的结果。
通过 `yolo val pose data=coco-pose.yaml device=0` 复现 +- **mAPval** 值是基于单模型单尺度在 [COCO Keypoints val2017](https://cocodataset.org) 数据集上的结果。
通过 `yolo val pose data=coco-pose.yaml device=0` 复现 - **速度** 是使用 [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/) 实例对 COCO val 图像进行平均计算的。
通过 `yolo val pose data=coco-pose.yaml batch=1 device=0|cpu` 复现 diff --git a/docs/ar/tasks/detect.md b/docs/ar/tasks/detect.md index 7410b7c9e9d..9e5974d342d 100644 --- a/docs/ar/tasks/detect.md +++ b/docs/ar/tasks/detect.md @@ -41,7 +41,7 @@ Task التعرف على الكائنات هو عبارة عن تعرف على | [YOLOv8l](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l.pt) | 640 | 52.9 | 375.2 | 2.39 | 43.7 | 165.2 | | [YOLOv8x](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x.pt) | 640 | 53.9 | 479.1 | 3.53 | 68.2 | 257.8 | -- قيم mAPval تنطبق على مقياس نموذج واحد-مقياس واحد على مجموعة بيانات [COCO val2017](http://cocodataset.org). +- قيم mAPval تنطبق على مقياس نموذج واحد-مقياس واحد على مجموعة بيانات [COCO val2017](https://cocodataset.org).
اعيد حسابها بواسطة `yolo val detect data=coco.yaml device=0` - السرعةتمت متوسطة على صور COCO val باستخدام [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/) instance. diff --git a/docs/ar/tasks/pose.md b/docs/ar/tasks/pose.md index b37ad1f41d4..7d5627fa8d9 100644 --- a/docs/ar/tasks/pose.md +++ b/docs/ar/tasks/pose.md @@ -40,7 +40,7 @@ keywords: Ultralytics، YOLO، YOLOv8، تقدير الوضعية ، كشف نق | [YOLOv8x-pose](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose.pt) | 640 | 69.2 | 90.2 | 1607.1 | 3.73 | 69.4 | 263.2 | | [YOLOv8x-pose-p6](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose-p6.pt) | 1280 | 71.6 | 91.2 | 4088.7 | 10.04 | 99.1 | 1066.4 | -- تعتبر القيم **mAPval** لنموذج واحد ومقياس واحد فقط على [COCO Keypoints val2017](http://cocodataset.org) +- تعتبر القيم **mAPval** لنموذج واحد ومقياس واحد فقط على [COCO Keypoints val2017](https://cocodataset.org) مجموعة البيانات.
يمكن إعادة إنتاجه بواسطة `yolo val pose data=coco-pose.yaml device=0` - يتم حساب **السرعة** من خلال متوسط صور COCO val باستخدام [المروحة الحرارية Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/) diff --git a/docs/ar/tasks/segment.md b/docs/ar/tasks/segment.md index 2cd14cbce7a..9a81e6bc3fb 100644 --- a/docs/ar/tasks/segment.md +++ b/docs/ar/tasks/segment.md @@ -41,7 +41,7 @@ keywords: yolov8 ، فصل الأشكال الفردية ، Ultralytics ، مج | [YOLOv8l-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l-seg.pt) | 640 | 52.3 | 42.6 | 572.4 | 2.79 | 46.0 | 220.5 | | [YOLOv8x-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-seg.pt) | 640 | 53.4 | 43.4 | 712.1 | 4.02 | 71.8 | 344.1 | -- تُستخدم قيم **mAPval** لنموذج واحد وحجم واحد على مجموعة بيانات [COCO val2017](http://cocodataset.org). +- تُستخدم قيم **mAPval** لنموذج واحد وحجم واحد على مجموعة بيانات [COCO val2017](https://cocodataset.org).
يمكن إعادة إنتاجها باستخدام `yolo val segment data=coco.yaml device=0` - **تُحسب السرعة** كمتوسط على صور COCO val باستخدام [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/) instance. diff --git a/docs/de/tasks/detect.md b/docs/de/tasks/detect.md index 151335ec142..7ebd7a67f02 100644 --- a/docs/de/tasks/detect.md +++ b/docs/de/tasks/detect.md @@ -41,7 +41,7 @@ Hier werden die vortrainierten YOLOv8 Detect Modelle gezeigt. Detect, Segment un | [YOLOv8l](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l.pt) | 640 | 52.9 | 375.2 | 2.39 | 43.7 | 165.2 | | [YOLOv8x](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x.pt) | 640 | 53.9 | 479.1 | 3.53 | 68.2 | 257.8 | -- **mAPval** Werte sind für Single-Modell Single-Scale auf dem [COCO val2017](http://cocodataset.org) Datensatz. +- **mAPval** Werte sind für Single-Modell Single-Scale auf dem [COCO val2017](https://cocodataset.org) Datensatz.
Reproduzieren mit `yolo val detect data=coco.yaml device=0` - **Geschwindigkeit** gemittelt über COCO Val Bilder mit einer [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/)-Instanz.
Reproduzieren mit `yolo val detect data=coco128.yaml batch=1 device=0|cpu` diff --git a/docs/de/tasks/pose.md b/docs/de/tasks/pose.md index b9d5ffae500..be2905852da 100644 --- a/docs/de/tasks/pose.md +++ b/docs/de/tasks/pose.md @@ -42,7 +42,7 @@ Hier werden vortrainierte YOLOv8 Pose-Modelle gezeigt. Erkennungs-, Segmentierun | [YOLOv8x-pose](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose.pt) | 640 | 69,2 | 90,2 | 1607,1 | 3,73 | 69,4 | 263,2 | | [YOLOv8x-pose-p6](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose-p6.pt) | 1280 | 71,6 | 91,2 | 4088,7 | 10,04 | 99,1 | 1066,4 | -- **mAPval** Werte gelten für ein einzelnes Modell mit einfacher Skala auf dem [COCO Keypoints val2017](http://cocodataset.org)-Datensatz. +- **mAPval** Werte gelten für ein einzelnes Modell mit einfacher Skala auf dem [COCO Keypoints val2017](https://cocodataset.org)-Datensatz.
Zu reproduzieren mit `yolo val pose data=coco-pose.yaml device=0`. - **Geschwindigkeit** gemittelt über COCO-Validierungsbilder mit einer [Amazon EC2 P4d](https://aws.amazon.com/de/ec2/instance-types/p4/)-Instanz.
Zu reproduzieren mit `yolo val pose data=coco8-pose.yaml batch=1 device=0|cpu`. diff --git a/docs/de/tasks/segment.md b/docs/de/tasks/segment.md index 3f476accc0e..36638b62162 100644 --- a/docs/de/tasks/segment.md +++ b/docs/de/tasks/segment.md @@ -41,7 +41,7 @@ Hier werden vortrainierte YOLOv8 Segment-Modelle gezeigt. Detect-, Segment- und | [YOLOv8l-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l-seg.pt) | 640 | 52.3 | 42.6 | 572.4 | 2.79 | 46.0 | 220.5 | | [YOLOv8x-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-seg.pt) | 640 | 53.4 | 43.4 | 712.1 | 4.02 | 71.8 | 344.1 | -- Die **mAPval**-Werte sind für ein einzelnes Modell, einzelne Skala auf dem [COCO val2017](http://cocodataset.org)-Datensatz. +- Die **mAPval**-Werte sind für ein einzelnes Modell, einzelne Skala auf dem [COCO val2017](https://cocodataset.org)-Datensatz.
Zum Reproduzieren nutzen Sie `yolo val segment data=coco.yaml device=0` - Die **Geschwindigkeit** ist über die COCO-Validierungsbilder gemittelt und verwendet eine [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/)-Instanz.
Zum Reproduzieren `yolo val segment data=coco128-seg.yaml batch=1 device=0|cpu` diff --git a/docs/en/datasets/classify/imagenet.md b/docs/en/datasets/classify/imagenet.md index be4ca2b49fb..6541c07031e 100644 --- a/docs/en/datasets/classify/imagenet.md +++ b/docs/en/datasets/classify/imagenet.md @@ -21,7 +21,7 @@ The ImageNet dataset is organized using the WordNet hierarchy. Each node in the ## ImageNet Large Scale Visual Recognition Challenge (ILSVRC) -The annual [ImageNet Large Scale Visual Recognition Challenge (ILSVRC)](http://image-net.org/challenges/LSVRC/) has been an important event in the field of computer vision. It has provided a platform for researchers and developers to evaluate their algorithms and models on a large-scale dataset with standardized evaluation metrics. The ILSVRC has led to significant advancements in the development of deep learning models for image classification, object detection, and other computer vision tasks. +The annual [ImageNet Large Scale Visual Recognition Challenge (ILSVRC)](https://image-net.org/challenges/LSVRC/) has been an important event in the field of computer vision. It has provided a platform for researchers and developers to evaluate their algorithms and models on a large-scale dataset with standardized evaluation metrics. The ILSVRC has led to significant advancements in the development of deep learning models for image classification, object detection, and other computer vision tasks. ## Applications diff --git a/docs/en/datasets/classify/imagenette.md b/docs/en/datasets/classify/imagenette.md index a43778f4750..df34c509662 100644 --- a/docs/en/datasets/classify/imagenette.md +++ b/docs/en/datasets/classify/imagenette.md @@ -6,7 +6,7 @@ keywords: ImageNette dataset, Ultralytics, YOLO, Image classification, Machine L # ImageNette Dataset -The [ImageNette](https://github.com/fastai/imagenette) dataset is a subset of the larger [Imagenet](http://www.image-net.org/) dataset, but it only includes 10 easily distinguishable classes. It was created to provide a quicker, easier-to-use version of Imagenet for software development and education. +The [ImageNette](https://github.com/fastai/imagenette) dataset is a subset of the larger [Imagenet](https://www.image-net.org/) dataset, but it only includes 10 easily distinguishable classes. It was created to provide a quicker, easier-to-use version of Imagenet for software development and education. ## Key Features diff --git a/docs/en/datasets/detect/globalwheat2020.md b/docs/en/datasets/detect/globalwheat2020.md index 8df4f65f6a5..7936ef2c141 100644 --- a/docs/en/datasets/detect/globalwheat2020.md +++ b/docs/en/datasets/detect/globalwheat2020.md @@ -6,7 +6,7 @@ keywords: Ultralytics, YOLO, Global Wheat Head Dataset, wheat head detection, pl # Global Wheat Head Dataset -The [Global Wheat Head Dataset](http://www.global-wheat.com/) is a collection of images designed to support the development of accurate wheat head detection models for applications in wheat phenotyping and crop management. Wheat heads, also known as spikes, are the grain-bearing parts of the wheat plant. Accurate estimation of wheat head density and size is essential for assessing crop health, maturity, and yield potential. The dataset, created by a collaboration of nine research institutes from seven countries, covers multiple growing regions to ensure models generalize well across different environments. 
+The [Global Wheat Head Dataset](https://www.global-wheat.com/) is a collection of images designed to support the development of accurate wheat head detection models for applications in wheat phenotyping and crop management. Wheat heads, also known as spikes, are the grain-bearing parts of the wheat plant. Accurate estimation of wheat head density and size is essential for assessing crop health, maturity, and yield potential. The dataset, created by a collaboration of nine research institutes from seven countries, covers multiple growing regions to ensure models generalize well across different environments. ## Key Features @@ -88,4 +88,4 @@ If you use the Global Wheat Head Dataset in your research or development work, p } ``` -We would like to acknowledge the researchers and institutions that contributed to the creation and maintenance of the Global Wheat Head Dataset as a valuable resource for the plant phenotyping and crop management research community. For more information about the dataset and its creators, visit the [Global Wheat Head Dataset website](http://www.global-wheat.com/). +We would like to acknowledge the researchers and institutions that contributed to the creation and maintenance of the Global Wheat Head Dataset as a valuable resource for the plant phenotyping and crop management research community. For more information about the dataset and its creators, visit the [Global Wheat Head Dataset website](https://www.global-wheat.com/). diff --git a/docs/en/robots.txt b/docs/en/robots.txt index 6d80eaeb3b4..c2f72fc57be 100644 --- a/docs/en/robots.txt +++ b/docs/en/robots.txt @@ -1,12 +1,12 @@ User-agent: * -Sitemap: http://docs.ultralytics.com/sitemap.xml -Sitemap: http://docs.ultralytics.com/ar/sitemap.xml -Sitemap: http://docs.ultralytics.com/de/sitemap.xml -Sitemap: http://docs.ultralytics.com/es/sitemap.xml -Sitemap: http://docs.ultralytics.com/fr/sitemap.xml -Sitemap: http://docs.ultralytics.com/hi/sitemap.xml -Sitemap: http://docs.ultralytics.com/ja/sitemap.xml -Sitemap: http://docs.ultralytics.com/ko/sitemap.xml -Sitemap: http://docs.ultralytics.com/pt/sitemap.xml -Sitemap: http://docs.ultralytics.com/ru/sitemap.xml -Sitemap: http://docs.ultralytics.com/zh/sitemap.xml +Sitemap: https://docs.ultralytics.com/sitemap.xml +Sitemap: https://docs.ultralytics.com/ar/sitemap.xml +Sitemap: https://docs.ultralytics.com/de/sitemap.xml +Sitemap: https://docs.ultralytics.com/es/sitemap.xml +Sitemap: https://docs.ultralytics.com/fr/sitemap.xml +Sitemap: https://docs.ultralytics.com/hi/sitemap.xml +Sitemap: https://docs.ultralytics.com/ja/sitemap.xml +Sitemap: https://docs.ultralytics.com/ko/sitemap.xml +Sitemap: https://docs.ultralytics.com/pt/sitemap.xml +Sitemap: https://docs.ultralytics.com/ru/sitemap.xml +Sitemap: https://docs.ultralytics.com/zh/sitemap.xml diff --git a/docs/en/tasks/detect.md b/docs/en/tasks/detect.md index d147364b9ad..23bb0f7e54a 100644 --- a/docs/en/tasks/detect.md +++ b/docs/en/tasks/detect.md @@ -41,7 +41,7 @@ YOLOv8 pretrained Detect models are shown here. Detect, Segment and Pose models | [YOLOv8l](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l.pt) | 640 | 52.9 | 375.2 | 2.39 | 43.7 | 165.2 | | [YOLOv8x](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x.pt) | 640 | 53.9 | 479.1 | 3.53 | 68.2 | 257.8 | -- **mAPval** values are for single-model single-scale on [COCO val2017](http://cocodataset.org) dataset.
Reproduce by `yolo val detect data=coco.yaml device=0` +- **mAPval** values are for single-model single-scale on [COCO val2017](https://cocodataset.org) dataset.
Reproduce by `yolo val detect data=coco.yaml device=0` - **Speed** averaged over COCO val images using an [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/) instance.
Reproduce by `yolo val detect data=coco128.yaml batch=1 device=0|cpu` ## Train diff --git a/docs/en/tasks/pose.md b/docs/en/tasks/pose.md index 17153ffd17c..401459d4572 100644 --- a/docs/en/tasks/pose.md +++ b/docs/en/tasks/pose.md @@ -42,7 +42,7 @@ YOLOv8 pretrained Pose models are shown here. Detect, Segment and Pose models ar | [YOLOv8x-pose](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose.pt) | 640 | 69.2 | 90.2 | 1607.1 | 3.73 | 69.4 | 263.2 | | [YOLOv8x-pose-p6](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose-p6.pt) | 1280 | 71.6 | 91.2 | 4088.7 | 10.04 | 99.1 | 1066.4 | -- **mAPval** values are for single-model single-scale on [COCO Keypoints val2017](http://cocodataset.org) dataset.
Reproduce by `yolo val pose data=coco-pose.yaml device=0` +- **mAPval** values are for single-model single-scale on [COCO Keypoints val2017](https://cocodataset.org) dataset.
Reproduce by `yolo val pose data=coco-pose.yaml device=0` - **Speed** averaged over COCO val images using an [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/) instance.
Reproduce by `yolo val pose data=coco8-pose.yaml batch=1 device=0|cpu` ## Train diff --git a/docs/en/tasks/segment.md b/docs/en/tasks/segment.md index ea55c060d35..4954250bb37 100644 --- a/docs/en/tasks/segment.md +++ b/docs/en/tasks/segment.md @@ -41,7 +41,7 @@ YOLOv8 pretrained Segment models are shown here. Detect, Segment and Pose models | [YOLOv8l-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l-seg.pt) | 640 | 52.3 | 42.6 | 572.4 | 2.79 | 46.0 | 220.5 | | [YOLOv8x-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-seg.pt) | 640 | 53.4 | 43.4 | 712.1 | 4.02 | 71.8 | 344.1 | -- **mAPval** values are for single-model single-scale on [COCO val2017](http://cocodataset.org) dataset.
Reproduce by `yolo val segment data=coco.yaml device=0` +- **mAPval** values are for single-model single-scale on [COCO val2017](https://cocodataset.org) dataset.
Reproduce by `yolo val segment data=coco.yaml device=0` - **Speed** averaged over COCO val images using an [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/) instance.
Reproduce by `yolo val segment data=coco128-seg.yaml batch=1 device=0|cpu` ## Train diff --git a/docs/en/yolov5/tutorials/tips_for_best_training_results.md b/docs/en/yolov5/tutorials/tips_for_best_training_results.md index 22bfdb221e0..10dcff3748c 100644 --- a/docs/en/yolov5/tutorials/tips_for_best_training_results.md +++ b/docs/en/yolov5/tutorials/tips_for_best_training_results.md @@ -60,6 +60,6 @@ Before modifying anything, **first train with default settings to establish a pe ## Further Reading -If you'd like to know more, a good place to start is Karpathy's 'Recipe for Training Neural Networks', which has great ideas for training that apply broadly across all ML domains: [http://karpathy.github.io/2019/04/25/recipe/](http://karpathy.github.io/2019/04/25/recipe/) +If you'd like to know more, a good place to start is Karpathy's 'Recipe for Training Neural Networks', which has great ideas for training that apply broadly across all ML domains: [https://karpathy.github.io/2019/04/25/recipe/](https://karpathy.github.io/2019/04/25/recipe/) Good luck 🍀 and let us know if you have any other questions! diff --git a/docs/en/yolov5/tutorials/train_custom_data.md b/docs/en/yolov5/tutorials/train_custom_data.md index 41f5cfc8965..6784b543b9d 100644 --- a/docs/en/yolov5/tutorials/train_custom_data.md +++ b/docs/en/yolov5/tutorials/train_custom_data.md @@ -77,7 +77,7 @@ Export in `YOLOv5 Pytorch` format, then copy the snippet into your training scri ### 2.1 Create `dataset.yaml` -[COCO128](https://www.kaggle.com/ultralytics/coco128) is an example small tutorial dataset composed of the first 128 images in [COCO](http://cocodataset.org/#home) train2017. These same 128 images are used for both training and validation to verify our training pipeline is capable of overfitting. [data/coco128.yaml](https://github.com/ultralytics/yolov5/blob/master/data/coco128.yaml), shown below, is the dataset config file that defines 1) the dataset root directory `path` and relative paths to `train` / `val` / `test` image directories (or *.txt files with image paths) and 2) a class `names` dictionary: +[COCO128](https://www.kaggle.com/ultralytics/coco128) is an example small tutorial dataset composed of the first 128 images in [COCO](https://cocodataset.org/) train2017. These same 128 images are used for both training and validation to verify our training pipeline is capable of overfitting. [data/coco128.yaml](https://github.com/ultralytics/yolov5/blob/master/data/coco128.yaml), shown below, is the dataset config file that defines 1) the dataset root directory `path` and relative paths to `train` / `val` / `test` image directories (or *.txt files with image paths) and 2) a class `names` dictionary: ```yaml # Train/val/test sets as 1) dir: path/to/imgs, 2) file: path/to/imgs.txt, or 3) list: [path/to/imgs1, path/to/imgs2, ..] diff --git a/docs/es/tasks/detect.md b/docs/es/tasks/detect.md index 439eb9215a3..7b31d62cf2a 100644 --- a/docs/es/tasks/detect.md +++ b/docs/es/tasks/detect.md @@ -41,7 +41,7 @@ Los [modelos](https://github.com/ultralytics/ultralytics/tree/main/ultralytics/c | [YOLOv8l](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l.pt) | 640 | 52.9 | 375.2 | 2.39 | 43.7 | 165.2 | | [YOLOv8x](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x.pt) | 640 | 53.9 | 479.1 | 3.53 | 68.2 | 257.8 | -- Los valores de **mAPval** son para un solo modelo a una sola escala en el conjunto de datos [COCO val2017](http://cocodataset.org). 
+- Los valores de **mAPval** son para un solo modelo a una sola escala en el conjunto de datos [COCO val2017](https://cocodataset.org).
Reproduce utilizando `yolo val detect data=coco.yaml device=0` - La **Velocidad** es el promedio sobre las imágenes de COCO val utilizando una instancia [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/).
Reproduce utilizando `yolo val detect data=coco128.yaml batch=1 device=0|cpu` diff --git a/docs/es/tasks/pose.md b/docs/es/tasks/pose.md index bed0bdff90f..50e8ea698c1 100644 --- a/docs/es/tasks/pose.md +++ b/docs/es/tasks/pose.md @@ -42,7 +42,7 @@ Los [modelos](https://github.com/ultralytics/ultralytics/tree/main/ultralytics/c | [YOLOv8x-pose](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose.pt) | 640 | 69.2 | 90.2 | 1607.1 | 3.73 | 69.4 | 263.2 | | [YOLOv8x-pose-p6](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose-p6.pt) | 1280 | 71.6 | 91.2 | 4088.7 | 10.04 | 99.1 | 1066.4 | -- Los valores de **mAPval** son para un solo modelo a una sola escala en el conjunto de datos [COCO Keypoints val2017](http://cocodataset.org). +- Los valores de **mAPval** son para un solo modelo a una sola escala en el conjunto de datos [COCO Keypoints val2017](https://cocodataset.org).
Reproducir con `yolo val pose data=coco-pose.yaml device=0` - **Velocidad** promediada sobre imágenes COCO val usando una instancia [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/).
Reproducir con `yolo val pose data=coco8-pose.yaml batch=1 device=0|cpu` diff --git a/docs/es/tasks/segment.md b/docs/es/tasks/segment.md index 81c81c8aa13..b152d5a7178 100644 --- a/docs/es/tasks/segment.md +++ b/docs/es/tasks/segment.md @@ -41,7 +41,7 @@ Los [Modelos](https://github.com/ultralytics/ultralytics/tree/main/ultralytics/c | [YOLOv8l-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l-seg.pt) | 640 | 52.3 | 42.6 | 572.4 | 2.79 | 46.0 | 220.5 | | [YOLOv8x-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-seg.pt) | 640 | 53.4 | 43.4 | 712.1 | 4.02 | 71.8 | 344.1 | -- Los valores **mAPval** son para un único modelo a una única escala en el conjunto de datos [COCO val2017](http://cocodataset.org). +- Los valores **mAPval** son para un único modelo a una única escala en el conjunto de datos [COCO val2017](https://cocodataset.org).
Reproducir utilizando `yolo val segment data=coco.yaml device=0` - La **Velocidad** promediada sobre imágenes de COCO val utilizando una instancia de [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/).
Reproducir utilizando `yolo val segment data=coco128-seg.yaml batch=1 device=0|cpu` diff --git a/docs/fr/tasks/detect.md b/docs/fr/tasks/detect.md index b83cbae6517..75fa0272102 100644 --- a/docs/fr/tasks/detect.md +++ b/docs/fr/tasks/detect.md @@ -41,7 +41,7 @@ Les modèles pré-entraînés Detect YOLOv8 sont présentés ici. Les modèles D | [YOLOv8l](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l.pt) | 640 | 52.9 | 375.2 | 2.39 | 43.7 | 165.2 | | [YOLOv8x](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x.pt) | 640 | 53.9 | 479.1 | 3.53 | 68.2 | 257.8 | -- Les valeurs de **mAPval** sont pour un seul modèle à une seule échelle sur le jeu de données [COCO val2017](http://cocodataset.org). +- Les valeurs de **mAPval** sont pour un seul modèle à une seule échelle sur le jeu de données [COCO val2017](https://cocodataset.org).
Reproductible avec `yolo val detect data=coco.yaml device=0` - La **Vitesse** est moyennée sur les images COCO val en utilisant une instance [Amazon EC2 P4d](https://aws.amazon.com/fr/ec2/instance-types/p4/).
Reproductible avec `yolo val detect data=coco128.yaml batch=1 device=0|cpu` diff --git a/docs/fr/tasks/pose.md b/docs/fr/tasks/pose.md index 4db14ffd266..73d8c10e111 100644 --- a/docs/fr/tasks/pose.md +++ b/docs/fr/tasks/pose.md @@ -33,7 +33,7 @@ Les [Modèles](https://github.com/ultralytics/ultralytics/tree/main/ultralytics/ | [YOLOv8x-pose](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose.pt) | 640 | 69.2 | 90.2 | 1607.1 | 3.73 | 69.4 | 263.2 | | [YOLOv8x-pose-p6](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose-p6.pt) | 1280 | 71.6 | 91.2 | 4088.7 | 10.04 | 99.1 | 1066.4 | -- Les valeurs de **mAPval** sont pour un seul modèle à une seule échelle sur le jeu de données [COCO Keypoints val2017](http://cocodataset.org). +- Les valeurs de **mAPval** sont pour un seul modèle à une seule échelle sur le jeu de données [COCO Keypoints val2017](https://cocodataset.org).
Reproduire avec `yolo val pose data=coco-pose.yaml device=0` - La **vitesse** moyenne sur les images de validation COCO en utilisant une instance [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/).
Reproduire avec `yolo val pose data=coco8-pose.yaml batch=1 device=0|cpu` diff --git a/docs/fr/tasks/segment.md b/docs/fr/tasks/segment.md index 793d5a8b113..adf5ab79d29 100644 --- a/docs/fr/tasks/segment.md +++ b/docs/fr/tasks/segment.md @@ -41,7 +41,7 @@ Les [modèles](https://github.com/ultralytics/ultralytics/tree/main/ultralytics/ | [YOLOv8l-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l-seg.pt) | 640 | 52.3 | 42.6 | 572.4 | 2.79 | 46.0 | 220.5 | | [YOLOv8x-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-seg.pt) | 640 | 53.4 | 43.4 | 712.1 | 4.02 | 71.8 | 344.1 | -- Les valeurs **mAPval** sont pour un seul modèle à une seule échelle sur le jeu de données [COCO val2017](http://cocodataset.org). +- Les valeurs **mAPval** sont pour un seul modèle à une seule échelle sur le jeu de données [COCO val2017](https://cocodataset.org).
Pour reproduire, utilisez `yolo val segment data=coco.yaml device=0` - **Vitesse** moyennée sur les images COCO val en utilisant une instance [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/).
Pour reproduire, utilisez `yolo val segment data=coco128-seg.yaml batch=1 device=0|cpu` diff --git a/docs/hi/tasks/detect.md b/docs/hi/tasks/detect.md index f46535fa7c6..db0a830f7b2 100644 --- a/docs/hi/tasks/detect.md +++ b/docs/hi/tasks/detect.md @@ -42,7 +42,7 @@ YOLOv8 पूर्व प्रशिक्षित Detect मॉडल यह | [YOLOv8l](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l.pt) | 640 | 52.9 | 375.2 | 2.39 | 43.7 | 165.2 | | [YOLOv8x](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x.pt) | 640 | 53.9 | 479.1 | 3.53 | 68.2 | 257.8 | -- **mAPval** मान को [COCO val2017](http://cocodataset.org) डेटासेट पर सिंगल-मॉडेल सिंगल-स्केल के लिए है। +- **mAPval** मान को [COCO val2017](https://cocodataset.org) डेटासेट पर सिंगल-मॉडेल सिंगल-स्केल के लिए है।
`yolo val detect data=coco.yaml device=0` द्वारा पुनः उत्पन्न करें - **Speed** [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/) इंस्टेंस का उपयोग करके COCO val छवियों पर औसत लिया जाता है। diff --git a/docs/hi/tasks/pose.md b/docs/hi/tasks/pose.md index 6aa41e3e910..85fc6e4a9d1 100644 --- a/docs/hi/tasks/pose.md +++ b/docs/hi/tasks/pose.md @@ -42,7 +42,7 @@ YOLOv8 पूर्वानुमानित पोज मॉडलस यह | [YOLOv8x-pose](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose.pt) | 640 | 69.2 | 90.2 | 1607.1 | 3.73 | 69.4 | 263.2 | | [YOLOv8x-pose-p6](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose-p6.pt) | 1280 | 71.6 | 91.2 | 4088.7 | 10.04 | 99.1 | 1066.4 | -- **mAPval** मान एकल मॉडल एकल स्केल पर [COCO कीपॉइंट val2017](http://cocodataset.org) डेटासेट पर है। +- **mAPval** मान एकल मॉडल एकल स्केल पर [COCO कीपॉइंट val2017](https://cocodataset.org) डेटासेट पर है।
`yolo val pose data=coco-pose.yaml device=0` के द्वारा पुनरोत्पादित करें - **Speed** [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/) इन्स्टेंस का उपयोग करते हुए COCO val छवियों पर औसतित गणना।
`yolo val pose data=coco8-pose.yaml batch=1 device=0|cpu` के द्वारा पुनरार्चन करें diff --git a/docs/hi/tasks/segment.md b/docs/hi/tasks/segment.md index 927c2da11cd..a5f5f413761 100644 --- a/docs/hi/tasks/segment.md +++ b/docs/hi/tasks/segment.md @@ -39,7 +39,7 @@ YOLOv8 पूर्व प्रशिक्षित Segment मॉडल य | [YOLOv8l-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l-seg.pt) | 640 | 52.3 | 42.6 | 572.4 | 2.79 | 46.0 | 220.5 | | [YOLOv8x-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-seg.pt) | 640 | 53.4 | 43.4 | 712.1 | 4.02 | 71.8 | 344.1 | -- **mAPval** मान एकल मॉडल एकल स्केल के लिए [COCO val2017](http://cocodataset.org) डेटासेट पर होते हैं। +- **mAPval** मान एकल मॉडल एकल स्केल के लिए [COCO val2017](https://cocodataset.org) डेटासेट पर होते हैं।
`yolo val segment data=coco.yaml device=0` के द्वारा पुनर्जीवित किए जाएं। - **स्पीड** एक [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/) इंस्टेंस का उपयोग करते हुए COCO val छवियों के बीच औसतन।
`yolo val segment data=coco128-seg.yaml batch=1 device=0|cpu` के द्वारा पुनर्जीवित किए जा सकते हैं। diff --git a/docs/ja/tasks/detect.md b/docs/ja/tasks/detect.md index 32ca490834e..6fc6f43edf4 100644 --- a/docs/ja/tasks/detect.md +++ b/docs/ja/tasks/detect.md @@ -41,7 +41,7 @@ keywords: YOLOv8, Ultralytics, 物体検出, 事前訓練済みモデル, トレ | [YOLOv8l](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l.pt) | 640 | 52.9 | 375.2 | 2.39 | 43.7 | 165.2 | | [YOLOv8x](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x.pt) | 640 | 53.9 | 479.1 | 3.53 | 68.2 | 257.8 | -- **mAPval** の値は[COCO val2017](http://cocodataset.org)データセットにおいて、単一モデル単一スケールでのものです。 +- **mAPval** の値は[COCO val2017](https://cocodataset.org)データセットにおいて、単一モデル単一スケールでのものです。
再現方法: `yolo val detect data=coco.yaml device=0` - **速度** は[Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/)インスタンスを使用してCOCO val画像に対して平均化されたものです。
再現方法: `yolo val detect data=coco128.yaml batch=1 device=0|cpu` diff --git a/docs/ja/tasks/pose.md b/docs/ja/tasks/pose.md index b78452bed34..730d20f5bba 100644 --- a/docs/ja/tasks/pose.md +++ b/docs/ja/tasks/pose.md @@ -42,7 +42,7 @@ YOLOv8事前トレーニング済みポーズモデルはこちらです。Detec | [YOLOv8x-pose](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose.pt) | 640 | 69.2 | 90.2 | 1607.1 | 3.73 | 69.4 | 263.2 | | [YOLOv8x-pose-p6](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose-p6.pt) | 1280 | 71.6 | 91.2 | 4088.7 | 10.04 | 99.1 | 1066.4 | -- **mAPval** の値は、[COCO Keypoints val2017](http://cocodataset.org)データセットでの単一モデル単一スケールに対するものです。 +- **mAPval** の値は、[COCO Keypoints val2017](https://cocodataset.org)データセットでの単一モデル単一スケールに対するものです。
再現方法 `yolo val pose data=coco-pose.yaml device=0` - **速度** は [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/)インスタンスを使用したCOCO val画像の平均です。
再現方法 `yolo val pose data=coco8-pose.yaml batch=1 device=0|cpu` diff --git a/docs/ja/tasks/segment.md b/docs/ja/tasks/segment.md index 7b2fb6e4fa8..1d4024881d5 100644 --- a/docs/ja/tasks/segment.md +++ b/docs/ja/tasks/segment.md @@ -41,7 +41,7 @@ keywords: yolov8, インスタンスセグメンテーション, Ultralytics, CO | [YOLOv8l-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l-seg.pt) | 640 | 52.3 | 42.6 | 572.4 | 2.79 | 46.0 | 220.5 | | [YOLOv8x-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-seg.pt) | 640 | 53.4 | 43.4 | 712.1 | 4.02 | 71.8 | 344.1 | -- **mAPval**の値は[COCO val2017](http://cocodataset.org)データセットでの単一モデル単一スケールの値です。 +- **mAPval**の値は[COCO val2017](https://cocodataset.org)データセットでの単一モデル単一スケールの値です。
再現するには `yolo val segment data=coco.yaml device=0` - **スピード**は[Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/)インスタンスを使用してCOCO val画像で平均化されます。
再現するには `yolo val segment data=coco128-seg.yaml batch=1 device=0|cpu` diff --git a/docs/ko/tasks/detect.md b/docs/ko/tasks/detect.md index 20d2b9e8a60..68646203c36 100644 --- a/docs/ko/tasks/detect.md +++ b/docs/ko/tasks/detect.md @@ -41,7 +41,7 @@ keywords: YOLOv8, Ultralytics, 객체 감지, 사전 훈련된 모델, 훈련, | [YOLOv8l](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l.pt) | 640 | 52.9 | 375.2 | 2.39 | 43.7 | 165.2 | | [YOLOv8x](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x.pt) | 640 | 53.9 | 479.1 | 3.53 | 68.2 | 257.8 | -- **mAPval** 값은 [COCO val2017](http://cocodataset.org) 데이터셋에서 단일 모델 단일 스케일을 사용한 값입니다. +- **mAPval** 값은 [COCO val2017](https://cocodataset.org) 데이터셋에서 단일 모델 단일 스케일을 사용한 값입니다.
[COCO](https://github.com/ultralytics/ultralytics/blob/main/ultralytics/cfg/datasets/coco.yaml) 데이터와 `yolo val detect data=coco.yaml device=0` 명령으로 재현할 수 있습니다. - **속도**는 [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/) 인스턴스를 사용해 COCO val 이미지들을 평균한 것입니다.
[COCO128](https://github.com/ultralytics/ultralytics/blob/main/ultralytics/cfg/datasets/coco128.yaml) 데이터와 `yolo val detect data=coco128.yaml batch=1 device=0|cpu` 명령으로 재현할 수 있습니다. diff --git a/docs/ko/tasks/pose.md b/docs/ko/tasks/pose.md index 58cc0c5b1dc..d83873107f0 100644 --- a/docs/ko/tasks/pose.md +++ b/docs/ko/tasks/pose.md @@ -42,7 +42,7 @@ keywords: Ultralytics, YOLO, YOLOv8, 포즈 추정, 키포인트 검출, 객체 | [YOLOv8x-pose](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose.pt) | 640 | 69.2 | 90.2 | 1607.1 | 3.73 | 69.4 | 263.2 | | [YOLOv8x-pose-p6](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose-p6.pt) | 1280 | 71.6 | 91.2 | 4088.7 | 10.04 | 99.1 | 1066.4 | -- **mAPval** 값은 [COCO Keypoints val2017](http://cocodataset.org) 데이터셋에서 단일 모델 단일 규모를 기준으로 합니다. +- **mAPval** 값은 [COCO Keypoints val2017](https://cocodataset.org) 데이터셋에서 단일 모델 단일 규모를 기준으로 합니다.
재현하려면 `yolo val pose data=coco-pose.yaml device=0`을 사용하세요. - **속도**는 [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/) 인스턴스를 사용하여 COCO val 이미지 평균입니다.
재현하려면 `yolo val pose data=coco8-pose.yaml batch=1 device=0|cpu`를 사용하세요. diff --git a/docs/ko/tasks/segment.md b/docs/ko/tasks/segment.md index d0d537c0ddf..d2f9acdefc9 100644 --- a/docs/ko/tasks/segment.md +++ b/docs/ko/tasks/segment.md @@ -41,7 +41,7 @@ keywords: yolov8, 인스턴스 세그멘테이션, Ultralytics, COCO 데이터 | [YOLOv8l-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l-seg.pt) | 640 | 52.3 | 42.6 | 572.4 | 2.79 | 46.0 | 220.5 | | [YOLOv8x-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-seg.pt) | 640 | 53.4 | 43.4 | 712.1 | 4.02 | 71.8 | 344.1 | -- **mAPval** 값들은 [COCO val2017](http://cocodataset.org) 데이터셋에서 단일 모델 단일 스케일로 얻은 값입니다. +- **mAPval** 값들은 [COCO val2017](https://cocodataset.org) 데이터셋에서 단일 모델 단일 스케일로 얻은 값입니다.
복제는 `yolo val segment data=coco.yaml device=0` 명령어로 실행할 수 있습니다. - **속도**는 [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/) 인스턴스를 이용하여 COCO 검증 이미지로 평균 내었습니다.
복제는 `yolo val segment data=coco128-seg.yaml batch=1 device=0|cpu` 명령어로 실행할 수 있습니다. diff --git a/docs/pt/tasks/detect.md b/docs/pt/tasks/detect.md index 300d106d5e1..c6b655cc182 100644 --- a/docs/pt/tasks/detect.md +++ b/docs/pt/tasks/detect.md @@ -41,7 +41,7 @@ Os [Modelos](https://github.com/ultralytics/ultralytics/tree/main/ultralytics/cf | [YOLOv8l](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l.pt) | 640 | 52.9 | 375.2 | 2.39 | 43.7 | 165.2 | | [YOLOv8x](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x.pt) | 640 | 53.9 | 479.1 | 3.53 | 68.2 | 257.8 | -- Os valores de **mAPval** são para um único modelo e uma única escala no dataset [COCO val2017](http://cocodataset.org). +- Os valores de **mAPval** são para um único modelo e uma única escala no dataset [COCO val2017](https://cocodataset.org).
Reproduza usando `yolo val detect data=coco.yaml device=0` - A **Velocidade** é média tirada sobre as imagens do COCO val num [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/) instância. diff --git a/docs/pt/tasks/pose.md b/docs/pt/tasks/pose.md index 1ac6b6adc1c..f556cbed126 100644 --- a/docs/pt/tasks/pose.md +++ b/docs/pt/tasks/pose.md @@ -42,7 +42,7 @@ Os modelos YOLOv8 Pose pré-treinados são mostrados aqui. Os modelos Detect, Se | [YOLOv8x-pose](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose.pt) | 640 | 69.2 | 90.2 | 1607.1 | 3.73 | 69.4 | 263.2 | | [YOLOv8x-pose-p6](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose-p6.pt) | 1280 | 71.6 | 91.2 | 4088.7 | 10.04 | 99.1 | 1066.4 | -- **mAPval** valores são para um único modelo em escala única no conjunto de dados [COCO Keypoints val2017](http://cocodataset.org) +- **mAPval** valores são para um único modelo em escala única no conjunto de dados [COCO Keypoints val2017](https://cocodataset.org) .
Reproduza `yolo val pose data=coco-pose.yaml device=0` - **Velocidade** média em imagens COCO val usando uma instância [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/) diff --git a/docs/pt/tasks/segment.md b/docs/pt/tasks/segment.md index 2dc2867ac98..df0d7ee2d15 100644 --- a/docs/pt/tasks/segment.md +++ b/docs/pt/tasks/segment.md @@ -41,7 +41,7 @@ Os modelos Segment pré-treinados do YOLOv8 estão mostrados aqui. Os modelos De | [YOLOv8l-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l-seg.pt) | 640 | 52.3 | 42.6 | 572.4 | 2.79 | 46.0 | 220.5 | | [YOLOv8x-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-seg.pt) | 640 | 53.4 | 43.4 | 712.1 | 4.02 | 71.8 | 344.1 | -- Os valores de **mAPval** são para um único modelo em uma única escala no conjunto de dados [COCO val2017](http://cocodataset.org). +- Os valores de **mAPval** são para um único modelo em uma única escala no conjunto de dados [COCO val2017](https://cocodataset.org).
Reproduza por meio de `yolo val segment data=coco.yaml device=0` - **Velocidade** média em imagens COCO val usando uma instância [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/).
Reproduza por meio de `yolo val segment data=coco128-seg.yaml batch=1 device=0|cpu` diff --git a/docs/ru/tasks/detect.md b/docs/ru/tasks/detect.md index dc601c02e81..5bc78d366fb 100644 --- a/docs/ru/tasks/detect.md +++ b/docs/ru/tasks/detect.md @@ -41,7 +41,7 @@ keywords: YOLOv8, Ultralytics, обнаружение объектов, пред | [YOLOv8l](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l.pt) | 640 | 52.9 | 375.2 | 2.39 | 43.7 | 165.2 | | [YOLOv8x](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x.pt) | 640 | 53.9 | 479.1 | 3.53 | 68.2 | 257.8 | -- **mAPval** значения для одиночной модели одиночного масштаба на датасете [COCO val2017](http://cocodataset.org). +- **mAPval** значения для одиночной модели одиночного масштаба на датасете [COCO val2017](https://cocodataset.org).
Для воспроизведения используйте `yolo val detect data=coco.yaml device=0` - **Скорость** усреднена по изображениям COCO val на экземпляре [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/).
Для воспроизведения используйте `yolo val detect data=coco128.yaml batch=1 device=0|cpu` diff --git a/docs/ru/tasks/pose.md b/docs/ru/tasks/pose.md index 7710694b477..9a9107cab35 100644 --- a/docs/ru/tasks/pose.md +++ b/docs/ru/tasks/pose.md @@ -32,7 +32,7 @@ description: Узнайте, как использовать Ultralytics YOLOv8 | [YOLOv8x-pose](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose.pt) | 640 | 69.2 | 90.2 | 1607.1 | 3.73 | 69.4 | 263.2 | | [YOLOv8x-pose-p6](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose-p6.pt) | 1280 | 71.6 | 91.2 | 4088.7 | 10.04 | 99.1 | 1066.4 | -- **mAPval** значения для одной модели одиночного масштаба на наборе данных [COCO Keypoints val2017](http://cocodataset.org). +- **mAPval** значения для одной модели одиночного масштаба на наборе данных [COCO Keypoints val2017](https://cocodataset.org).
Воспроизводится с помощью: `yolo val pose data=coco-pose.yaml device=0` - **Скорость** усреднена по изображениям COCO val на [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/) инстансе.
Воспроизводится с помощью: `yolo val pose data=coco8-pose.yaml batch=1 device=0|cpu` diff --git a/docs/ru/tasks/segment.md b/docs/ru/tasks/segment.md index 637f485647f..3288b03d99a 100644 --- a/docs/ru/tasks/segment.md +++ b/docs/ru/tasks/segment.md @@ -41,7 +41,7 @@ keywords: yolov8, сегментация объектов, Ultralytics, набо | [YOLOv8l-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l-seg.pt) | 640 | 52.3 | 42.6 | 572.4 | 2.79 | 46.0 | 220.5 | | [YOLOv8x-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-seg.pt) | 640 | 53.4 | 43.4 | 712.1 | 4.02 | 71.8 | 344.1 | -- Значения **mAPval** для одиночной модели одиночного масштаба на наборе данных [COCO val2017](http://cocodataset.org). +- Значения **mAPval** для одиночной модели одиночного масштаба на наборе данных [COCO val2017](https://cocodataset.org).
Воспроизведите с помощью `yolo val segment data=coco.yaml device=0` - **Скорость** усреднена для изображений COCO val на [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/) инстансе. diff --git a/docs/zh/tasks/detect.md b/docs/zh/tasks/detect.md index 3a4537c5e76..deb4564b40e 100644 --- a/docs/zh/tasks/detect.md +++ b/docs/zh/tasks/detect.md @@ -41,7 +41,7 @@ keywords: YOLOv8, Ultralytics, 目标检测, 预训练模型, 训练, 验证, | [YOLOv8l](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l.pt) | 640 | 52.9 | 375.2 | 2.39 | 43.7 | 165.2 | | [YOLOv8x](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x.pt) | 640 | 53.9 | 479.1 | 3.53 | 68.2 | 257.8 | -- **mAPval** 值适用于 [COCO val2017](http://cocodataset.org) 数据集上的单模型单尺度。 +- **mAPval** 值适用于 [COCO val2017](https://cocodataset.org) 数据集上的单模型单尺度。
通过 `yolo val detect data=coco.yaml device=0` 复现。 - **速度** 是在使用 [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/) 云实例对COCO val图像的平均值。
通过 `yolo val detect data=coco128.yaml batch=1 device=0|cpu` 复现。 diff --git a/docs/zh/tasks/pose.md b/docs/zh/tasks/pose.md index a5db8ad1904..44c92b1ca30 100644 --- a/docs/zh/tasks/pose.md +++ b/docs/zh/tasks/pose.md @@ -42,7 +42,7 @@ keywords: Ultralytics, YOLO, YOLOv8, 姿态估计, 关键点检测, 物体检测 | [YOLOv8x-姿态](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose.pt) | 640 | 69.2 | 90.2 | 1607.1 | 3.73 | 69.4 | 263.2 | | [YOLOv8x-姿态-p6](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose-p6.pt) | 1280 | 71.6 | 91.2 | 4088.7 | 10.04 | 99.1 | 1066.4 | -- **mAPval** 值适用于[COCO 关键点 val2017](http://cocodataset.org)数据集上的单模型单尺度。 +- **mAPval** 值适用于[COCO 关键点 val2017](https://cocodataset.org)数据集上的单模型单尺度。
通过执行 `yolo val pose data=coco-pose.yaml device=0` 来复现。 - **速度** 是在 [亚马逊EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/)实例上使用COCO val图像的平均值。
通过执行 `yolo val pose data=coco8-pose.yaml batch=1 device=0|cpu` 来复现。 diff --git a/docs/zh/tasks/segment.md b/docs/zh/tasks/segment.md index 4f4aff5ff8c..9d9404db968 100644 --- a/docs/zh/tasks/segment.md +++ b/docs/zh/tasks/segment.md @@ -41,7 +41,7 @@ keywords: yolov8, 实例分割, Ultralytics, COCO数据集, 图像分割, 物体 | [YOLOv8l-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l-seg.pt) | 640 | 52.3 | 42.6 | 572.4 | 2.79 | 46.0 | 220.5 | | [YOLOv8x-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-seg.pt) | 640 | 53.4 | 43.4 | 712.1 | 4.02 | 71.8 | 344.1 | -- **mAPval** 值针对[COCO val2017](http://cocodataset.org)数据集的单模型单尺度。 +- **mAPval** 值针对[COCO val2017](https://cocodataset.org)数据集的单模型单尺度。
通过`yolo val segment data=coco.yaml device=0`复现。 - **速度** 基于在[Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/)实例上运行的COCO val图像的平均值。
通过`yolo val segment data=coco128-seg.yaml batch=1 device=0|cpu`复现。 diff --git a/tests/test_python.py b/tests/test_python.py index 792eb07bb0a..46d28762860 100644 --- a/tests/test_python.py +++ b/tests/test_python.py @@ -520,7 +520,7 @@ def test_hub(): export_fmts_hub() logout() - smart_request('GET', 'http://github.com', progress=True) + smart_request('GET', 'https://github.com', progress=True) @pytest.fixture diff --git a/ultralytics/cfg/datasets/Argoverse.yaml b/ultralytics/cfg/datasets/Argoverse.yaml index 52f4e79dfee..6c3690bdfa4 100644 --- a/ultralytics/cfg/datasets/Argoverse.yaml +++ b/ultralytics/cfg/datasets/Argoverse.yaml @@ -1,5 +1,5 @@ # Ultralytics YOLO 🚀, AGPL-3.0 license -# Argoverse-HD dataset (ring-front-center camera) http://www.cs.cmu.edu/~mengtial/proj/streaming/ by Argo AI +# Argoverse-HD dataset (ring-front-center camera) https://www.cs.cmu.edu/~mengtial/proj/streaming/ by Argo AI # Documentation: https://docs.ultralytics.com/datasets/detect/argoverse/ # Example usage: yolo train data=Argoverse.yaml # parent diff --git a/ultralytics/cfg/datasets/GlobalWheat2020.yaml b/ultralytics/cfg/datasets/GlobalWheat2020.yaml index 9d47bfe47d9..712fec358ad 100644 --- a/ultralytics/cfg/datasets/GlobalWheat2020.yaml +++ b/ultralytics/cfg/datasets/GlobalWheat2020.yaml @@ -1,5 +1,5 @@ # Ultralytics YOLO 🚀, AGPL-3.0 license -# Global Wheat 2020 dataset http://www.global-wheat.com/ by University of Saskatchewan +# Global Wheat 2020 dataset https://www.global-wheat.com/ by University of Saskatchewan # Documentation: https://docs.ultralytics.com/datasets/detect/globalwheat2020/ # Example usage: yolo train data=GlobalWheat2020.yaml # parent diff --git a/ultralytics/cfg/datasets/coco-pose.yaml b/ultralytics/cfg/datasets/coco-pose.yaml index 0b72d86c1ff..849529ce18a 100644 --- a/ultralytics/cfg/datasets/coco-pose.yaml +++ b/ultralytics/cfg/datasets/coco-pose.yaml @@ -1,5 +1,5 @@ # Ultralytics YOLO 🚀, AGPL-3.0 license -# COCO 2017 dataset http://cocodataset.org by Microsoft +# COCO 2017 dataset https://cocodataset.org by Microsoft # Documentation: https://docs.ultralytics.com/datasets/pose/coco/ # Example usage: yolo train data=coco-pose.yaml # parent diff --git a/ultralytics/cfg/datasets/coco.yaml b/ultralytics/cfg/datasets/coco.yaml index 3febd5f20b4..8881ed6dac4 100644 --- a/ultralytics/cfg/datasets/coco.yaml +++ b/ultralytics/cfg/datasets/coco.yaml @@ -1,5 +1,5 @@ # Ultralytics YOLO 🚀, AGPL-3.0 license -# COCO 2017 dataset http://cocodataset.org by Microsoft +# COCO 2017 dataset https://cocodataset.org by Microsoft # Documentation: https://docs.ultralytics.com/datasets/detect/coco/ # Example usage: yolo train data=coco.yaml # parent diff --git a/ultralytics/data/scripts/get_coco.sh b/ultralytics/data/scripts/get_coco.sh index 126e7f07d69..764e280a4ea 100755 --- a/ultralytics/data/scripts/get_coco.sh +++ b/ultralytics/data/scripts/get_coco.sh @@ -1,6 +1,6 @@ #!/bin/bash # Ultralytics YOLO 🚀, AGPL-3.0 license -# Download COCO 2017 dataset http://cocodataset.org +# Download COCO 2017 dataset https://cocodataset.org # Example usage: bash data/scripts/get_coco.sh # parent # ├── ultralytics