Update broken links (#498)
* Update broken links

* Update CONTRIBUTING.md

---------

Co-authored-by: yishengWang <102451740+yishengWang18@users.noreply.github.com>
JingxianKe committed Jul 31, 2023
1 parent cb9b4ce commit 12809ca
Showing 2 changed files with 8 additions and 8 deletions.
2 changes: 1 addition & 1 deletion .github/CONTRIBUTING.md
@@ -41,7 +41,7 @@ conda install pytorch torchvision cudatoolkit=10.2 -c pytorch
### Install yolort

```bash
-git clone https://github.com/zhiqwang/yolov5-rt-stack.git
+git clone https://github.com/zhiqwang/yolort.git
cd yolov5-rt-stack
pip install -e .
```
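After the editable install, a quick stdlib-only check can confirm the package is importable without loading any weights (a sketch; `yolort` is the package name installed above):

```python
import importlib.util

# Check that the editable install made the package importable.
# This only consults the import machinery; no model weights are loaded.
def is_installed(pkg: str) -> bool:
    return importlib.util.find_spec(pkg) is not None

print(is_installed("yolort"))
```

If this prints `False`, re-run `pip install -e .` from the repository root.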
14 changes: 7 additions & 7 deletions README.md
@@ -6,8 +6,8 @@

______________________________________________________________________

-[Documentation](https://zhiqwang.com/yolov5-rt-stack/)
-[Installation Instructions](https://zhiqwang.com/yolov5-rt-stack/installation.html)
+[Documentation](https://zhiqwang.com/yolort/)
+[Installation Instructions](https://zhiqwang.com/yolort/installation.html)
[Deployment](#-deployment)
[Contributing](.github/CONTRIBUTING.md)
[Reporting Issues](https://github.com/zhiqwang/yolov5-rt-stack/issues/new?assignees=&labels=&template=bug-report.yml)
@@ -111,7 +111,7 @@ model = torch.hub.load("zhiqwang/yolov5-rt-stack:main", "yolov5s", pretrained=Tr

### Loading checkpoint from official yolov5

-The following is the interface for loading the checkpoint weights trained with `ultralytics/yolov5`. Please see our documents on what we [share](https://zhiqwang.com/yolov5-rt-stack/notebooks/how-to-align-with-ultralytics-yolov5.html) and how we [differ](https://zhiqwang.com/yolov5-rt-stack/notebooks/comparison-between-yolort-vs-yolov5.html) from yolov5 for more details.
+The following is the interface for loading the checkpoint weights trained with `ultralytics/yolov5`. Please see our documents on what we [share](https://zhiqwang.com/yolort/notebooks/how-to-align-with-ultralytics-yolov5.html) and how we [differ](https://zhiqwang.com/yolort/notebooks/comparison-between-yolort-vs-yolov5.html) from yolov5 for more details.

```python
from yolort.models import YOLOv5
```

@@ -129,7 +129,7 @@ predictions = model.predict(img_path)
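The collapsed diff hides the rest of this snippet; as a rough sketch of the call pattern it describes (with a hypothetical stub standing in for yolort's real `YOLOv5` class, and `yolov5s.pt` as a placeholder checkpoint path):

```python
# Sketch of the checkpoint-loading interface excerpted above. `YOLOv5` here
# is a hypothetical stub, not yolort's real class; "yolov5s.pt" is a
# placeholder path to weights trained with ultralytics/yolov5.
class YOLOv5:
    def __init__(self, checkpoint_path: str, score_thresh: float):
        self.checkpoint_path = checkpoint_path
        self.score_thresh = score_thresh

    @classmethod
    def load_from_yolov5(cls, checkpoint_path: str, score_thresh: float = 0.25):
        # The real interface re-maps an ultralytics checkpoint into
        # yolort's module layout before constructing the model.
        return cls(checkpoint_path, score_thresh)

    def predict(self, img_path: str) -> list:
        # The real method runs pre-processing, the network, and NMS.
        return [{"source": img_path, "boxes": [], "scores": [], "labels": []}]

model = YOLOv5.load_from_yolov5("yolov5s.pt", score_thresh=0.25)
predictions = model.predict("bus.jpg")
```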

### Inference on LibTorch backend

-We provide a [tutorial](https://zhiqwang.com/yolov5-rt-stack/notebooks/inference-pytorch-export-libtorch.html) to demonstrate how the model is converted into `torchscript`. And we provide a [C++ example](deployment/libtorch) of how to do inference with the serialized `torchscript` model.
+We provide a [tutorial](https://zhiqwang.com/yolort/notebooks/inference-pytorch-export-libtorch.html) to demonstrate how the model is converted into `torchscript`. And we provide a [C++ example](deployment/libtorch) of how to do inference with the serialized `torchscript` model.

### Inference on ONNX Runtime backend

@@ -146,7 +146,7 @@ y_runtime = PredictorORT(engine_path, device="cpu")

```python
predictions = y_runtime.predict("bus.jpg")
```

-Please check out this [tutorial](https://zhiqwang.com/yolov5-rt-stack/notebooks/export-onnx-inference-onnxruntime.html) to use yolort's ONNX model conversion and ONNX Runtime inferencing. And you can use the [example](deployment/onnxruntime) for ONNX Runtime C++ interface.
+Please check out this [tutorial](https://zhiqwang.com/yolort/notebooks/export-onnx-inference-onnxruntime.html) to use yolort's ONNX model conversion and ONNX Runtime inferencing. And you can use the [example](deployment/onnxruntime) for ONNX Runtime C++ interface.
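The `PredictorORT` flow in the excerpt above can be sketched with a stub (hypothetical; the real class ships with yolort and wraps an ONNX Runtime session, and `yolov5s.onnx` is a placeholder engine path):

```python
# Stub illustrating the two-step flow: point the predictor at a serialized
# ONNX engine, then call predict() on an image path. This is a hypothetical
# stand-in for yolort's PredictorORT, not the real implementation.
class PredictorORT:
    def __init__(self, engine_path: str, device: str = "cpu"):
        self.engine_path = engine_path
        self.device = device  # the real class creates an onnxruntime session here

    def predict(self, img_path: str) -> list:
        # The real method runs pre-processing, the session, and post-processing.
        return [{"boxes": [], "scores": [], "labels": [], "image": img_path}]

y_runtime = PredictorORT("yolov5s.onnx", device="cpu")
predictions = y_runtime.predict("bus.jpg")
```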

### Inference on TensorRT backend

@@ -165,11 +165,11 @@ y_runtime = PredictorTRT(engine_path, device=device)

```python
predictions = y_runtime.predict("bus.jpg")
```

-Besides, we provide a [tutorial](https://zhiqwang.com/yolov5-rt-stack/notebooks/onnx-graphsurgeon-inference-tensorrt.html) detailing yolort's model conversion to TensorRT and the use of the Python interface. Please check this [example](deployment/tensorrt) if you want to use the C++ interface.
+Besides, we provide a [tutorial](https://zhiqwang.com/yolort/notebooks/onnx-graphsurgeon-inference-tensorrt.html) detailing yolort's model conversion to TensorRT and the use of the Python interface. Please check this [example](deployment/tensorrt) if you want to use the C++ interface.

## 🎨 Model Graph Visualization

-Now, `yolort` can draw the model graph directly, checkout our [tutorial](https://zhiqwang.com/yolov5-rt-stack/notebooks/model-graph-visualization.html) to see how to use and visualize the model graph.
+Now, `yolort` can draw the model graph directly, checkout our [tutorial](https://zhiqwang.com/yolort/notebooks/model-graph-visualization.html) to see how to use and visualize the model graph.

<a href="notebooks/assets/yolov5_graph_visualize.svg"><img src="notebooks/assets/yolov5_graph_visualize.svg" alt="YOLO model visualize" width="500"/></a>

