TensorRT #1535
Comments
I have not tried this in a long time since I switched to an Apple computer, and it cannot be run in the pipeline as no GPUs are available. But you can try this to export the ReID model:
and then run inference with that model using:
When I try to run the reid_export.py file I get an error like this:
Yup, I see the issue
How can I solve this problem? Do you have any idea?
Pull the latest, I just pushed some new code. It is hard for me to test this part as I don't have a computer with an Nvidia card and CUDA at hand. It is not runnable in the CI either...
Okay, I will check it and give you feedback.
The current error is as follows:
Just added it 🚀
👋 Hello, this issue has been automatically marked as stale because it has not had recent activity. Please note it will be closed if no further activity occurs.
Question
Hello,
I have a YOLOv8 model converted to TRT form. However, when I run this TRT model with BoT-SORT, I experience an FPS drop. Is there a way to convert BoT-SORT itself to TRT form?
Thanks
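Before converting anything, it helps to confirm where the time actually goes, since the tracker's matching logic is CPU-bound and only the neural parts (detector, ReID network) are TensorRT candidates. A hedged timing sketch, with dummy stand-ins for the real detector and tracker calls:

```python
import time

def timed(fn, *args, n=100):
    """Average wall-clock seconds per call over n runs."""
    t0 = time.perf_counter()
    for _ in range(n):
        fn(*args)
    return (time.perf_counter() - t0) / n

# Stand-ins for the real pipeline stages (assumptions, not the repo's API):
def detect(frame):
    # e.g. a YOLOv8 TensorRT engine returning (x1, y1, x2, y2, conf, cls) rows
    return [(0, 0, 10, 10, 0.9, 0)]

def track(dets):
    # e.g. a BoT-SORT update: Kalman prediction + association, CPU-bound
    return dets

frame = None
det_t = timed(detect, frame)
trk_t = timed(track, [])
print(f"detector: {det_t * 1e3:.3f} ms/frame, tracker: {trk_t * 1e3:.3f} ms/frame")
```

Replacing the stand-ins with the real calls shows whether the FPS drop comes from the ReID embedding step (worth exporting, as above) or from the association logic (which TensorRT would not help with).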