Replies: 1 comment
-
Update: I figured it out while looking at the source, but now I have an issue where performance is really bad when feeding frames through OpenCV into the PyTorch Hub model, even though detect.py runs in realtime.

Update 2: After doing some more research, it appears that detect.py uses the `LoadStreams` class from `utils.datasets`, which is much faster than plain OpenCV capture. I also might experiment with exporting to different model formats, since TFLite will probably be faster than standard PyTorch. What I hope is that I'm able to get the model to run at realtime speed.
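The speedup the update describes comes from how `LoadStreams` reads video: it grabs frames on a daemon thread and keeps only the newest one, so inference never blocks on a stale capture buffer. Below is a minimal stdlib sketch of that idea; the class name `LatestFrameGrabber` and the callable `source` are my own illustration, not YOLOv5's actual API.

```python
import threading
import time

class LatestFrameGrabber:
    """Read frames from `source` on a background thread, keeping only the
    most recent one -- the same trick YOLOv5's LoadStreams uses so that
    slow inference never falls behind the camera's frame buffer."""

    def __init__(self, source, interval=0.0):
        self._source = source        # callable returning the next frame
        self._interval = interval    # optional pause between reads
        self._latest = None
        self._lock = threading.Lock()
        self._stop = threading.Event()
        self._thread = threading.Thread(target=self._reader, daemon=True)

    def _reader(self):
        while not self._stop.is_set():
            frame = self._source()   # e.g. lambda: cap.read()[1] with OpenCV
            with self._lock:
                self._latest = frame
            if self._interval:
                time.sleep(self._interval)

    def start(self):
        self._thread.start()
        return self

    def read(self):
        """Return the newest frame seen so far (None before the first read)."""
        with self._lock:
            return self._latest

    def stop(self):
        self._stop.set()
        self._thread.join(timeout=1.0)
```

With OpenCV you would pass `lambda: cap.read()[1]` as the source; the inference loop then calls `read()` and always gets the latest frame instead of the oldest buffered one.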
-
Is it possible to display the output of the inference with something like OpenCV? I would like to have YOLOv5 running in realtime while showing its output, so it would be great if I could!