I have searched the YOLOv5 issues and discussions and found no similar questions.
Question
Hi all! I am facing inconsistent inference times.
I am on Windows, running inference on the CPU.
The inference time is consistently 84-85 ms for, say, 30 pictures straight, then 1 or 2 pictures take 125 ms or even more, and then it goes back to the usual 84 ms. It is not picture-related, as the same picture takes 84 ms when run again.
Any idea what might be causing these inconsistencies?
There are no other programs using the CPU, just the OS.
Thank you
Best regards
------------ EDIT --------------
It is this line that has the variable time:
pred = model(im, augment=False, visualize=False)
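For reference, the jitter around that call can be quantified with a minimal timing sketch like the one below (`fake_model` is a hypothetical stand-in here, not the actual YOLOv5 model):

```python
import statistics
import time

def fake_model(im):
    # Placeholder for: pred = model(im, augment=False, visualize=False)
    return sum(range(10_000))

# Time 30 consecutive runs, mirroring the 30-picture batches described above
timings_ms = []
for _ in range(30):
    start = time.perf_counter()
    fake_model(None)
    timings_ms.append((time.perf_counter() - start) * 1000)

# Comparing the median against the max makes occasional outliers obvious
print(f"median: {statistics.median(timings_ms):.3f} ms")
print(f"max:    {max(timings_ms):.3f} ms")
```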
Additional
No response
Inconsistency in inference time can be due to many factors like hardware performance, memory, CPU usage, and so on.
Our recommendation would be to profile the inference execution and check where it spends the most time. You can use Python's built-in cProfile module to profile the inference and pstats to analyze the results.
Another suggestion would be to run the inference on a GPU or a Linux machine (if possible) and see whether the inconsistency persists.
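A minimal sketch of that profiling approach is shown below; `run_inference` is a hypothetical stand-in, and the real YOLOv5 forward pass would go inside it:

```python
import cProfile
import io
import pstats
import time

def run_inference(im):
    # Stand-in for: pred = model(im, augment=False, visualize=False)
    time.sleep(0.01)

# Collect a profile over several consecutive inference calls
profiler = cProfile.Profile()
profiler.enable()
for _ in range(5):
    run_inference(None)
profiler.disable()

# Sort by cumulative time to see which calls dominate the slow runs
stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(10)
print(stream.getvalue())
```

Comparing profiles from a fast run and a slow run should show which internal call accounts for the extra ~40 ms.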
👋 Hello there! We wanted to give you a friendly reminder that this issue has not had any recent activity and may be closed soon, but don't worry - you can always reopen it if needed. If you still have any questions or concerns, please feel free to let us know how we can help.
For additional resources and information, please see the links below:
Feel free to inform us of any other issues you discover or feature requests that come to mind in the future. Pull Requests (PRs) are also always welcomed!
Thank you for your contributions to YOLO 🚀 and Vision AI ⭐