
Parameter performance indicators #13033

Closed · 1 task done
mjgmjgmjg opened this issue May 21, 2024 · 6 comments

Labels: question (Further information is requested), Stale

Comments

@mjgmjgmjg

Search before asking

Question

Hello, I have a question. I am using YOLOv5 v5.0. After running test.py, why are metrics such as P (precision) and R (recall) printed to the console different from the values in the generated exp folder?
[attached screenshot: 微信图片_20240521110944]

Additional

No response

@mjgmjgmjg added the question (Further information is requested) label on May 21, 2024
@glenn-jocher (Member)

Hello! Thanks for reaching out with your question. 🌟

The difference in the parameter indicators like Precision (P) and Recall (R) between what's displayed in the console and the values in the exp folder could be due to a few reasons:

  1. Averaging Differences: The console might display the latest or average values during the test run, while the exp folder could contain values that are aggregated or processed differently (e.g., averaged over the entire dataset or over different batches).

  2. Different Data Splits: If there are multiple data splits (like validation splits), the results might be calculated separately for each and then averaged, which could lead to discrepancies if viewed at different times.

  3. Updates in Metrics Calculation: There might be updates or changes in how metrics are calculated between different runs or versions of the code that are not immediately reflected in all outputs.

To troubleshoot, you can:

  • Ensure that you are looking at the same metric calculations in both the console and the exp folder.
  • Check if any post-processing steps differ between the console output and the saved results.

If the issue persists, consider reviewing the specific code sections that handle metric logging and result saving to ensure consistency. If you need further assistance, feel free to ask! 😊
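
As a rough illustration of point 1, here is a minimal, self-contained sketch (not YOLOv5 code; the detections, scores, and counts below are invented) of how precision and recall at a single confidence threshold differ from values read off the full precision-recall curve that saved plots summarize:

```python
# Minimal sketch (not YOLOv5 code): all numbers below are invented.
import numpy as np

# Hypothetical detections: confidence score and whether each one
# matched a ground-truth box (true positive) at some IoU threshold.
scores = np.array([0.95, 0.90, 0.80, 0.60, 0.40, 0.30, 0.10])
is_tp = np.array([1, 1, 0, 1, 1, 0, 0])
n_gt = 5  # total ground-truth objects

# P and R at one fixed confidence threshold (a single operating point,
# like a console summary line):
thresh = 0.5
keep = scores >= thresh
tp = is_tp[keep].sum()
print(f"P@{thresh}: {tp / keep.sum():.3f}")  # TP / (TP + FP) -> 0.750
print(f"R@{thresh}: {tp / n_gt:.3f}")        # TP / (TP + FN) -> 0.600

# P and R swept over every threshold (what a PR curve aggregates):
order = np.argsort(-scores)  # sort detections by descending confidence
tps = np.cumsum(is_tp[order])
fps = np.cumsum(1 - is_tp[order])
print("curve P:", np.round(tps / (tps + fps), 3))
print("curve R:", np.round(tps / n_gt, 3))
```

The single-threshold pair is just one point on that curve, so it will generally not match a curve-derived summary such as AP or the values shown in a saved PR plot.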

@mjgmjgmjg (Author)

So if I publish a paper and need to report P, R, and other metrics, should I use the values displayed on the console or the ones in the generated exp folder? Why?

@glenn-jocher (Member)

@mjgmjgmjg hello!

For publishing in a paper, it's generally best to use the indicators from the exp folder. These are typically the final, processed results that take into account the entire dataset and are intended for review and analysis. The console output might provide immediate feedback during testing but can vary depending on the specific segment of data being processed at the time.

Using the exp folder results ensures consistency and reproducibility in your reported metrics, which is crucial for academic publishing. 😊
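
As a practical aside, it helps to pin down the exact evaluation settings when producing the numbers you report. Below is a hedged sketch; the weights path and dataset YAML are placeholders, and the flag names reflect my reading of the v5.0 test.py, so please verify them against python test.py --help in your checkout:

```python
# Hedged sketch: run test.py with explicit, fixed settings so the
# exp folder metrics you cite are reproducible. Flag names reflect
# my reading of YOLOv5 v5.0's test.py -- verify with `python test.py --help`.
import subprocess

cmd = [
    "python", "test.py",
    "--weights", "runs/train/exp/weights/best.pt",  # placeholder path
    "--data", "data/coco128.yaml",                  # placeholder dataset yaml
    "--img-size", "640",
    "--conf-thres", "0.001",  # low threshold typical for mAP-style evaluation
    "--iou-thres", "0.6",     # NMS IoU threshold
    "--task", "val",
    "--name", "paper_eval",   # fixed run name so results land in a known folder
]
print("Running:", " ".join(cmd))
subprocess.run(cmd, check=True)
```

Recording this exact command (and the weights/commit it was run against) next to the reported P and R makes the numbers straightforward for reviewers to reproduce.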

@mjgmjgmjg (Author)

Alright, thank you very much.

@glenn-jocher (Member)

@mjgmjgmjg you're welcome! If you have any more questions in the future, feel free to ask. Happy coding! 😊

@github-actions (Contributor)

👋 Hello there! We wanted to give you a friendly reminder that this issue has not had any recent activity and may be closed soon, but don't worry - you can always reopen it if needed. If you still have any questions or concerns, please feel free to let us know how we can help.

Feel free to inform us of any other issues you discover or feature requests that come to mind in the future. Pull Requests (PRs) are also always welcomed!

Thank you for your contributions to YOLO 🚀 and Vision AI ⭐

@github-actions bot added the Stale label on Jun 21, 2024
@github-actions bot closed this as not planned (won't fix, can't repro, duplicate, or stale) on Jul 1, 2024