
Fix benchmark #1155

Merged: 11 commits into openvinotoolkit:main on Jun 30, 2023

Conversation

@blaz-r (Contributor) commented Jun 27, 2023

Description

There was a problem: TorchInferencer was changed so that it no longer takes a config and a model, but rather a path to an exported model. This also affected the functions used to benchmark torch models. The required change was therefore to update the calling parameters of get_torch_throughput and to add a step that exports the torch model before it is benchmarked.
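
For illustration, here is a minimal, self-contained sketch of what such a throughput measurement does once the model has been exported to disk. This is plain PyTorch, not the actual anomalib get_torch_throughput; the function name and parameters are assumptions.

```python
# Illustrative sketch only (plain PyTorch); NOT the actual anomalib
# get_torch_throughput. The key point: the benchmark now loads an exported
# model from a path instead of receiving a config and a model object.
import time

import torch


def measure_torch_throughput(model_path: str, batch: torch.Tensor,
                             warmup: int = 5, iters: int = 50) -> float:
    """Load an exported TorchScript model and return throughput in images/second."""
    model = torch.jit.load(model_path)  # path to the exported model, not config + model
    model.eval()
    with torch.inference_mode():
        for _ in range(warmup):  # warm-up runs are excluded from timing
            model(batch)
        start = time.perf_counter()
        for _ in range(iters):
            model(batch)
        elapsed = time.perf_counter() - start
    return iters * batch.shape[0] / elapsed
```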

Changes

  • Bug fix (non-breaking change which fixes an issue)
  • Refactor (non-breaking change which refactors the code base)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • This change requires a documentation update

Checklist

  • My code follows the pre-commit style and check guidelines of this project.
  • I have performed a self-review of my code
  • I have commented my code, particularly in hard-to-understand areas
  • I have made corresponding changes to the documentation
  • My changes generate no new warnings
  • I have added tests that prove my fix is effective or that my feature works
  • New and existing tests pass locally with my changes
  • I have added a summary of my changes to the CHANGELOG (not for minor changes, docs and tests).

@blaz-r (Contributor, Author) commented Jun 28, 2023

I have now also added tests that cover the code that obtains throughput from a sweep.

@blaz-r (Contributor, Author) commented Jun 28, 2023

Codacy generates one warning. This happens because in my tests generate_results_dir is a fixture imported from another module. Since it is a fixture, it is not directly referenced, which produces an unused-import warning.
I'm not sure what the best practice is here. Maybe I shouldn't use fixtures from other modules, but that would introduce code duplication.

@ashwinvaidya17 (Collaborator) commented

> Codacy generates one warning. This happens because in my tests generate_results_dir is a fixture imported from another module. Since it is a fixture, it is not directly referenced, which produces an unused-import warning. I'm not sure what the best practice is here. Maybe I shouldn't use fixtures from other modules, but that would introduce code duplication.

@blaz-r thanks for addressing this. If you have to use a fixture across tests, it is advised to place it in conftest.py. You can have a look at tests/pre_merge/tools and at https://docs.pytest.org/en/6.2.x/fixture.html#scope-sharing-fixtures-across-classes-modules-packages-or-session
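
A minimal sketch of that advice, with a hypothetical fixture body (the project's real generate_results_dir may differ). Any test module at or below the directory containing this conftest.py can then request the fixture by name, without importing it:

```python
# tests/pre_merge/conftest.py -- hypothetical placement and fixture body,
# shown only to illustrate sharing a fixture via conftest.py.
import tempfile

import pytest


@pytest.fixture
def generate_results_dir():
    """Yield a temporary results directory that is removed after the test."""
    with tempfile.TemporaryDirectory() as results_dir:
        yield results_dir
```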

@blaz-r (Contributor, Author) commented Jun 28, 2023

The thing is that generate_results_dir lives inside tests.pre_merge.deploy.test_inferencer, while the tests I wrote are inside tests/pre_merge/utils/sweep/test_throughput.py. I have a feeling that using a fixture from another module is not a good idea, but it does exactly what this test needs. So what would be the best thing to do here? One option would be to create a conftest.py in the pre_merge directory, but then again I'm not sure whether that is a good idea, so I'd appreciate any advice.

@blaz-r (Contributor, Author) commented Jun 28, 2023

I have now added a conftest.py that points to the file where the needed fixture is defined. I think this should be good now.
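
Presumably this means a conftest.py that registers the fixture's module as a pytest plugin; a sketch of what that attempted workaround might look like, with the module path taken from the earlier comment:

```python
# tests/pre_merge/utils/sweep/conftest.py -- sketch of the attempted workaround:
# register the module that defines the fixture as a pytest plugin.
pytest_plugins = ["tests.pre_merge.deploy.test_inferencer"]
```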

@ashwinvaidya17 (Collaborator) left a comment

I am fine with these changes. Our entire test suite is going to undergo a refactor, so the conftest workaround is temporary anyway. Thanks again.

@blaz-r (Contributor, Author) commented Jun 28, 2023

Well, as it turns out, a conftest can't have pytest_plugins inside (pytest only allows pytest_plugins in the top-level conftest), so the tests are failing.

From what I can see, this is not a good approach at all, so I think I should either create a conftest with this fixture inside the pre_merge module or just copy the fixture into the sweep conftest.

I'm not sure which option would be best?

@blaz-r (Contributor, Author) commented Jun 29, 2023

> I am fine with these changes. Our entire test suite is going to undergo a refactor, so the conftest workaround is temporary anyway. Thanks again.

I just put slightly modified code into the conftest. This way it should work and pytest shouldn't have any problems.

@blaz-r (Contributor, Author) commented Jun 30, 2023

@samet-akcay all checks are now passing.

@samet-akcay samet-akcay merged commit 17efdb5 into openvinotoolkit:main Jun 30, 2023
@ashwinvaidya17 ashwinvaidya17 mentioned this pull request Aug 28, 2023
@blaz-r blaz-r deleted the benchmark_fix branch September 4, 2023 19:05
Development

Successfully merging this pull request may close these issues:

  • [Bug]: benchmarking type error