
🐞 Fix tensor detach and gpu count issues in benchmarking script #100

Merged 1 commit into development from fix/ashwin/benchmarking on Feb 15, 2022

Conversation

ashwinvaidya17 (Collaborator)

Description

This is a quick fix for the following issues:

  • Even with `model.eval()`, the models returned predictions with `requires_grad=True`
  • Due to an incorrect calculation of splits, more GPUs than available were being assigned (see the sketch below)
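
A minimal sketch of the kind of change described above, assuming a PyTorch model and a list of benchmark configurations to be distributed across devices. The helper names `run_benchmark` and `get_gpu_assignments` are hypothetical and are not the functions touched by this PR:

```python
import torch


def run_benchmark(model: torch.nn.Module, batch: torch.Tensor) -> torch.Tensor:
    """Return predictions detached from the autograd graph."""
    model.eval()
    # model.eval() alone does not stop autograd from tracking the outputs;
    # run inference under no_grad() and detach the result explicitly.
    with torch.no_grad():
        predictions = model(batch)
    return predictions.detach().cpu()


def get_gpu_assignments(configs: list, num_gpus: int) -> list:
    """Split the benchmark configs into at most `num_gpus` groups.

    A naive split can produce more groups than available devices;
    clamping the group count avoids assigning non-existent GPUs.
    """
    num_groups = min(num_gpus, len(configs))
    return [configs[i::num_groups] for i in range(num_groups)]
```
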

Changes

  • Bug fix (non-breaking change which fixes an issue)

Checklist

  • My code follows the pre-commit style and check guidelines of this project.
  • I have performed a self-review of my code
  • I have commented my code, particularly in hard-to-understand areas
  • I have made corresponding changes to the documentation
  • My changes generate no new warnings
  • I have added tests that prove my fix is effective or that my feature works
  • New and existing tests pass locally with my changes

@ashwinvaidya17 added the Bug (Something isn't working) label on Feb 11, 2022
@samet-akcay merged commit 4bf4970 into development on Feb 15, 2022
@samet-akcay deleted the fix/ashwin/benchmarking branch on February 15, 2022 at 08:33
@loyal-ikun mentioned this pull request on Dec 28, 2022