
fix PoW and other functions not working with USE_TORCH=0 despite torch being available #1917

Merged

Conversation


@mjurbanski-reef mjurbanski-reef commented May 22, 2024

Bug

This fixes a problem where, after #1904, users wanting PoW and other torch-using functionality were forced to set USE_TORCH=1, which also forced the legacy torch-based interface.

Description of the Change

  • users can now use torch-requiring code while staying on the new interface - this should keep users from becoming even more reluctant to upgrade just because they use a few functions that require torch
  • the default value in example.env is now USE_TORCH=0, so users don't accidentally run the legacy interface - we want old users to move to the new one as soon as it is viable AND none of the new users should accidentally run the legacy interface
  • fixed a recursion error in the lazy torch loader class that happened when requesting a nonexistent attribute (see the sketch after this list)
  • refactored the lazy torch loader and added tests for it
  • added legacy_torch_api_compat for an easier way of maintaining the two interfaces, but only used it in a limited fashion for now
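
To make the recursion fix concrete, here is a minimal sketch of a lazy torch loader that avoids it; the class and attribute names are assumptions for illustration, not necessarily the ones used in this PR:

```python
# Hypothetical sketch of a lazy torch loader; names are illustrative only.
import importlib


class LazyLoadedTorch:
    """Load torch on first attribute access instead of at import time."""

    def _get_module(self):
        # Read/write through self.__dict__ so this lookup never falls back
        # into __getattr__ and recurses.
        module = self.__dict__.get("_module")
        if module is None:
            module = importlib.import_module("torch")
            self.__dict__["_module"] = module
        return module

    def __getattr__(self, name):
        # __getattr__ is only called for attributes not found the normal way,
        # so delegating to the real module cannot recurse; a nonexistent
        # attribute surfaces as a plain AttributeError from torch itself.
        return getattr(self._get_module(), name)
```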

Alternate Designs

None, as written above.

Possible Drawbacks

No real drawbacks that I'm aware of.

Verification Process

Unit tests were added, plus manual testing.
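
As an illustration of the kind of unit test this covers (hypothetical tests, not copied from this PR; they assume the LazyLoadedTorch sketch above and that torch is installed in the test environment):

```python
# Hypothetical test sketch, assuming the LazyLoadedTorch sketch above.
import pytest


def test_missing_attribute_raises_attribute_error():
    lazy_torch = LazyLoadedTorch()
    # Before the fix, a nonexistent attribute triggered infinite recursion
    # in __getattr__; now it should surface as a plain AttributeError.
    with pytest.raises(AttributeError):
        lazy_torch.definitely_not_a_real_torch_attribute


def test_existing_attribute_is_delegated_to_torch():
    lazy_torch = LazyLoadedTorch()
    assert lazy_torch.float32 is not None  # delegated to the real torch module
```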

Release Notes

None; this fixes a regression introduced by #1904 that had not yet been released.

@mjurbanski-reef mjurbanski-reef force-pushed the improve_no_torch branch 2 times, most recently from 85cdc00 to db3ef49 on May 22, 2024 12:38

U32_MAX = 4294967295
U16_MAX = 65535


@legacy_torch_api_compat
mjurbanski-reef (Contributor, PR author) commented on this line:

IMO this approach of having just one implementation of a function and then translating only the input and output will be much easier to maintain, but I leave it to others whether they want to use it in more places or keep using if use_torch() all over the place.
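
For readers unfamiliar with the pattern, here is a minimal sketch of such an input/output-translating decorator; this is an illustration only, the actual legacy_torch_api_compat added in this PR may differ in details, and the use_torch() stand-in is an assumption:

```python
# Illustrative sketch only - not the actual implementation from this PR.
import functools
import os

import numpy as np


def use_torch() -> bool:
    # Stand-in for the library's own USE_TORCH check.
    return os.getenv("USE_TORCH") == "1"


def legacy_torch_api_compat(func):
    """Wrap a numpy-based implementation so legacy callers can keep passing
    and receiving torch tensors when USE_TORCH=1."""

    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        if not use_torch():
            return func(*args, **kwargs)
        import torch

        # Translate torch tensor inputs to numpy for the single implementation...
        args = [a.cpu().numpy() if isinstance(a, torch.Tensor) else a for a in args]
        kwargs = {
            k: (v.cpu().numpy() if isinstance(v, torch.Tensor) else v)
            for k, v in kwargs.items()
        }
        result = func(*args, **kwargs)
        # ...and translate numpy outputs back to torch for the legacy API.
        if isinstance(result, np.ndarray):
            return torch.from_numpy(result)
        return result

    return wrapper
```

The point of the design is that the numpy implementation stays the single source of truth, while the legacy torch API becomes a thin translation layer at the boundary.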

@mjurbanski-reef

@thewhaleking as the author of #1904 you may be interested in this PR

@gus-opentensor this probably should also land in v7, as it will make users less reluctant to switch to the numpy interface


@thewhaleking thewhaleking left a comment


This overall looks good to me.

def __init__(self):
    self._transformed = False

def legacy_torch_api_compat(func):
    """Decorator to convert numpy arrays to torch tensors before passing them to the function.
A contributor commented on this line:
I really like this approach. I would recommend updating this line, as the docstring reads as the opposite of what the decorator actually does. Other than that, this seems great.

@thewhaleking thewhaleking merged commit f54157f into opentensor:staging May 22, 2024
11 checks passed
ibraheem-opentensor pushed a commit that referenced this pull request May 24, 2024
fix PoW and other functions not working with USE_TORCH=0 despite torch being available