
BUGFIX: Unpin tensorflow_datasets #241

Open
wants to merge 1 commit into master

Conversation

ethanluoyc
Contributor

The currently pinned version does not work for some of the d4rl locomotion v2 datasets due to an incorrect specification of the metadata types.
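For illustration, a minimal sketch of the kind of call that hits the problem (the exact dataset name below is illustrative; any of the d4rl locomotion v2 datasets should show the same failure):

```python
import tensorflow_datasets as tfds

# Illustrative reproduction: with the pinned tensorflow_datasets release,
# loading a d4rl locomotion v2 dataset fails because of the metadata type
# mismatch described above. The dataset name is an example, not the only
# one affected.
ds = tfds.load('d4rl_mujoco_halfcheetah/v2-medium', split='train')
```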

@ethanluoyc
Contributor Author

Not sure what the test failures are about, though.

@qstanczyk
Collaborator

We use specific versions of packages to make sure new releases of dependencies don't break things at random (which has happened a number of times). Could you pin to a specific version that solves the problem instead?

@ethanluoyc
Contributor Author

ethanluoyc commented Jul 5, 2022

We use specific versions of packages to make sure new releases of dependencies don't break things at random (which has happened a number of times). Could you pin to a specific version that solves the problem instead?

I understand that Reverb and Launchpad depend on the ABI of an exact TensorFlow version, so it is important to pin lp and reverb against a specific version to make sure users do not accidentally install combinations that break. In this case, however, tf-datasets does not have any C++ extensions (the C++ extensions for tf.data live in core TensorFlow, not in tf-datasets), so I think it is unnecessary to pin the tf-datasets version :)
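As a minimal sketch of what I mean (the version numbers here are placeholders, not the actual values in Acme's setup.py):

```python
# setup.py (sketch) -- version numbers below are illustrative placeholders.
install_requires = [
    'tensorflow==2.8.0',            # exact pin kept where ABI compatibility matters
    'dm-reverb==0.7.0',             # likewise pinned against the TensorFlow ABI
    'tensorflow_datasets>=4.6.0',   # lower bound only: tfds is pure Python, so a
                                    # newer release cannot break the ABI
]
```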

In general, I would suggest making the requirements in setup.py more flexible. Pinning the dependencies as they are now makes life very difficult for users who depend on Acme as a library: there is no freedom in choosing the versions of the dependencies Acme should use. pip throws an error when I try to install dm-acme alongside an alternative version of tf-datasets, because it cannot find compatible versions that satisfy the constraints.
Perhaps it would be a good idea to keep a more abstract setup.py and instead move the concrete pinned versions to a requirements.txt? That is the commonly recommended approach for Python packaging, and it would give users more flexibility without having to work around errors raised by the package manager. I personally use pip-compile-multi, which compiles a pinned requirements.txt from more abstract requirements stored in *.in files. I believe the dm-control folks do something similar, and I think it would work very nicely for Acme as well. A rough sketch of the split is below.
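The file names and versions in this sketch are illustrative, not a concrete proposal for Acme's actual layout:

```python
# setup.py (sketch): only abstract, loose constraints live here.
from setuptools import setup, find_packages

setup(
    name='dm-acme',
    packages=find_packages(),
    install_requires=[
        'tensorflow_datasets>=4.6.0',   # loose lower bound, no exact pin
        'dm-reverb',                    # exact pins only where the ABI requires it
    ],
)

# The concrete, fully pinned set lives in requirements.txt, generated from
# abstract *.in files by pip-compile-multi (or pip-tools' pip-compile), e.g.:
#   requirements.in   contains:  tensorflow_datasets>=4.6.0
#   running the compile step produces requirements.txt with exact pins for
#   tensorflow-datasets and all of its transitive dependencies.
```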

I am happy to create a PR to help set up a more flexible way of managing dependencies; I believe users outside DM would benefit a lot from this.
