
importing torch in setup.py will cause arcane exceptions when xformers is added without versions #940

doctorpangloss opened this issue Nov 30, 2023 · 12 comments

@doctorpangloss

https://github.com/facebookresearch/xformers/blame/0ec6aa244f3fbf1a077782bbf3ac8e6f2e4d1d16/setup.py#L23C5-L23C5

setup logs:

INFO: pip is looking at multiple versions of xformers to determine which version is compatible with other requirements. This could take a while.
Collecting xformers (from sd-scripts@ file://C:/Users/bberman/Documents/sd-scripts->artworkspace==0.1)
  Downloading xformers-0.0.22.post4-cp311-cp311-win_amd64.whl (97.9 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 97.9/97.9 MB 11.9 MB/s eta 0:00:00
  Using cached xformers-0.0.22-cp311-cp311-win_amd64.whl (97.6 MB)
  Downloading xformers-0.0.21-cp311-cp311-win_amd64.whl (97.5 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 97.5/97.5 MB 18.7 MB/s eta 0:00:00
  Using cached xformers-0.0.20-cp311-cp311-win_amd64.whl (97.6 MB)
Collecting pyre-extensions==0.0.29 (from xformers->sd-scripts@ file://C:/Users/bberman/Documents/sd-scripts->artworkspace==0.1)
  Using cached pyre_extensions-0.0.29-py3-none-any.whl (12 kB)
Collecting xformers (from sd-scripts@ file://C:/Users/bberman/Documents/sd-scripts->artworkspace==0.1)
  Downloading xformers-0.0.16.tar.gz (7.3 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7.3/7.3 MB 17.2 MB/s eta 0:00:00
  Installing build dependencies ... done
  Getting requirements to build wheel ... error
  error: subprocess-exited-with-error

  × Getting requirements to build wheel did not run successfully.
  │ exit code: 1
  ╰─> [20 lines of output]
      Traceback (most recent call last):
        File "C:\Users\bberman\Documents\Art-Workspace\venv\Lib\site-packages\pip\_vendor\pyproject_hooks\_in_process\_in_process.py", line 353, in <module>
          main()
        File "C:\Users\bberman\Documents\Art-Workspace\venv\Lib\site-packages\pip\_vendor\pyproject_hooks\_in_process\_in_process.py", line 335, in main
          json_out['return_val'] = hook(**hook_input['kwargs'])
                                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "C:\Users\bberman\Documents\Art-Workspace\venv\Lib\site-packages\pip\_vendor\pyproject_hooks\_in_process\_in_process.py", line 118, in get_requires_for_build_wheel
          return hook(config_settings)
                 ^^^^^^^^^^^^^^^^^^^^^
        File "C:\Users\bberman\AppData\Local\Temp\1\pip-build-env-b9gnoz9z\overlay\Lib\site-packages\setuptools\build_meta.py", line 325, in get_requires_for_build_wheel
          return self._get_build_requires(config_settings, requirements=['wheel'])
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "C:\Users\bberman\AppData\Local\Temp\1\pip-build-env-b9gnoz9z\overlay\Lib\site-packages\setuptools\build_meta.py", line 295, in _get_build_requires
          self.run_setup()
        File "C:\Users\bberman\AppData\Local\Temp\1\pip-build-env-b9gnoz9z\overlay\Lib\site-packages\setuptools\build_meta.py", line 480, in run_setup
          super(_BuildMetaLegacyBackend, self).run_setup(setup_script=setup_script)
        File "C:\Users\bberman\AppData\Local\Temp\1\pip-build-env-b9gnoz9z\overlay\Lib\site-packages\setuptools\build_meta.py", line 311, in run_setup
          exec(code, locals())
        File "<string>", line 23, in <module>
      ModuleNotFoundError: No module named 'torch'
      [end of output]

Observe setup.py imports torch, but my pyproject.toml has:

[build-system]
requires = ["setuptools", "wheel", "pip"]
build-backend = "setuptools.build_meta"

In other words, requires does not contain torch, so your setup.py cannot import it; you will have to do something else. I understand the import torch line has been around since this package was authored, but pip has since changed to perform this build isolation. There will be a lot of inertia on your team around this code. Nonetheless, fixing it will probably resolve a lot of issues people are reporting.

@doctorpangloss (Author)

one solution is to gracefully fail if torch is not in the build environment, because pip is running your setup.py for metadata gathering, not to build a wheel
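[Editor's note: a minimal sketch of what such a graceful fallback could look like. This is hypothetical illustration, not the actual xformers setup.py; HAS_TORCH and get_extensions are made-up names.]

```python
# Hypothetical sketch of a setup.py that tolerates a missing torch:
# when pip gathers metadata in an isolated build environment, torch
# is absent, so we skip building extensions instead of crashing.
try:
    import torch
    HAS_TORCH = True
except ImportError:
    torch = None
    HAS_TORCH = False


def get_extensions():
    """Return C++/CUDA extension modules, or [] when torch is missing.

    Returning an empty list still lets setup() report the package
    name and version during pip's metadata-gathering phase.
    """
    if not HAS_TORCH:
        return []
    # ...construct torch.utils.cpp_extension.CUDAExtension objects here...
    return []
```

A real build would still need torch present to compile the extensions, but metadata gathering would no longer fail with ModuleNotFoundError.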

@danthe3rd (Contributor)

Yes, that would be a solution. It would require a bit of refactoring of our setup.py file though...

@baggiponte commented Dec 21, 2023

Having the same problem here. I think you might get away with adding torch to the build-system table in a pyproject.toml, am I right? pip will install it into the isolated build environment during the build phase and discard that environment afterwards.

EDIT: the build works if I do this:

python -m pip install xformers --no-build-isolation

@doctorpangloss (Author)

> python -m pip install xformers --no-build-isolation

you can't specify that a specific dependency should be installed without build isolation using a string in a setup.py. you could use a pip internal class to achieve this.

> Yes that would be a solution. That would require a bit of refactoring of our setup.py file tho...

the only thing you import torch for is to check its version and whether cuda is available. you can achieve both without importing torch; it is a small change. another POV is that pip should evaluate the extension build command and build requirements lazily, which would let you declare that torch is needed by the build process only when the user actually needs to build an extension, and import it only at that point. in principle you can achieve this by modifying the extension definition.
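[Editor's note: a sketch of the torch-free checks suggested above, under the assumption that reading installed-package metadata and checking the CUDA toolkit's environment variables are acceptable substitutes; the helper names are made up.]

```python
# Hypothetical sketch: query torch's version and CUDA availability
# without importing torch, so setup.py cannot crash on ImportError.
import os
from importlib import metadata


def installed_torch_version():
    """Installed torch version string from package metadata, or None."""
    try:
        return metadata.version("torch")
    except metadata.PackageNotFoundError:
        return None


def cuda_toolkit_present():
    """Heuristic CUDA check: is a CUDA toolkit location configured?"""
    return bool(os.environ.get("CUDA_HOME") or os.environ.get("CUDA_PATH"))
```

The environment-variable heuristic is weaker than torch.cuda.is_available(), but it is available before torch is installed, which is the point.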

the import torch line has created a lot of trouble for people to specify that xformers is a dependency, so i wouldn't call this a small issue.

@baggiponte commented Dec 21, 2023

> python -m pip install xformers --no-build-isolation
>
> you can't specify that a specific dependency should be installed without build isolation using a string in a setup.py. you could use a pip internal class to achieve this.

I agree. I just wanted to point out this temporary workaround while this gets solved. Of course, the best solution is the one you describe.

@doctorpangloss (Author)

Pinging this again. Until this is fixed, xformers cannot practicably be specified as a dependency without a specific version.

@baggiponte

Agreed. It would simply be enough to add a pyproject.toml file with the following contents:

[build-system]
requires = ["setuptools", "torch>={whatever}"]
build-backend = "setuptools.build_meta"

You can find some references here.

@baggiponte

Hey there. I am working on a very small PR to address this issue. I forked the repo and you can see the commit here. I just added a pyproject.toml file and the following section (table):

[build-system]
requires = [
  "setuptools",
  "torch>=2.0.0"
]
build-backend = "setuptools.build_meta"

As a very brief reminder, pyproject.toml is the standard place to list build dependencies, i.e. all libraries used to build a wheel from the package. Normally this section is not required: pip is smart enough to default to setuptools when it is missing. In the case of xformers, this section must be added, since torch is a build-time dependency (it is used in setup.py).

Adding this file with these 5-10 lines is already enough to start the installation process. However, a couple of errors are thrown, and I think they may be due to my machine being an ARM Mac. I believe they are caused by 1. me not having ninja on my machine and 2. torch needing numpy, since my laptop does not have a GPU. I wouldn't really know how to "conditionally" include those in the build-system table.

Perhaps @doctorpangloss you could test building on your own machine? The following steps are needed:

  1. gh repo clone baggiponte/xformers or any other way of cloning the repo.
  2. git submodule update --init --recursive.
  3. Switch to a temporary directory.
  4. Create a venv and activate it
  5. Run pip install /path/to/cloned/xformers/repo

@danthe3rd (Contributor)

Hi,
We already have such a PR open here: #743
The TL;DR is that we can't do that at the moment: using pyproject causes the build to happen in an isolated environment, which might have a different version of PyTorch. And xFormers won't work if it was built with a different version of PyTorch than the one it runs against.
At least the current setup ensures that we always build for the right version of PyTorch - even though it might fail because torch is not yet installed.
cc @fmassa

@baggiponte

Hey there, thanks for the reply.

> Hi, We already have such a PR opened here: #743 The TLDR is that we can't do that at the moment, as using pyproject causes the build to happen in an isolated environment, which might have a different version of PyTorch. And xFormers won't work if it was built with a different version of PyTorch.

Thanks for the explainer. Why do you think the isolated env might have a different PyTorch version? What if you just pin the necessary PyTorch version in the build-system table? It would require updating the pin with every release, of course, but the end user already has to do that, because they can't just pip install torch if the version matters. Please correct me if I am wrong.

> At least the current setup ensures that we always build for the right version of PyTorch - even though it might fail because torch is not yet installed. cc @fmassa

I am afraid I don't understand what you mean by the right version of PyTorch. What do you think about adding a try/except in setup.py that raises an error informing users to install a specific version of PyTorch?

@danthe3rd
Copy link
Contributor

The issue is that if the isolated environment has PyTorch version X, xFormers will be built for version X only. If the environment in which the user wants to install xFormers has version Y, and Y != X, the install will be broken.

What if you just pin the necessary pytorch version in the build-system table?

We would also need to pin that same version inside the requirements.txt file for xFormers, and technically this would work. However, some users need a previous version of PyTorch (or a more recent one, like the nightly), and this setup wouldn't work for them - currently they can install any version of PyTorch and then build xFormers from source against it.
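[Editor's note: the mismatch described above can only be detected at import time, not prevented by packaging alone. A sketch of the kind of check this implies, using a hypothetical helper rather than actual xformers code:]

```python
# Hypothetical sketch: compare the torch version recorded at build
# time with the one present at import time. If they diverge, a
# compiled extension may be ABI-incompatible with the runtime torch.
def torch_versions_compatible(built_with: str, runtime: str) -> bool:
    """True when the major.minor of build-time and runtime torch match.

    A stricter policy could require full equality of version strings.
    """
    return built_with.split(".")[:2] == runtime.split(".")[:2]
```

For example, a wheel built against torch 2.1.0 would pass under 2.1.2 but fail under 2.0.1 with this policy.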

@baggiponte
Copy link

Oh okay, I think I get it now: xformers needs to be built against the same PyTorch version the user has installed.
