Backtracking tries to download all versions of package #2044
Comments
I sometimes add a separate file with extra constraints, referenced at the top of the input file.
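A minimal sketch of that layering (the file names and pins here are hypothetical, and whether this matches the commenter's exact setup is an assumption): pip requirements files can pull in a constraints file with `-c`, and `pip-compile` honors that line.

```
# constraints.txt — extra pins for problematic transitive dependencies
botocore<1.30

# requirements.in — first line layers the constraints on top
-c constraints.txt
boto3
```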
@webknjaz I have a script which temporarily removes problematic dependencies; they are automatically added back with the correct versions during the dependency build process. I can solve this problem easily when I know what exactly the problem is, but it's not always so obvious. I have solved it for my case. My intent is to start a discussion about making the PyPI package registry more efficient. Look at how fast npm can resolve packages: it's because the npm registry sends package metadata for all versions in one request. That would be a good start. I would also like to see version grouping, so that if some version range has the same dependencies, it is not necessary to send that same list for every version.
First of all, you can force pip to disregard sdists.
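The comment does not spell out the exact option, so the following is an assumption about what was meant: pip's `--only-binary` option restricts resolution to wheels, and `pip-compile` can forward it through `--pip-args`.

```
# Consider only wheels (no sdists) for every package.
# Whether this is the option the commenter meant is an assumption.
pip-compile --resolver=backtracking \
    --pip-args "--only-binary :all:" \
    requirements.in
```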
@webknjaz Yes, metadata is served per version; this is part of the problem. npm, for example, serves all dependencies for a package in a single request. Instead, pip-tools (pip) needs hundreds of requests just to check a package's dependencies (this is in my first message). The second problem is something that should be eliminated in the future: it was a really bad decision to allow code execution even for mostly static metadata. I don't know if this is the best place to discuss it, but it's a problem that directly affects pip-tools. It can't be properly fixed by pip-tools, but it is possible to reduce the backtracking depth, maybe by eagerly trying lower versions of dependents instead of just downgrading the first problematic package all the way to any old version.
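The request-count difference the comment describes can be sketched as follows. The version counts below are made up for illustration, but the shape of the comparison is the point: one metadata request per candidate version versus one per package.

```python
# Sketch (no real network calls): why per-version metadata is expensive.
# A PyPI-style resolver fetches dependency metadata once per candidate
# version, while an npm-style registry returns metadata for every
# version of a package in a single response.

def requests_per_version(packages: dict[str, int]) -> int:
    """One metadata request per (package, version) candidate."""
    return sum(packages.values())

def requests_bulk(packages: dict[str, int]) -> int:
    """One request per package, regardless of version count."""
    return len(packages)

# Hypothetical release counts; botocore alone has many hundreds.
candidates = {"boto3": 300, "botocore": 900, "s3transfer": 30}
print(requests_per_version(candidates))  # 1230
print(requests_bulk(candidates))         # 3
```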
It's not going to be possible to eliminate building sdists. Like ever. The reasons are explained in the article. It'd simply break the ecosystem. If you want to draft a PR with more concrete ideas, you can try. But I don't see anything actionable here. |
Let's assume I already have this `requirements.txt` file. Now I add `boto3` to `requirements.in`. The command `pip-compile --resolver=backtracking -v requirements.in` then tries to download every metadata file or every package file (if I am using PyPI servers directly, not caching servers on my local network). In this case it takes hours of downloading.
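A toy sketch of the behavior being described (the version numbers and the compatibility predicate are invented, and real resolvers are far more sophisticated): when the newest release conflicts, backtracking steps down one version at a time, and each step needs that version's metadata, i.e. one download.

```python
# Toy model: try candidate versions newest-first, counting the
# metadata fetches needed before a compatible version is found.

def find_compatible(versions, is_compatible):
    """Return (version, fetch_count) for the newest compatible version."""
    fetches = 0
    for version in sorted(versions, reverse=True):
        fetches += 1  # fetching this version's metadata/wheel
        if is_compatible(version):
            return version, fetches
    return None, fetches

# Only a very old release satisfies the pinned constraint set,
# so nearly every version gets downloaded before the match.
versions = list(range(1, 101))  # pretend the package has 100 releases
best, fetches = find_compatible(versions, lambda v: v <= 3)
print(best, fetches)  # 3 98
```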
From the perspective of the resolver, it is correct to try other versions.
I don't know whether this can be solved in some smart way. I might suggest not checking older versions if the installed package version is already higher than the maximal version allowed by botocore (in this case), but that would not be correct.
This is more of a discussion topic than a real bug report, because I think `pip-tools` behaves correctly. Maybe the best solution would be to extend the PyPI server metadata to include all dependencies of a package in a single request, correctly grouped into version ranges so that it would not be one gigantic file. Instead, it would contain sections like `'>=1.0,<1.2': {'dependencies': ...}`.
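The range-grouped metadata idea above can be sketched in miniature. The grouping format, the dependency lists, and the lookup function are all hypothetical (no such PyPI API exists); ranges are modeled as inclusive/exclusive bounds over dotted version strings.

```python
# Hypothetical range-grouped metadata: one dependency list is shared
# by every version inside a range, instead of being repeated per version.

GROUPED = {
    # ((low, low_inclusive), (high, high_inclusive)) -> dependencies
    (("1.0", True), ("1.2", False)): ["urllib3>=1.25", "jmespath"],
    (("1.2", True), ("2.0", False)): ["urllib3>=1.26", "jmespath"],
}

def _key(v: str) -> tuple[int, ...]:
    """Naive dotted-version parse; real metadata would need PEP 440."""
    return tuple(int(p) for p in v.split("."))

def deps_for(version: str):
    """Return the shared dependency list covering `version`, if any."""
    k = _key(version)
    for (lo, lo_inc), (hi, hi_inc) in GROUPED:
        low_ok = k >= _key(lo) if lo_inc else k > _key(lo)
        high_ok = k <= _key(hi) if hi_inc else k < _key(hi)
        if low_ok and high_ok:
            return GROUPED[((lo, lo_inc), (hi, hi_inc))]
    return None

print(deps_for("1.1.5"))  # ['urllib3>=1.25', 'jmespath']
```

A resolver consuming this format would answer "what does version X depend on?" with zero extra requests once the single grouped document is fetched.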