
Deprecates implicit cache keys #6083

Merged
merged 4 commits into master from mael/lockfile-cache-key
Jan 16, 2024

Conversation

arcanis
Member

@arcanis arcanis commented Jan 15, 2024

What's the problem this PR addresses?

The lockfile has a small optimization feature: instead of repeating the cache key across all checksums, it keeps a single global cache key at the top of the file, which it uses every time a checksum is missing a cache key. This saves a few bytes.

However, I found out this behaviour can cause breakages in some situations:

  • Alice creates a feature branch from main and adds a new dependency foo
  • Bob upgrades Yarn to a new version that compiles zip files differently
  • Alice merges the new main into her feature branch
  • Git merges the two lockfiles together; let's say there are no conflicts
  • The foo entry added by Alice has no cache key, since it was added at a time when its cache key matched the global one
  • However, it no longer matches the new global one, which was updated at the top of the file
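The merge hazard above can be sketched in a few lines. This is a hypothetical illustration (the names and types are not Yarn's actual API), assuming checksums are stored either as `<cacheKey>/<hash>` or as a bare `<hash>` that falls back to the global key:

```typescript
// Illustrative model of the implicit-key lookup, not Yarn's real data structures.
type Lockfile = {
  cacheKey: string;                  // global fallback key at the top of the file
  checksums: Record<string, string>; // package name → stored checksum
};

// Resolve the cache key a stored checksum was computed under.
function effectiveCacheKey(lockfile: Lockfile, checksum: string): string {
  const slash = checksum.indexOf(`/`);
  // No embedded key: fall back to the global one. This is the fragile step:
  // after a merge bumps `cacheKey`, old implicit entries silently change meaning.
  return slash === -1 ? lockfile.cacheKey : checksum.slice(0, slash);
}

// Alice's `foo` entry was added while the global key was `8`, so it was
// stored without an explicit key. After merging Bob's branch, the global
// key at the top of the file is `10`:
const merged: Lockfile = {cacheKey: `10`, checksums: {foo: `deadbeef`}};

// `foo` is now wrongly interpreted as a key-10 checksum, although it was
// computed under key 8:
effectiveCacheKey(merged, merged.checksums.foo); // returns `10`
```

The entry itself never changed, yet its meaning did, because the fallback it depends on was updated in a different part of the file.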

How did you fix it?

The cache key is now always serialized alongside each checksum, instead of being inferred from the global one when they match.
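Under the same illustrative model as above (hypothetical names, not Yarn's actual API), the fix amounts to never omitting the key when writing an entry:

```typescript
// Sketch of the fix: always embed the cache key in each stored checksum,
// even when it matches the current global one, so a merged lockfile can
// never reinterpret an entry under a newer global key.
function serializeChecksum(cacheKey: string, hash: string): string {
  return `${cacheKey}/${hash}`;
}

serializeChecksum(`8`, `deadbeef`); // returns `8/deadbeef`
```

The few bytes saved by the implicit form are no longer worth the risk of entries changing meaning across a merge.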

Checklist

  • I have set the packages that need to be released for my changes to be effective.
  • I will check that all automated PR checks pass before the PR gets reviewed.

@arcanis arcanis merged commit bd545d8 into master Jan 16, 2024
24 checks passed
@arcanis arcanis deleted the mael/lockfile-cache-key branch January 16, 2024 08:27