
Fix mypy errors attributed to pytorch_lightning.trainer.connectors.callback_connector.py #13750

Merged
merged 17 commits into from
Aug 8, 2022
Conversation

@krishnakalyan3 krishnakalyan3 commented Jul 20, 2022

Part of #13445

@github-actions github-actions bot added the pl Generic label for PyTorch Lightning package label Jul 20, 2022
@krishnakalyan3 krishnakalyan3 marked this pull request as draft July 20, 2022 05:37
@otaj otaj mentioned this pull request Jul 20, 2022
52 tasks
@otaj otaj left a comment

There are still a couple of parts that don't work well for mypy, even with the suggestions applied. In the method `_configure_accumulated_gradients`, this could be solved by differentiating between `grad_accum_callbacks` and `grad_accum_callback`, i.e. two separate variables: one for the case when it's a list, and a second for when it's a single instance.

otaj commented Jul 27, 2022

I think that `src/pytorch_lightning/trainer/connectors/callback_connector.py:269: error: Incompatible types in assignment (expression has type "Generator[pkg_resources.EntryPoint, None, None]", variable has type "Union[List[importlib.metadata.EntryPoint], Tuple[]]") [assignment]` will have to be silenced, because there's no reasonable fix for it.
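For illustration, the usual way to silence a single unfixable error of this kind is a targeted `# type: ignore[assignment]` on the offending line, rather than a blanket ignore. A hypothetical minimal reproduction (the variable names here are invented, not the ones in `callback_connector.py`):

```python
from typing import Iterator, List


def entry_points_compat() -> Iterator[int]:
    """Stand-in for an API that returns a generator on some
    Python/setuptools versions and a list/tuple on others."""
    yield from range(3)


# mypy infers `factories` as List[int] from this assignment...
factories: List[int] = []
# ...so reassigning a generator is an [assignment] error; the targeted
# ignore silences only this error code on only this line.
factories = entry_points_compat()  # type: ignore[assignment]

resolved = list(factories)
```

At runtime the comment is inert; `list(factories)` still drains the generator normally, so behavior is unchanged.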

krishnakalyan3 and others added 4 commits July 29, 2022 11:04
Co-authored-by: otaj <6065855+otaj@users.noreply.github.com>
Co-authored-by: otaj <6065855+otaj@users.noreply.github.com>
Co-authored-by: otaj <6065855+otaj@users.noreply.github.com>
@krishnakalyan3 krishnakalyan3 marked this pull request as ready for review July 29, 2022 08:09
@krishnakalyan3 (Author)

Thank you for the reviews @otaj

@krishnakalyan3 krishnakalyan3 requested review from otaj and removed request for SeanNaren July 29, 2022 09:14
@otaj otaj added this to the pl:1.7.x milestone Jul 29, 2022

codecov bot commented Jul 29, 2022

Codecov Report

Merging #13750 (00fa20a) into master (aefb9ab) will increase coverage by 14%.
The diff coverage is 100%.

@@            Coverage Diff            @@
##           master   #13750     +/-   ##
=========================================
+ Coverage      61%      75%    +14%     
=========================================
  Files         335      341      +6     
  Lines       26301    28040   +1739     
=========================================
+ Hits        16048    21111   +5063     
+ Misses      10253     6929   -3324     

@krishnakalyan3 krishnakalyan3 requested a review from otaj July 30, 2022 07:39
Co-authored-by: otaj <6065855+otaj@users.noreply.github.com>
@mergify mergify bot added the ready PRs ready to be merged label Aug 3, 2022
Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com>
@Borda Borda enabled auto-merge (squash) August 8, 2022 08:23
@Borda Borda merged commit 5271ed9 into Lightning-AI:master Aug 8, 2022
@awaelchli awaelchli modified the milestones: pl:1.7.x, pl:1.8 Aug 9, 2022
Labels
pl Generic label for PyTorch Lightning package ready PRs ready to be merged
5 participants