
Better visualization of the model using torchsummaryX #1833

Closed
IncubatorShokuhou opened this issue May 14, 2020 · 11 comments · Fixed by #4521
Labels
feature: Is an improvement or enhancement
help wanted: Open to be worked on
Milestone

Comments

@IncubatorShokuhou

🚀 Feature

Better visualization of params, size, etc. using torchsummaryX

Motivation

Currently in pytorch-lightning, the visualization of the model is generated from the layers listed in the __build_model function. However, the layers in that function are not necessarily used, or may be used more than once, which can make the visualization of the model incorrect.

Pitch

In my view, torchsummaryX could be integrated into pytorch-lightning.
TorchsummaryX is an improved visualization tool based on torchsummary. It visualizes kernel size, output shape, params, and mult-adds. It generates the model information by actually running the forward function, which makes the information more accurate. It also shows a bit more information than the ModelSummary function currently used in pytorch-lightning.
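To illustrate the "actually run forward" approach that torchsummaryX takes (as opposed to inspecting declared layers), here is a minimal, hedged sketch using plain PyTorch forward hooks. The function and field names are illustrative, not the torchsummaryX API:

```python
# Sketch: collect per-layer info by registering forward hooks and
# running one forward pass, so only layers that actually execute
# (and each time they execute) are recorded.
import torch
import torch.nn as nn

def summarize(model, example_input):
    rows = []
    hooks = []

    def make_hook(name):
        def hook(mod, inp, out):
            # count only this module's own parameters, not its children's
            n_params = sum(p.numel() for p in mod.parameters(recurse=False))
            rows.append({
                "layer": name,
                "type": type(mod).__name__,
                "out_shape": list(out.shape) if torch.is_tensor(out) else None,
                "params": n_params,
            })
        return hook

    for name, module in model.named_modules():
        if name:  # skip the root module itself
            hooks.append(module.register_forward_hook(make_hook(name)))
    with torch.no_grad():
        model(example_input)
    for h in hooks:
        h.remove()
    return rows

model = nn.Sequential(nn.Linear(8, 4), nn.ReLU(), nn.Linear(4, 2))
rows = summarize(model, torch.zeros(1, 8))
for r in rows:
    print(r["layer"], r["type"], r["out_shape"], r["params"])
```

A layer that is defined on the model but never called in forward simply produces no row, which is exactly the discrepancy the motivation section describes.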

Alternatives

Additional context

@IncubatorShokuhou IncubatorShokuhou added the feature (Is an improvement or enhancement) and help wanted (Open to be worked on) labels May 14, 2020
@Borda
Member

Borda commented May 14, 2020

cc: @PyTorchLightning/core-contributors

@awaelchli
Contributor

I have a PR in the works, #1773, that handles input/output shapes properly (probably very similar to torchsummary). With the LayerSummary class I added, it should be easy to extend it further to display weight shapes etc., without us needing to add a dependency on the torchsummaryX library. I suggest going down this path.
See also #1556

@stale

stale bot commented Jul 13, 2020

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

@stale stale bot added the won't fix This will not be worked on label Jul 13, 2020
@Borda
Member

Borda commented Jul 13, 2020

@awaelchli so is this done? :]

@stale stale bot removed the won't fix This will not be worked on label Jul 13, 2020
@awaelchli
Contributor

awaelchli commented Jul 13, 2020

Depends on how far we want to go. Currently our model summary shows the number of params, layer type, and input and output shapes.
@IncubatorShokuhou mentions that torchsummaryX also shows

  • parameter tensor shapes
  • mult-adds

If we don't want these, we can close the issue.
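For context on what the extra mult-adds column would involve, here is a hedged sketch of how mult-adds could be computed for two common layer types, using the standard formulas. The function name and the decision to return 0 for unhandled layers are illustrative choices, not how torchsummaryX or Lightning implements it:

```python
# Sketch: mult-adds (multiply-accumulate operations) per forward pass
# for Linear and Conv2d layers, given the layer and its output shape.
import torch.nn as nn

def mult_adds(module, out_shape):
    if isinstance(module, nn.Linear):
        # each of the out_features outputs needs in_features multiply-adds
        return module.in_features * module.out_features
    if isinstance(module, nn.Conv2d):
        # per output position: kernel_h * kernel_w * (in_channels / groups)
        # mult-adds, times out_channels and the spatial size of the output
        kh, kw = module.kernel_size
        _, c_out, h, w = out_shape
        return kh * kw * module.in_channels * c_out * h * w // module.groups
    return 0  # layers not handled by this sketch

print(mult_adds(nn.Linear(128, 64), (1, 64)))                      # 8192
print(mult_adds(nn.Conv2d(3, 16, 3, padding=1), (1, 16, 32, 32)))  # 442368
```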

@stale

stale bot commented Sep 11, 2020

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

@stale stale bot added the won't fix This will not be worked on label Sep 11, 2020
@Borda Borda removed the won't fix This will not be worked on label Sep 11, 2020
@dav-ell

dav-ell commented Oct 17, 2020

Was there not enough interest in this to continue? I'm interested in this being implemented.

@Borda
Member

Borda commented Oct 17, 2020

@dav-ell sure, do you want to take it over?

@awaelchli
Contributor

@dav-ell sure go ahead. Please add me as reviewer when you send the PR. cheers!

@george-gca
Contributor

Adding as a reminder for myself (or anyone who tackles this issue first): this should be made here. The total values calculation should probably be done before calling the function here, adding a new parameter to the function. The number of parameters for each layer is already stored in self.param_nums, but a num_trainable_parameters property would be useful in the LayerSummary class. This could be achieved by doing the same as num_parameters, but also checking whether p.requires_grad is True. I couldn't find a test for this feature, though. Maybe I missed something?
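A minimal sketch of the property suggested above, assuming a LayerSummary-like class that wraps a module. The class name and attribute layout here are illustrative, not the actual Lightning implementation:

```python
# Sketch: num_trainable_parameters mirrors num_parameters but filters
# on requires_grad, as proposed in the comment above.
import torch.nn as nn

class LayerSummarySketch:
    def __init__(self, module: nn.Module):
        self._module = module

    @property
    def num_parameters(self):
        # total parameter count for this module
        return sum(p.numel() for p in self._module.parameters())

    @property
    def num_trainable_parameters(self):
        # same as num_parameters, but only parameters with requires_grad=True
        return sum(p.numel() for p in self._module.parameters() if p.requires_grad)

layer = nn.Linear(8, 4)
layer.bias.requires_grad = False  # freeze the bias to show the difference
s = LayerSummarySketch(layer)
print(s.num_parameters, s.num_trainable_parameters)  # 36 32
```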

@stale

stale bot commented Nov 18, 2020

This issue has been automatically marked as stale because it hasn't had any recent activity. This issue will be closed in 7 days if no further activity occurs. Thank you for your contributions, Pytorch Lightning Team!

@stale stale bot added the won't fix This will not be worked on label Nov 18, 2020
@Borda Borda added this to the 1.2 milestone Nov 19, 2020
@stale stale bot removed the won't fix This will not be worked on label Nov 19, 2020
@Borda Borda modified the milestones: 1.2, 1.1 Nov 30, 2020