
How is LRP used in ResNet? #192

Open
syomantak opened this issue Jun 8, 2020 · 3 comments

@syomantak

This is actually a theoretical question. How are the values propagated back in a ResNet where there is a skip connection?
Is the value divided equally between the neurons of the previous layer and the neurons of the layer the skip connection comes from? If so, then the assertion that the sum of the relevance of all neurons is equal in every layer would be wrong, since some relevance has 'leaked' to the earlier layers through the skip connection.

@bernerprzemek

I don't know if this is correct, but in my case I just add the branches' relevance and simply divide by 2. Theoretically both branches receive the same input (and their outputs are later added together), so the relevance should be the sum of the branch values divided by the number of branches.
But it is a very interesting question how to treat this situation. And also (I'm trying LRP on EfficientNet) how to treat skip connections that end with a multiplication. I have been trying to find an answer for some time, but it seems that LRP is usually used to explain sequential architectures.
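For concreteness, here is a minimal numpy sketch of one way to read the equal-split idea above at the merge of a residual block y = f(x) + x. All numbers and names (`R_y`, `R_f_branch`, `R_skip`) are made up for illustration, not taken from this library:

```python
import numpy as np

# Hypothetical relevance arriving at the output of a residual block y = f(x) + x.
R_y = np.array([0.4, 0.1, 0.5])

# Equal-split heuristic: each of the two branches (the residual branch f(x)
# and the identity skip branch) receives half of the output relevance.
R_f_branch = R_y / 2
R_skip = R_y / 2

# The split itself does not lose relevance at the merge point:
assert np.isclose((R_f_branch + R_skip).sum(), R_y.sum())

# Propagating R_f_branch back through the layers of f is ordinary sequential LRP
# (not shown); the relevances arriving from both branches are then summed again
# at the point where the skip connection forks off from x.
```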

@syomantak
Author

@bernerprzemek But if what you say is true, then the so-called 'identity', that the sum of relevance values in each layer remains constant, would no longer hold! Anyway, I have used a different XAI method for my research, so I do not need the answer as such.

@bernerprzemek

Yes, probably not, but keep in mind that a residual connection isn't simply y = f(x) but y = f(x) + x, and this fact has to be accounted for in the conservation rule. The same happens in a batch normalization layer, where the sum of all relevance values is no longer the same as in the previous layer.
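One common way to keep conservation across the merge is to treat the addition like any other linear layer and split the relevance in proportion to each summand's contribution, as in the LRP epsilon rule. A minimal sketch, with made-up activations and relevance values:

```python
import numpy as np

eps = 1e-9  # small stabilizer, as in the LRP epsilon rule

# Hypothetical activations at a residual merge y = f(x) + x (element-wise add).
fx = np.array([1.0, -0.5, 2.0])   # output of the residual branch f(x)
x = np.array([0.5, 1.5, 1.0])     # identity / skip branch
y = fx + x

# Relevance assigned to y by the layers above (made-up numbers).
R_y = np.array([0.4, 0.1, 0.5])

# Each summand receives relevance in proportion to its contribution to y,
# so the conservation rule stays intact across the merge.
R_f = fx / (y + eps * np.sign(y)) * R_y
R_x = x / (y + eps * np.sign(y)) * R_y

assert np.isclose((R_f + R_x).sum(), R_y.sum())  # total relevance is conserved
```

With this rule the relevance flowing into the skip branch is not an arbitrary half of the output relevance but exactly the share that the identity path contributed, which is what makes the per-layer sums add up again.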
