
Fix is_layer_at_idx for LRP #308

Merged (3 commits) on May 5, 2023
Conversation

@Rubinjo (Contributor) commented on Feb 10, 2023

Fixes the is_layer_at_idx function by binding the loop variable i to the lambda function and looping through the model's layers to find the corresponding index.

Closes #264
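
For context, the fix addresses Python's late binding of loop variables in closures: lambdas created in a loop capture the variable by reference, so without binding, every lambda ends up seeing the final value of i. A minimal, self-contained illustration (not the package's actual code):

late_bound = [lambda: i for i in range(3)]
print([f() for f in late_bound])  # [2, 2, 2] - every lambda sees the final value of i

bound = [lambda i=i: i for i in range(3)]
print([f() for f in bound])       # [0, 1, 2] - the default argument captures i at definition time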

Example of fix

I used a simple CNN to test the implementation; the model summary is shown below.

[image: model summary]

I used the following LRP settings.

analyzer = LRP(
    model,
    rule="Z",                    # default LRP rule for the remaining layers
    input_layer_rule="Flat",     # rule applied to the input layer
    until_layer_idx=3,           # layer index up to which until_layer_rule is used
    until_layer_rule="Epsilon",  # rule for the early layers, up to until_layer_idx
)
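
As an aside, a minimal usage sketch of applying the analyzer (the input shape below is illustrative and would need to match the model):

import numpy as np

x = np.random.rand(1, 28, 28, 1).astype(np.float32)  # dummy batch; shape must match the model's input
relevance = analyzer.analyze(x)                       # LRP relevance map, same shape as the input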

For verification, I printed each rule object and the layer it is applied to.

[image: layer and rule printout]

As you can see, all three rules are applied.

Discussion

The until_layer_idx argument now counts every layer in your model, including Input, Reshape, Pooling, etc.
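
A quick way to check which index each layer gets is to enumerate the model's layers (a minimal sketch, assuming a Keras model object named model):

for i, layer in enumerate(model.layers):
    print(i, layer.__class__.__name__)  # e.g. 0 InputLayer, 1 Conv2D, 2 MaxPooling2D, ...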

@adrhill (Collaborator) commented on Feb 27, 2023

Thanks for the contribution, this looks good!

Could you add your example to the package tests, e.g. by adding a new file tests/backend/test_layer_idx.py?
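
For illustration, a minimal sketch of what such a test could look like (not the PR's actual test; it assumes LRP is importable from innvestigate.analyzer and a TF2 Keras backend):

import numpy as np
from tensorflow import keras
from innvestigate.analyzer import LRP

def test_until_layer_idx_rules():
    model = keras.Sequential([
        keras.layers.Conv2D(4, 3, activation="relu", input_shape=(8, 8, 1)),
        keras.layers.MaxPooling2D(),
        keras.layers.Flatten(),
        keras.layers.Dense(2),
    ])
    analyzer = LRP(
        model,
        rule="Z",
        input_layer_rule="Flat",
        until_layer_idx=2,          # Epsilon up to this layer index, Z afterwards
        until_layer_rule="Epsilon",
    )
    x = np.random.rand(1, 8, 8, 1).astype(np.float32)
    relevance = analyzer.analyze(x)
    assert relevance.shape == x.shape  # LRP returns a relevance value per input element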

@Rubinjo (Contributor, Author) commented on Feb 28, 2023

I have added a basic test similar to the example I discussed in my previous comment.

@adrhill (Collaborator) commented on May 5, 2023

Looks good to me and tests passed locally. Thanks! :)

@adrhill merged commit 98260ea into albermax:master on May 5, 2023
Closes: is_layer_at_idx not implemented (#264)