
[BUG] Getting negative LRP relevance value for Alpha_1_beta_0 rule when analyzing the ResNet model #326

Open
n33lkanth opened this issue Jan 28, 2024 · 0 comments
Labels
triage Bug report that needs assessment

Comments


n33lkanth commented Jan 28, 2024

Hi,
I am having trouble visualizing heatmaps when I train a ResNet model and use LRP on a cats-and-dogs image dataset. Attached below are the heatmap of the LRP relevance (analyzer output, left), the masked LRP overlaid on the original image (right), and a distribution plot of the relevance matrix (analyzer output relevance values). For many images I get negative relevances even though I use the alpha_1_beta_0 rule, which should consider only positive relevances. I do not see this issue with a simple CNN model or with VGG16. What could cause negative relevance in ResNet under the alpha_1_beta_0 rule, and how can I fix it? Kindly help me fix it.

Thanks in advance

# ResNet Tiny Model
import numpy as np
import tensorflow as tf
from tensorflow.keras.layers import Input, Conv2D, BatchNormalization, ReLU, Add, AveragePooling2D, Flatten, Dense
from tensorflow.keras.models import Model
from tensorflow.keras.utils import plot_model

def residual_block(x, filters, stride=1):
    """Residual block: two convs plus an identity/projection shortcut."""
    shortcut = x

    # First conv (1x1) carries the requested stride
    x = Conv2D(filters, kernel_size=(1, 1), strides=stride, padding='same', data_format='channels_last')(x)
    x = BatchNormalization()(x)
    x = ReLU()(x)

    # Second conv (3x3), stride 1
    x = Conv2D(filters, kernel_size=(3, 3), padding='same')(x)
    x = BatchNormalization()(x)

    # Projection shortcut when the spatial size changes
    if stride != 1:
        shortcut = Conv2D(filters, kernel_size=(1, 1), strides=stride)(shortcut)
        shortcut = BatchNormalization()(shortcut)

    x = Add()([x, shortcut])
    x = ReLU()(x)
    return x

def tiny_resnet(input_shape, num_classes, **modelParams):
    print('input_shape: ', input_shape, input_shape[1:]) #input_shape: (1800, 124, 124, 3)
    inputs = Input(shape=input_shape[1:])

    x = Conv2D(64, kernel_size=(5, 5), strides=2, padding='same')(inputs)
    x = BatchNormalization()(x)
    x = ReLU()(x)
    x = AveragePooling2D(pool_size=(3, 3), strides=2, padding='same', data_format='channels_last')(x)

    # Only one residual block
    x = residual_block(x, 64)

    x = AveragePooling2D(pool_size=(3, 3))(x)

    x = Flatten()(x)
    x = Dense(num_classes, activation='softmax')(x)

    model = Model(inputs=inputs, outputs=x)
    plot_model(model, to_file='tiny_resnet_diagram.png', show_shapes=True, show_layer_names=True)
    return model
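One plausible source of the sign flip (a sketch of the general LRP mechanics, not a statement about iNNvestigate's internals): alpha_1_beta_0 guarantees non-negative input relevance only for the layers it is applied to, such as Conv2D and Dense. At an Add (skip-connection) layer, relevance is commonly redistributed in proportion to each branch's signed contribution, and when the branches have opposite signs this split produces negative relevance even though the incoming relevance was non-negative:

```python
import numpy as np

rng = np.random.default_rng(0)

# --- alpha_1_beta_0 through a Dense layer: relevance stays non-negative ---
def lrp_alpha1_beta0_dense(x, W, R_out, eps=1e-9):
    """Redistribute R_out to the inputs using only positive contributions z_ij^+."""
    zp = np.maximum(x[:, None] * W, 0.0)      # positive parts of x_i * w_ij
    denom = zp.sum(axis=0) + eps              # normalizer per output neuron
    return (zp * (R_out / denom)[None, :]).sum(axis=1)

x = rng.normal(size=5)
W = rng.normal(size=(5, 3))
R_out = np.abs(rng.normal(size=3))            # non-negative upper-layer relevance
R_in = lrp_alpha1_beta0_dense(x, W, R_out)
assert (R_in >= 0).all()                      # guaranteed: every factor is >= 0

# --- proportional split at an Add (skip) layer can go negative ---
a, b = 3.0, -1.0                              # branch outputs, y = a + b = 2.0
R_y = 1.0                                     # non-negative relevance at the sum
R_a = R_y * a / (a + b)                       #  1.5
R_b = R_y * b / (a + b)                       # -0.5  <- negative relevance appears
print(R_a, R_b)
```

If this is what happens inside the ResNet, the negative values would originate at the Add layers (and possibly BatchNormalization, which also mixes signs), not at the conv layers where the alpha-beta rule applies.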

Expected behavior

I expect only positive relevance values, since I am using the lrp.alpha_1_beta_0 rule.
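To quantify the deviation from the expected behavior, a small helper (hypothetical, assuming the analyzer output is a numpy array) can report what fraction of the total absolute relevance is negative, which helps distinguish a few numerical-noise pixels from a systematic sign problem:

```python
import numpy as np

def negative_fraction(relevance):
    """Fraction of total absolute relevance mass that is negative."""
    neg = -relevance[relevance < 0].sum()
    total = np.abs(relevance).sum()
    return float(neg / total) if total > 0 else 0.0

# Toy relevance map: one negative entry out of 1.0 total absolute mass
R = np.array([[0.4, -0.1], [0.5, 0.0]])
print(negative_fraction(R))   # 0.1
```

A value near zero would suggest harmless numerical noise; a large fraction points to a structural cause in how relevance is propagated through the network.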

Screenshots

Attached: 12136_cat (heatmap and overlay), 12136_cat_dist_preNorm (relevance distribution plot)

Original test Image:
12136

Platform information

  • OS: Windows 11
  • Python version: 3.8
  • iNNvestigate version: 1, and also v2.1.2
  • TensorFlow version: 2.5

Model Trained on Cat and Dog Dataset

model.zip
