
Smoothgrad outputs only zeros for a batch of inputs #246

Open
nkoenen opened this issue May 5, 2021 · 0 comments
nkoenen commented May 5, 2021

Hello!

I tried to analyze a batch of inputs with the 'smoothgrad' method and neuron_selection_mode = "index" in your package iNNvestigate (version 1.0.9). Surprisingly, only the output for the first input in the batch made any sense: the gradients for all other inputs in the batch were set to 0. I think there is something wrong in the code.

Here is a small example:
I generated a classification dataset with sklearn with 5 input features and 2 classes and trained a Keras model with 3 dense layers on it. I then removed the softmax activation and created analyzers for the methods 'gradient' and 'smoothgrad'. For the smoothgrad method I set the noise scale to a very small value (noise_scale = 1e-8), which should make the results almost identical to the plain gradients. Below you can find the code for my example:

import numpy as np
np.random.seed(1234)

from sklearn.datasets import make_classification

import keras # version 2.2.4
import innvestigate # version 1.0.9
import innvestigate.utils as iutils

n_inputs = 5
n_classes = 2
n_samples = 512

# Generate classification data
data_x, data_y = make_classification(n_samples=n_samples, n_features=n_inputs, n_classes=n_classes)
data_y = keras.utils.to_categorical(data_y)

# Define keras model and fit it
model = keras.models.Sequential([
    keras.layers.Dense(16, activation="relu", input_shape=(n_inputs,)),
    keras.layers.Dense(8, activation="relu"),
    keras.layers.Dense(n_classes, activation="softmax"),
])
model.compile(loss="categorical_crossentropy", optimizer="adam", metrics=["accuracy"])
model.fit(data_x, data_y, epochs=25, batch_size=128)


# Start the analysis with iNNvestigate
model = iutils.keras.graph.model_wo_softmax(model)

# Set the noise level sufficiently low such that almost identical values should come out for the methods
# 'gradient' and 'smoothgrad'
noise_scale = 1e-8

# Define the analyzer for each method
analyzer_smoothgrad = innvestigate.create_analyzer("smoothgrad",
                                                   model,
                                                   noise_scale=noise_scale,
                                                   neuron_selection_mode="index")
analyzer_gradient = innvestigate.create_analyzer("gradient",
                                                 model,
                                                 neuron_selection_mode="index")

# One input for output neuron '0'
result_smoothgrad = analyzer_smoothgrad.analyze(data_x[1:2, :], 0)
result_gradient = analyzer_gradient.analyze(data_x[1:2, :], 0)
print("Mean squared error (one input): {:4f}".format(np.mean((result_gradient - result_smoothgrad)**2)))

# Multiple inputs for output neuron '0'
result_smoothgrad = analyzer_smoothgrad.analyze(data_x[1:3, :], 0)
result_gradient = analyzer_gradient.analyze(data_x[1:3, :], 0)
print("Mean squared error (multiple inputs): {:4f}".format(np.mean((result_gradient - result_smoothgrad)**2)))

print("Result smoothgrad:")
print(result_smoothgrad)

print("Result gradient:")
print(result_gradient)

Here is the output of the example. As you can see, the 'smoothgrad' result for the second input makes no sense, it is all zeros:

Mean squared error (one input): 0.000000
Mean squared error (multiple inputs): 0.225435
Result smoothgrad:
[[ 0.50317013  0.72440994  0.59556705 -1.5030893  -0.39739025]
 [ 0.          0.          0.          0.          0.        ]]
Result gradient:
[[ 0.5031703   0.72440976  0.5955669  -1.5030882  -0.39739007]
 [ 0.46457082 -0.31459585  0.5895394  -1.172667   -0.46567515]]
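For reference, here is a minimal pure-NumPy sketch of the behavior I would expect. It does not use iNNvestigate; the toy model f(x) = Σᵢ wᵢxᵢ² and all names in it are my own illustration. SmoothGrad averages the gradient over noisy copies of each input, so with a tiny noise scale it should match the plain gradient for every row of the batch, not only the first:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy differentiable model: f(x) = sum_i w_i * x_i**2 (per sample).
# Its exact input gradient is 2 * w * x.
w = rng.normal(size=5)

def gradient(x):
    """Exact gradient of f with respect to each input row."""
    return 2.0 * w * x

def smoothgrad(x, noise_scale=1e-8, n_samples=50):
    """Average the gradient over noisy copies of every row in the batch."""
    grads = [gradient(x + rng.normal(scale=noise_scale, size=x.shape))
             for _ in range(n_samples)]
    return np.mean(grads, axis=0)

x = rng.normal(size=(2, 5))   # a batch of two inputs
sg = smoothgrad(x)
g = gradient(x)

# With a tiny noise scale, SmoothGrad agrees with the plain gradient
# for *both* rows; in particular the second row must not be all zeros.
print(np.allclose(sg, g, atol=1e-6))
```

This is exactly the relation the test with noise_scale = 1e-8 above probes: the mean squared error should be near zero for every input in the batch.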

I hope I didn't make a mistake and that I could help you to improve this wonderful package.

Best regards,
Niklas

@adrhill adrhill self-assigned this May 18, 2021
@adrhill adrhill added the bug label Jun 25, 2021