
Negative outputs in Deconvnet and Guided BackProp #235

Open

venki-lfc opened this issue Jan 12, 2021 · 0 comments
Hello Alber,

I am trying to analyse a simple model using DeconvNet and Guided Backprop, and I have a few doubts regarding the implementation of these two methods.

My model looks like this:

import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(30, activation='relu', input_shape=(22,)),
    tf.keras.layers.Dense(3, activation='relu'),
    tf.keras.layers.Dense(1)
])
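
For completeness, this is roughly how I run the analysis (a minimal sketch; `x` here is just a placeholder for one of my 22-feature inputs, and I am assuming the tf.keras-compatible development version of innvestigate):

import numpy as np
import innvestigate

# The final Dense layer is linear (no softmax), so the model can be
# analyzed directly without stripping the output activation.
analyzer = innvestigate.create_analyzer("guided_backprop", model)

x = np.random.normal(size=(1, 22)).astype("float32")  # placeholder input
analysis = analyzer.analyze(x)  # shape (1, 22): one relevance value per input feature
print(analysis)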

When I analyse the model using GuidedBackProp, one of my results looks like this:

[ 0.11697155 -0.22231908 0.02728167 0.23650022 0.2358752 -0.01217415 -0.26119494 -0.16302212 -0.05665719 -0.06339239 -0.03417799 0.28304198 0.00337575 -0.11469954 -0.11458022 0.02820442 0.34076804 0.22410852 0.08049499 0.18145919 -0.30921796 0.07817852]

As you can see, there are both positive and negative values. Here is my doubt: as per my understanding, based on Springenberg et al., Guided Backprop propagates back only positive gradients, since ReLU is applied in both the forward and backward passes.

As per Figure 1 in https://arxiv.org/pdf/1412.6806.pdf, the final output shouldn't contain any negative values, but in our case it does (for both Guided Backprop and DeconvNet). Could you please tell me why this is happening?
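
To pin down where the negative signs could come from, here is a minimal NumPy sketch of the backward rule as I understand it for a single Dense+ReLU layer (illustrative only, not innvestigate code; all names are made up):

import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(1, 4))    # layer input
W = rng.normal(size=(4, 3))    # Dense weights (can be negative)
z = x @ W                      # pre-activation; forward pass applies relu(z)

g_out = rng.normal(size=(1, 3))      # signal arriving from the layer above
g_z = g_out * (z > 0) * (g_out > 0)  # guided rule: forward ReLU mask AND keep only positive signal
g_x = g_z @ W.T                      # chain rule through the weights

print(g_x)  # non-negative g_z can still produce negative entries via negative weights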

Question 2: Instead of ReLU, if I want to analyse models with tanh activations, will the following implementation work for Guided Backprop?

def guided_backprop_mapping(X, Y, bp_Y, bp_state):
    # Apply tanh to the backpropagated signal, then chain it through the layer.
    tmp = tf.compat.v1.nn.tanh(bp_Y)
    return tf.compat.v1.gradients(Y, X, grad_ys=tmp)

class GuidedBackProp(innvestigate.analyzer.base.ReverseAnalyzerBase):

    def _create_analysis(self, *args, **kwargs):
        # Use the custom mapping for every layer that contains a tanh activation.
        # tf_to_keras_mapping is the wrapper helper from the innvestigate
        # development examples.
        self._add_conditional_reverse_mapping(
            lambda layer: innvestigate.utils.keras.checks.contains_activation(layer, "tanh"),
            tf_to_keras_mapping(guided_backprop_mapping),
            name="guided_backprop_reverse_tanh_layer",
        )
        return super(GuidedBackProp, self)._create_analysis(*args, **kwargs)
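
If it helps, this is how I would instantiate and run it (a sketch; assumes `model` and `x` from above):

analyzer = GuidedBackProp(model)
analysis = analyzer.analyze(x)  # x: a (batch, 22) input array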

Thank you very much in advance.

Best regards,
Venki
