
question: why replace keras.activations.relu with tf.nn.relu #35

Open
lujingqiao opened this issue Apr 1, 2021 · 0 comments

Comments


lujingqiao commented Apr 1, 2021

First, thanks for sharing the code, I learned a lot.
But I have a question: can you explain why you replace keras.activations.relu with tf.nn.relu?
Thank you!

```python
def modify_backprop(model, name):
    g = tf.get_default_graph()
    # map the gradient of every 'Relu' op to the override registered under `name`
    with g.gradient_override_map({'Relu': name}):

        # get layers that have an activation
        layer_dict = [layer for layer in model.layers[1:]
                      if hasattr(layer, 'activation')]

        # replace the Keras relu activation with the raw TF op
        for layer in layer_dict:
            if layer.activation == keras.activations.relu:
                layer.activation = tf.nn.relu

        # re-instantiate a new model inside the override scope
        new_model = VGG16(weights='imagenet')
    return new_model
```
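
For context, a helper like this is normally paired with a custom gradient registered under the name that gets mapped onto `'Relu'`. Below is a minimal usage sketch, assuming TF 1.x graph mode; the `GuidedBackProp` name and the gradient function are illustrative assumptions and may not match this repo exactly.

```python
import tensorflow as tf
import keras
from keras.applications.vgg16 import VGG16
from tensorflow.python.framework import ops

# Illustrative guided-backprop gradient: pass only positive gradients
# through positive activations. The op name "GuidedBackProp" is an
# assumption, not necessarily the name used in this repo.
@ops.RegisterGradient("GuidedBackProp")
def _guided_backprop(op, grad):
    dtype = op.inputs[0].dtype
    return (grad
            * tf.cast(grad > 0., dtype)
            * tf.cast(op.inputs[0] > 0., dtype))

model = VGG16(weights='imagenet')
# Rebuild the model with its 'Relu' gradients routed through the override above.
guided_model = modify_backprop(model, 'GuidedBackProp')
```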