
se should be added after inception block, right? #1

Open
dpcross opened this issue Jan 30, 2018 · 4 comments

Comments

@dpcross

dpcross commented Jan 30, 2018

Hi, I think squeeze_excitation_layer should be added after the inception block, not the inception_resnet block.
Am I right?

@taki0112
Owner

You mean Reduction A, B, C?

@dpcross
Author

dpcross commented Jan 30, 2018

@taki0112 Not Reduction A, B, C. I mean it should be like:
x = squeeze_excitation_layer(x)
x = x * 0.1
x = init + x
That is, the SE layer should not be added after the whole inception_resnet_a, b, c block.
I changed the structure this way and it performs much better.
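Since the three lines in the comment are only a fragment, here is a minimal NumPy sketch of the proposed ordering: SE is applied to the branch output, then the Inception-ResNet residual scaling, then the addition with the shortcut. The `ratio=4` bottleneck and the random weight initialization are illustrative assumptions, not the repository's actual values:

```python
import numpy as np

def squeeze_excitation_layer(x, ratio=4, rng=None):
    """Squeeze-and-excitation on a feature map x of shape (H, W, C)."""
    rng = np.random.default_rng(0) if rng is None else rng
    c = x.shape[-1]
    # Squeeze: global average pool over the spatial dims -> (C,)
    z = x.mean(axis=(0, 1))
    # Excitation: bottleneck FC -> ReLU -> FC -> sigmoid (weights are toy values)
    w1 = rng.standard_normal((c, c // ratio)) * 0.1
    w2 = rng.standard_normal((c // ratio, c)) * 0.1
    s = np.maximum(z @ w1, 0.0) @ w2
    s = 1.0 / (1.0 + np.exp(-s))   # per-channel weights in (0, 1)
    return x * s                   # recalibrate channels

def inception_resnet_block(init, branch):
    """SE on the branch output *before* the scaled residual add,
    as proposed in this comment."""
    x = squeeze_excitation_layer(branch)
    x = x * 0.1        # residual scaling used in Inception-ResNet
    return init + x

# Minimal usage with dummy feature maps:
init = np.ones((8, 8, 16))
branch = np.ones((8, 8, 16))
out = inception_resnet_block(init, branch)
print(out.shape)  # (8, 8, 16)
```

Because the sigmoid gate is strictly in (0, 1), the SE-scaled residual here only ever shrinks the branch contribution before it is added to the shortcut.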

@taki0112
Owner

It does not matter where you put the SE block.
In the Keras code, the author puts it after the inception block you mentioned and also after the reduction block.
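To make the two placements being debated concrete, a toy contrast follows, using a simplified parameter-free SE stand-in (the real layer has two FC layers; this version just gates each channel by a sigmoid of its mean). Option (a) is the ordering proposed above; option (b) applies SE to the whole block output after the residual add:

```python
import numpy as np

def se(x):
    # Simplified stand-in for squeeze-and-excitation:
    # pool to per-channel means, sigmoid, rescale the channels.
    s = 1.0 / (1.0 + np.exp(-x.mean(axis=(0, 1))))
    return x * s

init = np.ones((4, 4, 8))
branch = np.ones((4, 4, 8))

# (a) SE inside the residual branch, before the scaled add:
out_a = init + se(branch) * 0.1

# (b) SE applied to the whole block output, after the add:
out_b = se(init + branch * 0.1)

print(out_a.shape, out_b.shape)
```

The two orderings produce different activations: in (a) the shortcut passes through untouched, while in (b) the gate rescales the shortcut as well.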

@dpcross
Author

dpcross commented Jan 30, 2018

Actually, I am not quite sure whether it matters or not, but the ResNet module is shown here: https://github.com/hujie-frank/SENet/blob/master/figures/SE-ResNet-module.jpg.
So I thought defining the module like this might be better, and it did work in my project.
