PSPNet final conv #29
Hi @NiklasDL,
P.S. If you test both cases, please let me know your results. Thanks. 😄
Hi @qubvel, it will take some time because my model trains for almost two weeks, but I will let you know :)
Hi @qubvel, the difference between the two cases is only slightly noticeable. Please find the results for both models on the same random subset of 200 test images (binary segmentation problem, IoU-threshold-optimized results). The training data contains roughly 600k images.
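For context, "IoU-threshold-optimized" can be read as sweeping the binarization threshold and keeping the score at the best one. A minimal sketch of that idea (the function names and the threshold grid are illustrative assumptions, not the code actually used here):

```python
import numpy as np

def iou(pred_probs, target, threshold):
    """IoU of a thresholded probability map against a binary mask."""
    pred = pred_probs >= threshold          # binarize predictions
    target = target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()
    return intersection / union if union else 1.0  # both masks empty: perfect score

def best_threshold_iou(pred_probs, target, thresholds=np.linspace(0.1, 0.9, 17)):
    """Pick the threshold that maximizes IoU over an illustrative grid."""
    scores = [iou(pred_probs, target, t) for t in thresholds]
    best = int(np.argmax(scores))
    return thresholds[best], scores[best]
```

In practice the threshold would be chosen on a validation set and then applied unchanged to the 200 test images.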
Hi @NiklasDL, |
Hi @qubvel,
are you sure that the final conv layer of your PSPNet implementation is correct?
The model summary shows the following for the last couple of layers:
```
concatenate_1 (Concatenate)     (None, 48, 48, 1152)  0       stage3_unit1_relu1[0][0]
                                                              resize_image_1[0][0]
                                                              resize_image_2[0][0]
                                                              resize_image_3[0][0]
                                                              resize_image_4[0][0]
conv_block_conv (Conv2D)        (None, 48, 48, 512)   589824  concatenate_1[0][0]
conv_block_bn (BatchNormalizati (None, 48, 48, 512)   2048    conv_block_conv[0][0]
conv_block_relu (Activation)    (None, 48, 48, 512)   0       conv_block_bn[0][0]
spatial_dropout2d_1 (SpatialDro (None, 48, 48, 512)   0       conv_block_relu[0][0]
final_conv (Conv2D)             (None, 48, 48, 1)     4609    spatial_dropout2d_1[0][0]
resize_image_5 (ResizeImage)    (None, 384, 384, 1)   0       final_conv[0][0]
sigmoid (Activation)            (None, 384, 384, 1)   0       resize_image_5[0][0]
```
conv_block_conv has 589,824 params, which I guess originate from 512 convolutions of size 1 × 1 × 1152.
And final_conv (Conv2D) has 4,609 params, likely from a single 3 × 3 × 512 convolution (plus one bias).
Shouldn't it be the other way around: conv_block_conv using 3 × 3 convs and the final layer using 1 × 1 convs? Sorry if I am mistaken here.
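Both parameter counts can be checked by hand; a small helper (written just for this issue, not part of the library) reproduces the two numbers from the summary:

```python
def conv2d_params(kernel_h, kernel_w, in_channels, filters, use_bias=True):
    """Trainable parameters in a Conv2D layer: weights plus optional biases."""
    weights = kernel_h * kernel_w * in_channels * filters
    biases = filters if use_bias else 0
    return weights + biases

# conv_block_conv: 512 filters of 1x1 over 1152 input channels, no bias
# (a bias would be redundant because BatchNorm follows)
print(conv2d_params(1, 1, 1152, 512, use_bias=False))  # 589824

# final_conv: 1 filter of 3x3 over 512 input channels, with bias
print(conv2d_params(3, 3, 512, 1, use_bias=True))      # 4609
```

So the summary is consistent with a 1 × 1 conv_block_conv and a 3 × 3 final_conv, which is exactly the ordering being questioned.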