When running train_id_net_res_market_new, the missing 'label' layer causes an error, so these lines need to be uncommented when training the base branch.
When running train_id_net_res_market_align, this code block should be commented out; otherwise another error is thrown saying that the der of layer 'local_fc751' is empty. But commenting this block out also causes an error saying that the 'label_local' layer is missing.
I hope for better instructions for this code.
Hi,
You're right. I added more explanation in the README.
Does the problem still appear?
When training the basic network, we need the loss and dropout layers.
When training the whole network, we need to remove the loss in resnet52_market.m. We just want the net structure and the params from ImageNet.
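For the step above, a minimal sketch of stripping the loss (and dropout) layers from the pre-trained base net with the MatConvNet DagNN API might look like this — the layer names `'softmaxloss'` and `'dropout'` are assumptions, not taken from the repo:

```matlab
% Hedged sketch: load the base net and drop layers that are only
% needed when training the basic network. Adjust the names to match
% the actual layers defined in resnet52_market.m.
net = dagnn.DagNN.loadobj(load('resnet52_market.mat'));
for name = {'softmaxloss', 'dropout'}   % assumed layer names
    if ~isnan(net.getLayerIndex(name{1}))
        net.removeLayer(name{1});       % keeps structure and params only
    end
end
```

`removeLayer` detaches the layer while leaving the remaining structure and the ImageNet-initialized parameters intact, which matches the intent described above.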
In net2, we need the top1 loss for label_local; in net3, we don't need the losses. If that's right, there is a conflict when training the align net, which contains both net2 and net3. My solution is to comment out the loss in resnet52_market and add the loss layer only for net2 in train_id_net_res_market_align. Training then runs successfully, but this differs from the original code, so I am still unsure whether this solution is correct.
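The workaround described above — attaching a loss only to the net2 branch — could be sketched as follows; the variable names (`'local_fc751'`, `'label_local'`, `'objective_local'`) follow this thread, but the exact names in the repo may differ:

```matlab
% Hedged sketch: add a softmax-log loss for the net2 branch only,
% wiring the local_fc751 prediction to the label_local input.
net.addLayer('loss_local', ...
    dagnn.Loss('loss', 'softmaxlog'), ...
    {'local_fc751', 'label_local'}, ...   % inputs: prediction, label
    'objective_local');                   % output: loss value
```

Since net3 gets no loss layer, its gradients come only through the shared branches, which avoids the empty-der error while still training net2.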
In resnet52_market, the fc and softmax layers are commented out.