The PyTorch model in shadow_metric.ipynb uses nn.CrossEntropyLoss, which expects unnormalized logits. However, the model outputs probabilities because of a trailing nn.Softmax, which prevents it from reaching 100% accuracy on the training set.
Additionally, criterions in PyTorch typically take arguments in the order logits, targets, but the code passes targets, logits. This is not a functional concern at the moment because targets contains class probabilities (rather than class indices), but it will likely become an issue once the bug above is fixed.
Both of these issues also exist in avg_loss_training_algo.ipynb.
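For reference, here is a minimal sketch of both fixes. This is not the tutorial's actual model; the layer sizes and batch are made up for illustration. The key points are that the forward pass ends at raw logits (no nn.Softmax, since nn.CrossEntropyLoss applies log-softmax internally) and that the criterion is called with (logits, targets) in that order.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Hypothetical stand-in for the tutorial's model: no nn.Softmax at the
# end -- the forward pass returns raw logits, as nn.CrossEntropyLoss
# expects (it applies log-softmax internally).
model = nn.Linear(4, 3)  # made-up input/output sizes for illustration

criterion = nn.CrossEntropyLoss()

x = torch.randn(8, 4)
targets = torch.randint(0, 3, (8,))  # class indices

logits = model(x)

# Correct argument order: (logits, targets), not (targets, logits).
loss = criterion(logits, targets)
loss.backward()
```

If probabilities are needed at inference time, torch.softmax(logits, dim=1) can be applied outside the loss computation.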
Hi @dxoigmn, thank you for bringing this issue to our attention. I'm pleased to inform you that we have fixed the bug and released an updated version of the tutorials. You can check the shadow_metric.ipynb and avg_loss_training_alg.ipynb files for the latest version.
If you have any questions or feedback, please let us know. Thank you for your support!