How can I extract the metrics returned from training_step, training_epoch_end, validation_step, validation_epoch_end, test_step, and test_epoch_end after a train() or test() run?
I'd like to return a dictionary (e.g. {'loss': ..., 'log': ..., 'param a': 'a', 'param b': 'b', 'param c': {...}}) from e.g. test_epoch_end and retrieve it after calling trainer.test(net). It seems like the data is available somewhere, as the Weights and Biases logger prints at least the training metrics before uploading. Where can I find those metrics from training and testing?
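One common community workaround for this is to stash the metrics dict on the module itself inside test_epoch_end and read it back after trainer.test() returns. Below is a minimal stand-in sketch with no pytorch_lightning dependency: MyModule and the direct hook call are hypothetical stand-ins for a LightningModule driven by a Trainer, and the attribute name test_results is an arbitrary choice.

```python
class MyModule:
    """Hypothetical stand-in for a LightningModule."""

    def __init__(self):
        self.test_results = None

    def test_epoch_end(self, outputs):
        # Aggregate the per-step outputs and keep a reference on the
        # module so it can be inspected after testing finishes.
        avg_loss = sum(o["loss"] for o in outputs) / len(outputs)
        self.test_results = {"loss": avg_loss, "param a": "a"}
        return self.test_results


net = MyModule()
# In real code the Trainer invokes this hook; we call it directly here.
net.test_epoch_end([{"loss": 0.4}, {"loss": 0.6}])
print(net.test_results["loss"])  # 0.5
```

After trainer.test(net) in real Lightning code, net.test_results would hold whatever the hook stored.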
This seems to be a duplicate; see #1694.
BTW, if you just need postprocessing, I worked around this problem by putting everything (including plotting, I/O, etc.) in test_epoch_end. I don't think it's best practice, but I also couldn't find any method to get the test results.
I just write a file that records what I need, then run another script to process that file. I hope this helps you.
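The record-to-file pattern described above can be sketched with only the standard library: the part simulating test_epoch_end dumps its metrics to a JSON file, and a separate step (standing in for the postprocessing script) reads them back. The function names and file path here are arbitrary choices for illustration.

```python
import json
import os
import tempfile


def record_metrics(metrics, path):
    # Called from inside test_epoch_end: persist whatever you need.
    with open(path, "w") as f:
        json.dump(metrics, f)


def load_metrics(path):
    # Run later, from a separate postprocessing script.
    with open(path) as f:
        return json.load(f)


path = os.path.join(tempfile.gettempdir(), "test_metrics.json")
record_metrics({"loss": 0.123, "param b": "b"}, path)
print(load_metrics(path))  # {'loss': 0.123, 'param b': 'b'}
```

In practice the writing half lives inside the LightningModule's test_epoch_end and the reading half in a standalone script, so the two never need to share a Python process.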
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.