aimet_torch has a feature to freeze activation encodings via load_and_freeze_encodings(), and it works correctly after applying the changes in #2845. A minimal sketch of that flow is shown below.
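A minimal sketch of the aimet_torch flow referred to above, assuming an encodings file previously produced by QuantizationSimModel.export(); the toy model, input shape, and file path are illustrative, and the exact signature of load_and_freeze_encodings() may vary between AIMET versions:

```python
import torch
from aimet_torch.quantsim import QuantizationSimModel

# Toy model and input just to keep the sketch self-contained.
model = torch.nn.Sequential(torch.nn.Conv2d(3, 8, 3), torch.nn.ReLU())
dummy_input = torch.randn(1, 3, 32, 32)

sim = QuantizationSimModel(model, dummy_input=dummy_input)
# Load previously exported parameter AND activation encodings and freeze
# them, so QAT fine-tunes weights without re-learning quantization ranges.
# "exported_model_torch.encodings" is a placeholder path to a file written
# by an earlier QuantizationSimModel.export() call.
sim.load_and_freeze_encodings("exported_model_torch.encodings")
```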
aimet_tensorflow has only set_and_freeze_param_encodings() for freezing parameter encodings; at least, I don't know of a way to freeze activation encodings (see the sketch below). Sometimes it is beneficial to freeze input and output activation encodings, depending on the normalization and the bitwidth used, and these should be handled correctly during QAT. It would be great to have this feature in aimet_tensorflow as well!
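For contrast, a sketch of what the aimet_tensorflow Keras side currently offers, again with a toy model and a placeholder encodings path; only the parameter-freezing call named in this issue is shown, since no activation-side equivalent is known to exist:

```python
import tensorflow as tf
from aimet_tensorflow.keras.quantsim import QuantizationSimModel

# Toy Keras model to keep the sketch self-contained.
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(8, 3, input_shape=(32, 32, 3)),
    tf.keras.layers.ReLU(),
])

sim = QuantizationSimModel(model)
# Only parameter (weight) encodings can be loaded and frozen today;
# activation encodings remain trainable during QAT.
sim.set_and_freeze_param_encodings("exported_model_tf.encodings")
# There is no activation-side counterpart such as load_and_freeze_encodings()
# here, which is exactly what this issue requests.
```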
For my purposes, aimet_tensorflow's Keras implementation is what I need, but of course someone else might need this for the original aimet_tensorflow.