Bug/sg 896 add deprecation for previous breaking changes #1121
Added deprecations and backward compatibility for previous breaking changes.

In this PR:
- `InfiniteSampler` has been removed; added a deprecation that resolves it to the equivalent `DistributedSampler`.
- Broken imports:
  - `from super_gradients.training.utils.optimizers.all_optimizers import OPTIMIZERS`
  - `from super_gradients.training.models import BasicBlock, Bottleneck, HpmStruct`
- `DDRNetCustom`: the init expected a parameter `aux_head` from `arch_params.aux_head`. In the new version of SG, this parameter is `use_aux_heads=arch_params.use_aux_heads`. Added a deprecation.
- `BasicDDRBackBone`: in the old SG version, this block didn't expect `layer3_repeats`; now SG expects it. Also, the DDRNet model now assumes that `arch_params` has a `layer3_repeats` parameter. Fixed by setting a default of 1.
Verified backward compatibility handled outside this PR:

Some were already handled in other PRs; this is the complete list plus where each was handled:
- `ema_params`: they had `exp_activation: True`; changed to `decay_type: exp`. Handled by @BloodAxe in 69a82bc.
- `custom_stdc` name was changed to `stdc_custom`. Handled by @ofrimasad in 190283e.
- `from super_gradients.training.models import make_divisible`: handled by @BloodAxe in 6b5785d.
- `trainer = Trainer(experiment_name=cfg.experiment_name, multi_gpu=cfg.multi_gpu)`: changed to use `setup_device()` instead. This is the only case where we throw an error, to avoid setting devices inside the `Trainer`. f50d438.
- `CustomizableDetector` used to call `super().__init__` only with `arch_params`; it is now changed to accept each parameter separately. Handled by @Louis-Dupont in a3618e8.