
refactor: weight_decay moved to config.hyperparameters section #433

Merged — 1 commit merged into cisco-open:main on Jun 9, 2023

Conversation

GustavBaumgart (Collaborator)

Description

weight_decay was previously a parameter specific to the FedDyn algorithm. However, since it is a common hyperparameter for neural networks, it is now included in the config's hyperparameters section (optional to specify).

This parameter has also been removed from the FedDyn optimizer and the FedDyn regularizer.
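As a rough sketch of how a trainer might consume the new optional entry (all names below are assumed for illustration, not taken from this PR), weight_decay can be read from the hyperparameters section with a fallback of 0.0 when it is absent:

```python
import torch
import torch.nn as nn

# Hypothetical sketch: weight_decay comes from the config's hyperparameters
# section and is optional, so we default to 0.0 when it is not specified.
hyperparameters = {"learningRate": 0.01}  # e.g. parsed from the job config
weight_decay = hyperparameters.get("weightDecay", 0.0)

model = nn.Linear(10, 2)
optimizer = torch.optim.SGD(
    model.parameters(),
    lr=hyperparameters["learningRate"],
    weight_decay=weight_decay,  # standard L2 penalty applied by the optimizer
)
```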

Type of Change

  • Bug Fix
  • New Feature
  • Breaking Change
  • Refactor
  • Documentation
  • Other (please describe)

Checklist

  • I have read the contributing guidelines
  • Existing issues have been referenced (where applicable)
  • I have verified this change is not present in other open pull requests
  • Functionality is documented
  • All code style checks pass
  • New code contribution is covered by automated tests
  • All new and existing tests pass

codecov-commenter commented Jun 8, 2023

Codecov Report

Merging #433 (844831f) into main (11cc3b8) will not change coverage.
The diff coverage is n/a.


@@           Coverage Diff           @@
##             main     #433   +/-   ##
=======================================
  Coverage   14.87%   14.87%           
=======================================
  Files          48       48           
  Lines        2830     2830           
=======================================
  Hits          421      421           
  Misses       2380     2380           
  Partials       29       29           

@jaemin-shin (Collaborator) left a comment

LGTM, except for a few minor suggested changes. Glad the formatter is working, but it was also a bit difficult to identify the actually changed logic in this PR. Maybe we should run the formatter over the whole library in another PR.

@@ -21,7 +21,7 @@
from flame.common.constants import TrainState
from diskcache import Cache
from ..common.typing import ModelWeights

Suggested change:
- from ..common.typing import ModelWeights
+ from flame.common.typing import ModelWeights

@@ -21,7 +21,7 @@
from flame.common.constants import TrainState
from diskcache import Cache
from ..common.typing import ModelWeights
- from ..common.util import (MLFramework, get_ml_framework_in_use)
+ from ..common.util import MLFramework, get_ml_framework_in_use

Suggested change:
- from ..common.util import MLFramework, get_ml_framework_in_use
+ from flame.common.util import MLFramework, get_ml_framework_in_use
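Both suggestions apply the same style point: absolute imports spell out the package path, so they keep resolving if a module is moved or run outside its package context. A minimal illustration (not part of this PR's diff):

```python
# Relative import: resolved against this module's position inside the
# package, so it breaks if the file is relocated or run as a script.
# from ..common.typing import ModelWeights

# Absolute import: names the flame package explicitly and resolves the
# same way from anywhere in the codebase.
from flame.common.typing import ModelWeights
```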

@jaemin-shin (Collaborator) left a comment

LGTM!

@GustavBaumgart merged commit 4a5f28c into cisco-open:main on Jun 9, 2023
3 participants