
replace the AdagradOptimizer 、adamaxOptimizer、AdadeltaOptimizer、RMSPropOptimizer、LambOptimizer and Momentum #54152

Merged · 20 commits · Jul 11, 2023

Conversation

@longranger2 (Contributor) commented May 28, 2023

PR types

Others

PR changes

APIs

Description

  • remove the AdagradOptimizer in paddle/fluid/optimizer.py and use paddle/optimizer/adagrad.py to replace it
  • remove the AdamaxOptimizer in paddle/fluid/optimizer.py and use paddle/optimizer/adamax.py to replace it
  • remove the AdadeltaOptimizer in paddle/fluid/optimizer.py and use paddle/optimizer/adadelta.py to replace it
  • remove the RMSPropOptimizer in paddle/fluid/optimizer.py and use paddle/optimizer/rmsprop.py to replace it
  • remove the LambOptimizer in paddle/fluid/optimizer.py and use paddle/optimizer/lamb.py to replace it
  • remove the Momentum in paddle/fluid/contrib/optimizer.py and use python/paddle/optimizer/momentum.py to replace it (a migration sketch follows this list)
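
A minimal migration sketch (illustrative only, not taken from the PR diff): in dygraph mode the removed fluid optimizers took a parameter_list= keyword, while the paddle.optimizer 2.0 classes take parameters= and are driven with step() / clear_grad(). Adadelta is used here as a representative example; the values are placeholders.

import paddle

model = paddle.nn.Linear(2, 2)

# Before (removed by this PR):
# optimizer = paddle.fluid.optimizer.Adadelta(
#     learning_rate=0.001, parameter_list=model.parameters())

# After, using the paddle.optimizer 2.0 API:
optimizer = paddle.optimizer.Adadelta(
    learning_rate=0.001, parameters=model.parameters())

x = paddle.randn((2, 2))
loss = model(x).mean()
loss.backward()
optimizer.step()        # apply the Adadelta update
optimizer.clear_grad()  # reset gradients for the next iteration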

@paddle-bot commented May 28, 2023

Your PR has been submitted. Thanks for your contribution!
Please wait for the CI results first. See the Paddle CI Manual for details.

@paddle-bot added the labels contributor (External developers) and status: proposed on May 28, 2023
@paddle-bot commented May 28, 2023

✅ This PR's description meets the template requirements!
Please wait for other CI results.

@longranger2 longranger2 changed the title replace the AdadeltaOptimizer、RMSPropOptimizer、LambOptimizer and Momentum replace the AdadeltaOptimizer、RMSPropOptimizer、LambOptimizer、AdamOptimizer and Momentum Jun 9, 2023
@paddle-ci-bot commented Jun 28, 2023

Sorry to inform you that it has been more than 7 days since the CIs for 26a1caf passed. To prevent PR conflicts, please re-run all CIs manually.

@longranger2 longranger2 changed the title replace the AdadeltaOptimizer、RMSPropOptimizer、LambOptimizer、AdamOptimizer and Momentum replace the AdagradOptimizer 、adamaxOptimizer、AdadeltaOptimizer、RMSPropOptimizer、LambOptimizer and Momentum Jun 29, 2023
@@ -249,7 +249,7 @@ def test_nesterov_momentum_optimizer(self):


class TestAdagradOptimizer(unittest.TestCase):
class MockAdagrad(optimizer.AdagradOptimizer):

Contributor:

These unit tests all target fluid.Optimizer. The 2.0 Optimizer already has its own unit tests (test_xxx_api / test_xxx_op), so the suggestion here is to delete the corresponding test cases rather than converting the old-optimizer tests into tests of the new optimizers.

Same below.

@longranger2 (Contributor, Author) replied Jul 3, 2023:

Fixed 👌

@@ -410,9 +410,9 @@ def dygraph_adadelta_mp(self, use_amp, mp):
paddle.set_device('gpu')
input = paddle.randn((2, 2))
model = paddle.nn.Linear(2, 2)
optimizer = paddle.fluid.optimizer.Adadelta(

Contributor:

In these test_xxx_op.py unit-test files, most currently contain pairs such as TestAdadeltaMultiPrecision1_0 & TestAdadeltaMultiPrecision2_0, which test paddle.fluid.Optimizer and paddle.Optimizer respectively.

The current change replaces the 1.0-version tests with 2.0 ones, and the only difference from the existing 2.0-version tests is in the initialization values. Since this does not cover any additional scenarios, and the class names become misleading (named 1_0 but testing the 2.0 API), the suggestion is to delete these unit tests altogether.
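
For context, a rough sketch of the dygraph multi-precision pattern these tests exercise, written against the 2.0 API (assumptions: a GPU device and the paddle.amp auto_cast / GradScaler interfaces; this is illustrative, not code from the PR):

import paddle

paddle.set_device('gpu')
model = paddle.nn.Linear(2, 2)
optimizer = paddle.optimizer.Adadelta(
    learning_rate=0.001, parameters=model.parameters())
scaler = paddle.amp.GradScaler(init_loss_scaling=1024)

input = paddle.randn((2, 2))
with paddle.amp.auto_cast():             # mixed-precision forward pass
    loss = model(input).mean()
scaled = scaler.scale(loss)              # scale the loss to avoid fp16 underflow
scaled.backward()
scaler.minimize(optimizer, scaled)       # unscale gradients and apply the update
optimizer.clear_grad()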

@@ -408,7 +408,7 @@ def static_adagrad_mp(self, use_amp, mp):
exe = paddle.static.Executor('gpu')
train_program = paddle.static.Program()
startup_program = paddle.static.Program()
optimizer = paddle.fluid.optimizer.Adagrad(learning_rate=0.001)

Contributor:

test_adadelta_op.py: suggest removing this unit test.

@@ -404,8 +404,8 @@ def dygraph_adamax_mp(self, use_amp, mp):
paddle.set_device('gpu')
input = paddle.randn((2, 2))
model = paddle.nn.Linear(2, 2)
optimizer = paddle.fluid.optimizer.Adamax(
learning_rate=0.001, parameter_list=model.parameters()
optimizer = paddle.optimizer.Adamax(

Contributor:

test_adadelta_op.py: suggest removing this unit test.

@longranger2 (Contributor, Author):

Fixed 👌

@longranger2 (Contributor, Author) commented Jul 11, 2023

@zoooo0820 Could you please review this? Thanks~

@zoooo0820 (Contributor) left a comment:

LGTM

@jeff41404 (Contributor) left a comment:

LGTM

@risemeup1 (Contributor) left a comment:

LGTM

@XiaoguangHu01 (Contributor) left a comment:

LGTM

@jeff41404 merged commit 9436585 into PaddlePaddle:develop on Jul 11, 2023
26 of 27 checks passed
cqulilujia pushed a commit to cqulilujia/Paddle that referenced this pull request Jul 24, 2023
…opOptimizer、LambOptimizer and Momentum (PaddlePaddle#54152)

* replace the AdadeltaOptimizer with Adadelta

* replace the RMSPropOptimizer with RMSProp

* replace the LambOptimizer with lamb

* replace the momentum in contrib/optimizer.py with Momentum in python/paddle/optimizer/momentum.py

* fix bug

* fix bug

* fix bug

* fix bug of Lamb

* fix bug of Lamb

* fix bug of import

* replace the AdamaxOptimizer with Adamax and change the optimizer base for AdagradOptimizer

* fix bug

* fix bug

* Update optimizer.py

* fix bug

* fix bug
wz1qqx pushed a commit to wz1qqx/Paddle that referenced this pull request Jul 31, 2023
…opOptimizer、LambOptimizer and Momentum (PaddlePaddle#54152)

Labels: contributor (External developers)
Projects: none yet
Linked issues that merging this pull request may close: none yet
Participants: 7