
ValueError: optimizer got an empty parameter list #26

Open
Romitavia opened this issue Feb 8, 2021 · 8 comments

Comments

@Romitavia

When I run train.py, I get `ValueError: optimizer got an empty parameter list` at the line `optimizer = optim.Adam(filter(lambda p: p.requires_grad, model.parameters()), lr=cfg.LR)`. What is causing this?
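This error is raised whenever the iterable handed to the optimizer yields no tensors, i.e. every parameter has `requires_grad = False`. A minimal sketch reproducing it (the `nn.Linear` model is only a stand-in for the repo's network):

```python
import torch.nn as nn
import torch.optim as optim

model = nn.Linear(10, 2)

# Freezing every parameter leaves nothing for the optimizer to update
for param in model.parameters():
    param.requires_grad = False

trainable = [p for p in model.parameters() if p.requires_grad]
print(len(trainable))  # 0

try:
    optimizer = optim.Adam(trainable, lr=1e-3)
    raised = False
except ValueError as exc:
    raised = True
    print(exc)  # optimizer got an empty parameter list
```

So the filter itself is fine; the question is why every parameter in the model ended up frozen before the optimizer was built.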

@Aruen24

Aruen24 commented Mar 5, 2021

I am hitting the same error. Did you solve it?

@Romitavia
Author

Romitavia commented Mar 6, 2021 via email

@kk701710

kk701710 commented Apr 1, 2021

> I am hitting the same error. Did you solve it?

Brother, did you manage to solve this problem?

@Romitavia
Author

Romitavia commented Apr 1, 2021 via email

@kk701710

kk701710 commented Apr 1, 2021

Yes, I solved it, but I forget exactly which part of the code the error was in. How about we connect over QQ?


My QQ is 598369667, add me.

@Romitavia
Author

Romitavia commented Apr 1, 2021 via email

@Romitavia
Author

Romitavia commented Apr 1, 2021 via email

@caixh39

caixh39 commented Apr 14, 2021

##### build the network model
if not cfg.RESUME_EPOCH:
    print('****** Training {} ****** '.format(cfg.model_name))
    print('****** loading the Imagenet pretrained weights ****** ')
    if not cfg.model_name.startswith('efficientnet'):
        model = cfg.MODEL_NAMES[cfg.model_name](num_classes=cfg.NUM_CLASSES)
        # freeze the first few layers so they are not trained
        ct = 0
        for child in model.children():
            ct += 1
            # print(child)
            if ct < 8:
                print(child)
                for param in child.parameters():
                    param.requires_grad = False
In the train.py file, I changed `param.requires_grad = False` to `param.requires_grad = True`, and the problem was solved.
