
Weights not used when initializing BertModel #118

Open
LeoDalcegio opened this issue Dec 26, 2021 · 1 comment

Comments

@LeoDalcegio

I am receiving the following warning. Is this normal and expected, or am I doing something wrong?

The version of the transformers module and other dependencies are the same as in requirements.txt.

I just want to know if I should be worried about this warning.

Some weights of the model checkpoint at bert-large-uncased were not used when initializing BertModel: ['cls.predictions.transform.LayerNorm.bias', 'cls.predictions.transform.dense.weight', 'cls.seq_relationship.weight', 'cls.predictions.bias', 'cls.predictions.transform.dense.bias', 'cls.predictions.transform.LayerNorm.weight', 'cls.predictions.decoder.weight', 'cls.seq_relationship.bias']

  • This IS expected if you are initializing BertModel from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).

  • This IS NOT expected if you are initializing BertModel from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).
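The weights the warning lists all belong to BERT's pretraining heads: `cls.predictions.*` is the masked-language-modeling head and `cls.seq_relationship.*` is the next-sentence-prediction head. A bare `BertModel` is just the encoder and declares no `cls.*` parameters, so `from_pretrained` skips those checkpoint entries and reports them as unused. A minimal sketch of that key matching with plain sets (hypothetical, simplified key names; this is not the actual transformers internals):

```python
# Sketch of why from_pretrained reports unused weights (hypothetical,
# simplified; not the real transformers loading code). The
# bert-large-uncased checkpoint was saved from a pretraining model, so
# it contains the MLM head ("cls.predictions.*") and the NSP head
# ("cls.seq_relationship.*") in addition to the encoder weights.

checkpoint_keys = {
    "bert.embeddings.word_embeddings.weight",
    "bert.encoder.layer.0.attention.self.query.weight",
    "cls.predictions.bias",            # MLM head
    "cls.predictions.decoder.weight",  # MLM head
    "cls.seq_relationship.weight",     # NSP head
    "cls.seq_relationship.bias",       # NSP head
}

# What a bare BertModel (encoder only) actually declares:
model_keys = {
    "embeddings.word_embeddings.weight",
    "encoder.layer.0.attention.self.query.weight",
}

# Checkpoint keys that match no model parameter are reported as unused.
loadable = {k.removeprefix("bert."): k for k in checkpoint_keys}
unused = sorted(orig for stripped, orig in loadable.items()
                if stripped not in model_keys)
print(unused)  # the four cls.* head parameters
```

So if you only need sentence embeddings, the warning is benign; the heads are intentionally discarded. If you actually need the MLM/NSP heads, load `BertForPreTraining` instead of `BertModel`. To my knowledge the warning can also be silenced with `transformers.logging.set_verbosity_error()`, though that hides other warnings too.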

@koolgax99

Hey @LeoDalcegio, is this issue solved for you? I am getting the same error.

cc @dmmiller612, could you kindly look into this issue?
