
The embedding matrix is re-initialized every time a batch_size of data is fed #12

Open
sadxiaohu opened this issue Jul 22, 2019 · 0 comments

Comments

@sadxiaohu

Hello, and thank you for providing this Python implementation that embeds the pretrained BERT model. I've been reading your code recently and noticed something: every time input_ids are fed to the bert_layer, the embedding_table is re-initialized, so the same character in different sentences is very likely to get a different initialization vector. Wouldn't it be more appropriate to fix the random seed here? I'm a BERT beginner, so any pointers would be appreciated.
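A minimal NumPy sketch of the concern described above (not the repository's actual code; `make_embedding_table` and its sizes are hypothetical): if the embedding table is recreated for every batch without a fixed seed, the same token id maps to a different vector each time, whereas creating the table once, or fixing the seed, keeps the mapping stable.

```python
import numpy as np

def make_embedding_table(vocab_size, dim, seed=None):
    # Hypothetical stand-in for an embedding initializer
    # (BERT uses a truncated normal with stddev 0.02).
    rng = np.random.default_rng(seed)
    return rng.normal(0.0, 0.02, size=(vocab_size, dim))

# Re-initializing per batch: token 42 gets a different vector each batch.
vec_batch1 = make_embedding_table(100, 8)[42]
vec_batch2 = make_embedding_table(100, 8)[42]
print(np.allclose(vec_batch1, vec_batch2))  # False

# Fixing the seed makes the re-created tables identical across batches.
vec_seeded1 = make_embedding_table(100, 8, seed=0)[42]
vec_seeded2 = make_embedding_table(100, 8, seed=0)[42]
print(np.allclose(vec_seeded1, vec_seeded2))  # True
```

Note that a fixed seed only makes the initialization repeatable; for the vectors to carry pretrained knowledge, the table would still need to be created once and loaded from the BERT checkpoint rather than re-initialized per batch.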
