
Question: how long did your LSTM word-segmentation model take to train? #1

Open

Zhangzirui opened this issue Oct 25, 2016 · 1 comment

@Zhangzirui

I tried testing with your program. My understanding is: the document is first split into sentences, one per line, and each sentence is then segmented again with a window of length 5. Every length-5 window array is run through the LSTM model, and each run takes 500 epochs with 20,000 iterations per epoch. I found this runs far too slowly, even on a GPU.

I saw that your Baidu Cloud share includes an already-trained model. I'd like to know how long that model took to train, and whether my understanding above is correct. I hope you can clear this up for me, thanks.
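One plausible reading of the windowing described in the comment above is that each character becomes the center of a length-5 context window (with padding at the sentence edges), and each window is one input sample for the model. This is only a hedged sketch of that interpretation, not the repository's actual code; the function name and `<PAD>` token are illustrative.

```python
# Hypothetical sketch: turn each character of a sentence into the
# center of a length-5 context window, padding at the edges, so that
# every character yields exactly one window (one model input sample).

def char_windows(sentence, size=5, pad="<PAD>"):
    """Yield a length-`size` character window centered on each character."""
    half = size // 2
    chars = [pad] * half + list(sentence) + [pad] * half
    for i in range(len(sentence)):
        yield chars[i:i + size]

for w in char_windows("研究生命起源"):
    print(w)
```

Under this reading, the model is trained once over all windows rather than per window, so the per-window cost the questioner describes would not multiply by 500 epochs for every window.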

@qiaofei32
Owner

@Zhangzirui It trained for one day, and your understanding is correct. The approach in this project is essentially character tagging + classification.
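The "character tagging" the owner refers to is commonly the BMES scheme, in which word segmentation is reduced to per-character classification: each character is tagged as the Begin, Middle, or End of a multi-character word, or as a Single-character word. A minimal sketch of that label encoding (the function name is illustrative, not from the repository):

```python
# Hypothetical sketch of BMES character tagging: convert a gold
# segmentation (list of words) into one B/M/E/S tag per character,
# so a classifier can predict tags instead of word boundaries directly.

def bmes_tags(words):
    """Return a BMES tag for every character of the segmented words."""
    tags = []
    for w in words:
        if len(w) == 1:
            tags.append("S")          # single-character word
        else:
            # first char is B, last is E, any interior chars are M
            tags.extend(["B"] + ["M"] * (len(w) - 2) + ["E"])
    return tags

print(bmes_tags(["研究", "生命", "起源"]))  # one tag per character
```

At prediction time the tag sequence is decoded back into words: a word starts at each `B` or `S` and ends at the following `E` or at that same `S`.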
