[Question] Is there any chance to use more pre-trained embeddings like RoBERTa or ALBERT?
question
#439 opened Dec 2, 2020 by wuling31715
[Question] The input sequence length for Bert-BiLSTM-CRF, and saving the best model
question
#436 opened Nov 29, 2020 by SharpKoi
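Kashgari is built on tf.keras, so both parts of this question can be sketched with standard Keras machinery. In the sketch below, the sequence_length keyword, the BERTEmbedding path argument, and the fit signature are assumptions to verify against the installed Kashgari version; ModelCheckpoint(save_best_only=True) is the standard Keras way to keep only the best weights.

    # Hedged sketch, not verified against a specific Kashgari release.
    from tensorflow.keras.callbacks import ModelCheckpoint
    from kashgari.embeddings import BERTEmbedding          # class name varies by version
    from kashgari.tasks.labeling import BiLSTM_CRF_Model

    # Tiny placeholder data so the sketch is self-contained.
    train_x = [["I", "live", "in", "Berlin"]]
    train_y = [["O", "O", "O", "B-LOC"]]
    valid_x, valid_y = train_x, train_y

    # sequence_length here is an assumed keyword controlling truncation/padding.
    embedding = BERTEmbedding("path/to/bert", sequence_length=128)
    model = BiLSTM_CRF_Model(embedding)

    # Standard Keras callback: keeps only the checkpoint with the best validation loss.
    checkpoint = ModelCheckpoint("best_model.h5", save_best_only=True)
    model.fit(train_x, train_y, valid_x, valid_y, callbacks=[checkpoint])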
[Feature request] In NER, the keys of the idx2label dictionary saved in model_info.json differ between one training run and the next; could they be made consistent for easier use?
enhancement
#434 opened Nov 19, 2020 by XuanYang1991
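The inconsistency described above usually comes from numbering labels in a nondeterministic order, such as iterating over an unsorted set. A minimal, library-independent sketch of a deterministic mapping:

    # Sorting the label set before numbering makes idx2label stable across runs.
    def build_idx2label(sequences):
        labels = sorted({label for seq in sequences for label in seq})
        return {idx: label for idx, label in enumerate(labels)}

    idx2label = build_idx2label([["B-PER", "O"], ["O", "B-LOC"]])
    # -> {0: 'B-LOC', 1: 'B-PER', 2: 'O'}, identical on every run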
Can I get a word embedding from Kashgari, like w2v[word] in gensim?
enhancement
#426 opened Oct 22, 2020 by mashagua
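For reference, the gensim pattern this issue points at is a plain per-token vector lookup, shown below with an illustrative vectors file; whether Kashgari exposes an equivalent lookup depends on the embedding class in use.

    # The w2v[word] pattern from gensim that the issue refers to.
    from gensim.models import KeyedVectors

    w2v = KeyedVectors.load_word2vec_format("vectors.bin", binary=True)  # illustrative path
    vector = w2v["word"]  # numpy array of shape (embedding_dim,)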
How should I format the dataset to pass the document or sentence information?
question
wontfix
#419 opened Sep 17, 2020 by Syauri
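For context, Kashgari's labeling tasks take parallel lists of token sequences and label sequences, so document and sentence boundaries have to be encoded in how those lists are split. A minimal sketch of the expected shape (exact requirements should be checked against the docs for your version):

    # One inner list per sentence; x and y must stay aligned token-for-token.
    train_x = [["I", "live", "in", "Berlin", "."],
               ["Kashgari", "is", "handy", "."]]
    train_y = [["O", "O", "O", "B-LOC", "O"],
               ["B-ORG", "O", "O", "O"]]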
[Question] How do I design my own Embedding? Is there any documentation?
question
#412 opened Sep 3, 2020 by Igoslow
Different embedding sequence warning when I load a model
pinned
question
#324 opened Jan 14, 2020 by daizhonghao
[Feature request] Using XLM pretrained models
enhancement
pinned
#282 opened Oct 24, 2019 by echan00
How to train on limited RAM when I have >10 million documents
pinned
question
#273 opened Oct 19, 2019 by allhelllooz
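A common approach to this situation is to stream batches from disk with a generator so only one batch lives in RAM at a time. A hedged, framework-agnostic sketch, where the tab-separated file format handled by parse_line is a hypothetical assumption:

    # Yields (tokens, labels) batches lazily; only one batch is in RAM at a time.
    def parse_line(line):
        # Hypothetical format: space-delimited tokens, a tab, space-delimited labels.
        tokens, labels = line.rstrip("\n").split("\t")
        return tokens.split(), labels.split()

    def batch_generator(path, batch_size=64):
        while True:  # loop forever; bound each epoch with a steps-per-epoch count
            with open(path, encoding="utf-8") as f:
                batch_x, batch_y = [], []
                for line in f:
                    tokens, labels = parse_line(line)
                    batch_x.append(tokens)
                    batch_y.append(labels)
                    if len(batch_x) == batch_size:
                        yield batch_x, batch_y
                        batch_x, batch_y = [], []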
[BUG] build_tpu_model is not supported for bi-lstm-crf
bug
pinned
#175 opened Jul 24, 2019 by huzujun

