language-model
Chooses 15% of tokens
The paper states: "Instead, the training data generator chooses 15% of tokens at random, e.g., in the sentence my dog is hairy it chooses hairy."
This reads as if exactly 15% of the tokens are always chosen. However, in https://github.com/codertimo/BERT-pytorch/blob/master/bert_pytorch/dataset/dataset.py#L68, every single token independently has a 15% chance of going through the follow-up masking procedure, so the fraction of tokens actually selected varies around 15% rather than being fixed.
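To make the distinction concrete, here is a minimal sketch (illustrative only, not the repository's code) contrasting the two interpretations; the token list and the 0.15 rate are placeholders:

```python
import random

tokens = ["my", "dog", "is", "hairy"] * 25  # 100 placeholder tokens

# Interpretation in dataset.py#L68: each token independently has a 15%
# chance of being selected, so the selected count varies per sentence.
bernoulli_selected = [t for t in tokens if random.random() < 0.15]

# Literal reading of the paper: exactly 15% of token positions are
# chosen at random, without replacement.
k = int(len(tokens) * 0.15)
exact_selected = random.sample(tokens, k)

print(len(bernoulli_selected))  # varies around 15 from run to run
print(len(exact_selected))      # always exactly 15
```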
PositionalEmbedding
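For context, a minimal sketch of a sinusoidal positional embedding module in the standard formulation from Vaswani et al., 2017; whether this matches the repository's exact module is an assumption:

```python
import math
import torch
import torch.nn as nn

class PositionalEmbedding(nn.Module):
    """Sinusoidal positional embedding (assumes d_model is even)."""

    def __init__(self, d_model: int, max_len: int = 512):
        super().__init__()
        pe = torch.zeros(max_len, d_model)
        position = torch.arange(0, max_len, dtype=torch.float).unsqueeze(1)
        div_term = torch.exp(
            torch.arange(0, d_model, 2).float() * (-math.log(10000.0) / d_model)
        )
        pe[:, 0::2] = torch.sin(position * div_term)  # even dimensions
        pe[:, 1::2] = torch.cos(position * div_term)  # odd dimensions
        self.register_buffer("pe", pe.unsqueeze(0))   # (1, max_len, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model); return the matching positional slice
        return self.pe[:, : x.size(1)]
```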
Is your feature request related to a problem? Please describe.
For a fast proof of concept, users need to try Haystack on their own documents. While this is possible at the code level (e.g., Tutorial 1), in many cases an additional UI would be helpful so that users can interact with the model more easily and show it to colleagues.
Describe the solution you'd like
A minimal UI built with …
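The issue text is truncated above, but as an illustration of what such a minimal UI could look like, here is a sketch (not the issue's proposed solution); Streamlit is one assumption for a lightweight UI framework, and run_pipeline is a hypothetical stand-in for a real Haystack pipeline call:

```python
import streamlit as st

def run_pipeline(query: str) -> list[str]:
    # Placeholder: swap in a real Haystack extractive-QA pipeline here,
    # e.g. answers from pipeline.run(query=query).
    return [f"(dummy answer for: {query!r})"]

st.title("Haystack demo")
query = st.text_input("Ask a question about your documents")

if query:
    for answer in run_pipeline(query):
        st.write(answer)
```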
Bart is a seq2seq model, but there might be applications where one would like to use only the pre-trained BartDecoder in an EncoderDecoder setting with a "long" encoder, such as …
This is already p…
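A rough sketch of the idea, under several assumptions: that transformers exposes the pretrained decoder via BartModel.get_decoder(), that BartDecoder accepts encoder_hidden_states from an arbitrary encoder, and that the two checkpoints share a hidden size (the checkpoint names here are illustrative, not the issue's concrete proposal):

```python
import torch
from transformers import (
    BartModel,
    BartTokenizer,
    LongformerModel,
    LongformerTokenizer,
)

# A "long" encoder (hidden size 768) paired with bart-base's decoder (also 768).
enc_tok = LongformerTokenizer.from_pretrained("allenai/longformer-base-4096")
encoder = LongformerModel.from_pretrained("allenai/longformer-base-4096")

dec_tok = BartTokenizer.from_pretrained("facebook/bart-base")
decoder = BartModel.from_pretrained("facebook/bart-base").get_decoder()

with torch.no_grad():
    enc_inputs = enc_tok("a very long document ...", return_tensors="pt")
    enc_hidden = encoder(**enc_inputs).last_hidden_state

    dec_inputs = dec_tok("summary so far", return_tensors="pt")
    dec_out = decoder(
        input_ids=dec_inputs["input_ids"],
        encoder_hidden_states=enc_hidden,  # cross-attend to the long encoder
        encoder_attention_mask=enc_inputs["attention_mask"],
    )

print(dec_out.last_hidden_state.shape)  # (1, dec_len, 768)
```

Since the decoder's cross-attention weights were trained against BART's own encoder, a pairing like this would still need end-to-end fine-tuning before it produces useful outputs.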