natural-language-processing
Natural language processing (NLP) is a field of computer science concerned with the interactions between computers and human (natural) language. In the 1950s, Alan Turing published an article proposing a measure of machine intelligence, now called the Turing test. More recent techniques, such as deep learning, have produced state-of-the-art results in language modeling, parsing, and many other natural-language tasks.
Here are 6,840 public repositories matching this topic...
Change tensor.data to tensor.detach() due to pytorch/pytorch#6990 (comment): tensor.detach() is more robust than tensor.data.
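For illustration, a minimal PyTorch sketch (not from the linked thread) of the behavioral difference that makes .detach() the safer choice:

```python
import torch

x = torch.ones(3, requires_grad=True)
y = (x ** 2).sum()  # backward() needs the original values of x

# x.data shares x's storage but bypasses autograd's version tracking:
# an in-place change through it goes unnoticed, and backward() would
# silently compute gradients from the modified values.

# .detach() also shares storage but keeps the version counter, so the
# same in-place change is caught at backward time.
x.detach().add_(1.0)
y.backward()  # RuntimeError: variable needed for gradient was modified
```

With x.data.add_(1.0) instead of the .detach() line, the same script runs to completion but silently returns wrong gradients.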
Not a high priority at all, but it'd be more sensible for such a tutorial/testing utility corpus to be implemented elsewhere, maybe under /test/ or some other data- or doc-related module, rather than in gensim.models.word2vec.
Originally posted by @gojomo in RaRe-Technologies/gensim#2939 (comment)
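For context, a minimal sketch of how gensim's existing tiny test corpus is typically consumed (gensim 3.x keyword names assumed; 4.0 renames size to vector_size):

```python
from gensim.test.utils import common_texts  # tiny built-in toy corpus
from gensim.models import Word2Vec

# Train a throwaway model on the toy corpus; such utility corpora live
# under gensim.test, which is the kind of placement suggested above.
model = Word2Vec(sentences=common_texts, size=100, window=5, min_count=1)
print(model.wv.most_similar("computer", topn=3))
```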
more details at: allenai/allennlp#2264 (comment)
Hello spoooopyyy hackers
This is a Hacktoberfest-only issue!
This is also data-sciency!
The Problem
Our English dictionary contains words that aren't English, and does not contain common English words.
Examples of non-common words in the dictionary:
"hlithskjalf",
"hlorrithi",
"hlqn",
"hm",
"hny",
"ho",
"hoactzin",
"hoactzine",
Created by Alan Turing (Wikipedia)
Bart is a seq2seq model, but there might be applications where one would like to use only the pre-trained BartDecoder in an EncoderDecoder setting with a "long" encoder.
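A minimal sketch with Hugging Face transformers of extracting just the pre-trained decoder (the checkpoint choice and the pairing idea are assumptions, not from the issue):

```python
from transformers import BartModel

# Load the full pre-trained seq2seq model, then keep only its decoder.
bart = BartModel.from_pretrained("facebook/bart-base")
decoder = bart.get_decoder()  # a BartDecoder with pre-trained weights

# The decoder can then be wired to a custom "long" encoder by passing
# that encoder's outputs as cross-attention states, roughly:
# out = decoder(input_ids=..., encoder_hidden_states=long_encoder_out)
```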