The Wayback Machine - https://web.archive.org/web/20200612105412/https://github.com/topics/word-embeddings

word-embeddings

Here are 518 public repositories matching this topic...

gensim
chashimo
chashimo commented Mar 17, 2020

I tried selecting hyperparameters for my model following "Tutorial 8: Model Tuning" below:
https://github.com/flairNLP/flair/blob/master/resources/docs/TUTORIAL_8_MODEL_OPTIMIZATION.md

Although I got the "param_selection.txt" file in the result directory, I am not sure how to interpret it, i.e., which parameter combination to use. At the bottom of the file, I found "
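Conceptually, a tuning run like this evaluates a series of parameter combinations and records a loss for each; the combination to use is the one with the lowest recorded loss. The sketch below illustrates that selection step in plain Python with made-up parameter names and loss values (it does not parse flair's actual "param_selection.txt" format):

```python
# Hedged sketch: reducing a tuner's log of evaluated runs to the best
# combination. The parameters and losses are illustrative, not real output.
evaluated_runs = [
    {"learning_rate": 0.05, "hidden_size": 64,  "loss": 0.412},
    {"learning_rate": 0.10, "hidden_size": 64,  "loss": 0.388},
    {"learning_rate": 0.10, "hidden_size": 128, "loss": 0.401},
]

# The combination to use is the one with the lowest loss.
best = min(evaluated_runs, key=lambda run: run["loss"])
print(best)  # -> {'learning_rate': 0.1, 'hidden_size': 64, 'loss': 0.388}
```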

Luke-in-the-sky
Luke-in-the-sky commented Jun 3, 2018

Hi there,

I think there might be a mistake in the documentation. The Understanding Scaled F-Score section says

The F-Score of these two values is defined as:

$$ \mathcal{F}_\beta(\mbox{prec}, \mbox{freq}) = (1 + \beta^2) \frac{\mbox{prec} \cdot \mbox{freq}}{\beta^2 \cdot \mbox{prec} + \mbox{freq}}. $$

$\beta \in \mathcal{R}^+$ is a scaling factor where frequency is favored if $\beta
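The quoted formula translates directly into code. Here is a minimal Python version of that F-score (the function name is mine, not from the library); with beta = 1 it reduces to the ordinary harmonic mean, and larger beta weights the frequency term more heavily:

```python
def scaled_f_score(prec: float, freq: float, beta: float = 1.0) -> float:
    """F-beta combination of precision and frequency, per the quoted formula:
    (1 + beta^2) * prec * freq / (beta^2 * prec + freq)."""
    return (1 + beta**2) * prec * freq / (beta**2 * prec + freq)

# beta = 1: ordinary harmonic mean of the two values.
print(scaled_f_score(0.5, 0.5))  # -> 0.5

# beta > 1 pulls the score toward the frequency term.
print(round(scaled_f_score(0.9, 0.1, beta=2.0), 4))  # -> 0.1216
```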

icecity96
icecity96 commented Mar 6, 2018

OS: MacOS 10.13.3
I installed MeTA as instructed in the setup guide. When I run the unit tests:

describe [ranker regression]
libc++abi.dylib: terminating with uncaught exception of type meta::corpus::corpus_exception: corpus configuration file (../data//cranfield/line.toml) not present
zhoulumei
zhoulumei commented Oct 22, 2019

ERROR: tensorflow 1.15.0 has requirement numpy<2.0,>=1.16.0, but you'll have numpy 1.14.6 which is incompatible.
ERROR: spacy 2.1.8 has requirement numpy>=1.15.0, but you'll have numpy 1.14.6 which is incompatible.
ERROR: mxnet-cu92 1.5.1.post0 has requirement numpy<2.0.0,>1.16.0, but you'll have numpy 1.14.6 which is incompatible.
ERROR: imgaug 0.2.9 has requirement numpy>=1.15.0, but you'll have numpy 1.14.6 which is incompatible.
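Each of these errors says the same thing: the installed numpy 1.14.6 fails the lower bound that the other packages require. The check itself is just a version comparison, which a minimal sketch (real resolvers such as pip use the `packaging` library rather than this hand-rolled parser) can illustrate:

```python
def version_tuple(v: str) -> tuple:
    """Parse a simple dotted version like '1.14.6' into a comparable tuple."""
    return tuple(int(part) for part in v.split("."))

installed = "1.14.6"
# tensorflow 1.15.0 requires numpy >=1.16.0,<2.0, which 1.14.6 fails:
satisfies = version_tuple("1.16.0") <= version_tuple(installed) < version_tuple("2.0")
print(satisfies)  # -> False
```

The usual fix is to upgrade numpy into the required range, e.g. `pip install "numpy>=1.16.0,<2.0"`.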

giacbrd
giacbrd commented Oct 17, 2016

The fastText supervised model does not take the document and word representations into account; it just learns bags of words and labels. Embeddings are computed only on the word->label relation. It would be interesting to jointly learn the semantic relation label<->document<->word<->context. For now it is only possible to pre-train word embeddings and then use them as initial vectors for the classifier.
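That last workaround, initializing a bag-of-words classifier from pre-trained vectors, can be sketched generically in numpy (the vocabulary and vector values below are hypothetical, and this is not fastText's own API):

```python
import numpy as np

# Hypothetical pre-trained word vectors (dim 3), e.g. loaded from a .vec file.
pretrained = {
    "good":  np.array([0.9, 0.1, 0.0]),
    "bad":   np.array([-0.8, 0.2, 0.1]),
    "movie": np.array([0.0, 0.5, 0.5]),
}

vocab = sorted(pretrained)
word_to_idx = {w: i for i, w in enumerate(vocab)}

# Initialize the classifier's embedding matrix from the pre-trained vectors
# instead of random values; supervised training would then fine-tune these rows.
embedding_matrix = np.stack([pretrained[w] for w in vocab])

def doc_vector(tokens):
    """Bag-of-words document representation: the average of its word embeddings."""
    rows = [embedding_matrix[word_to_idx[t]] for t in tokens if t in word_to_idx]
    return np.mean(rows, axis=0)

print(doc_vector(["good", "movie"]))  # -> [0.45 0.3  0.25]
```

In fastText itself the equivalent is passing pre-trained vectors to supervised training (the `-pretrainedVectors` option of the CLI).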
