The Wayback Machine - https://web.archive.org/web/20210831015218/https://github.com/topics/glove-embeddings
glove-embeddings

Here are 222 public repositories matching this topic...

Taking a pretrained GloVe model and using it as a TensorFlow embedding weight layer **inside the GPU**. You then only need to send word indices across the CPU-to-GPU data transfer bus, reducing data transfer overhead.
  • Updated Oct 13, 2018
  • Jupyter Notebook
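The pattern the description refers to can be sketched as follows: the pretrained vectors are baked into a frozen `tf.keras.layers.Embedding` layer, so the lookup runs on the device and only small integer indices cross the bus. This is a minimal sketch with a made-up four-word vocabulary and random stand-in vectors in place of a real `glove.*.txt` file:

```python
import numpy as np
import tensorflow as tf

# Hypothetical tiny vocabulary and embedding size for illustration.
vocab = {"<pad>": 0, "the": 1, "movie": 2, "great": 3}
embedding_dim = 4

# Stand-in for rows parsed from a pretrained GloVe text file.
embedding_matrix = np.random.rand(len(vocab), embedding_dim).astype("float32")

# Freeze the pretrained weights inside an Embedding layer so only
# integer word indices ever need to be transferred to the device.
embedding_layer = tf.keras.layers.Embedding(
    input_dim=len(vocab),
    output_dim=embedding_dim,
    embeddings_initializer=tf.keras.initializers.Constant(embedding_matrix),
    trainable=False,
)

indices = tf.constant([[1, 2, 3]])  # word indices, not dense vectors
vectors = embedding_layer(indices)  # lookup yields shape (1, 3, 4)
```

Setting `trainable=False` keeps the GloVe weights fixed during training; dropping that flag would fine-tune them along with the rest of the model.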

Using the IMDB data bundled with Keras, a few algorithms built with Keras. The source code is from François Chollet's book *Deep Learning with Python*. The aim is to predict whether a review is positive or negative just by analyzing the text. Both self-trained and pretrained (GloVe) word embeddings are used. Finally there is an LSTM model, and the accuracies of the different algorithms are compared. For the LSTM model I had to cut the data sets of 25,000 sequences by 80% to 5,000, since my laptop's CPU was not able to handle the data crunching, so the models are not fully comparable.
  • Updated Sep 27, 2018
  • Jupyter Notebook
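The pretrained-embedding step in Chollet's recipe parses a GloVe text file (one token followed by its vector components per line) into a dictionary, then fills an embedding matrix indexed by the tokenizer's word index, leaving missing words all-zero. A minimal sketch of that step, using an in-memory two-line stand-in instead of a real `glove.6B.*.txt` file and a hypothetical word index:

```python
import io
import numpy as np

# Two-line stand-in for a real GloVe file: "token v1 v2 ... vd" per line.
fake_glove = io.StringIO(
    "good 0.1 0.2 0.3\n"
    "bad -0.1 -0.2 -0.3\n"
)

# Parse each line into token -> vector.
embeddings_index = {}
for line in fake_glove:
    values = line.split()
    embeddings_index[values[0]] = np.asarray(values[1:], dtype="float32")

# Hypothetical tokenizer word index (index 0 is reserved for padding).
word_index = {"good": 1, "bad": 2, "unseenword": 3}
embedding_dim = 3

# Rows for words absent from GloVe stay all-zero.
embedding_matrix = np.zeros((len(word_index) + 1, embedding_dim))
for word, i in word_index.items():
    vector = embeddings_index.get(word)
    if vector is not None:
        embedding_matrix[i] = vector
```

The resulting matrix would then seed a frozen Keras `Embedding` layer, exactly as in the pretrained-GloVe variant the repository compares against the self-trained one.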
