NLP: Word Embeddings - Word2Vec and GloVe

In NLP, capturing the relationships between words and their context is a core challenge. Word embeddings address it by encoding semantic connections in dense vectors, which in turn strengthens downstream NLP models. This article presents an overview of word embeddings, with a focus on Word2Vec and GloVe.

Rahul S

It is advisable to read about Bag-of-Words and TF-IDF before diving into word embeddings. Here are the links:

To get an overall view first, you may also want to read this:

That said, this article should also work as a standalone read, since it starts from the basics.

1. CHALLENGES WITH TEXT-BASED MODELS

Suppose we are building a sentiment prediction model for text. Such a model typically relies on specific keywords to determine sentiment. A major challenge arises when a new sequence contains sentiment-bearing words that never appeared in the training dataset.

For instance, consider the sequence "The picture is awesome." The word "awesome" signals positive sentiment, but how does the model recognize this? How does it associate "awesome" with similar words like "good" and "excellent" that it may have seen during training? (A minimal sketch of the problem follows below.)
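To make this concrete, here is a minimal sketch of the problem using scikit-learn's CountVectorizer. The tiny training corpus is made up for illustration; the point is that a count-based representation gives the model literally zero signal for a word it never saw during training:

```python
from sklearn.feature_extraction.text import CountVectorizer

# Hypothetical training corpus: the model only ever sees
# "good" and "excellent" as positive keywords.
train_texts = ["the picture is good", "the movie is excellent"]

vectorizer = CountVectorizer()
X_train = vectorizer.fit_transform(train_texts)
print(vectorizer.get_feature_names_out())
# ['excellent' 'good' 'is' 'movie' 'picture' 'the']

# A new sequence with an unseen positive word:
X_new = vectorizer.transform(["the picture is awesome"])
print(X_new.toarray())
# [[0 0 1 0 1 1]] -- "awesome" maps to no column at all, so the
# model receives no evidence of sentiment from it.
```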

Text-based models do not automatically capture semantic relationships between words and their context. They…
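This gap is exactly what word embeddings fill. As a hedged sketch (assuming the gensim library is available; the first call downloads the pretrained vectors, roughly 66 MB), pretrained GloVe embeddings already place "awesome" near words like "amazing" and "good" in vector space, even if our sentiment model never saw those words together:

```python
import gensim.downloader as api

# Load 50-dimensional GloVe vectors trained on Wikipedia + Gigaword.
vectors = api.load("glove-wiki-gigaword-50")

# Cosine similarity: semantically related pairs score noticeably
# higher than unrelated ones.
print(vectors.similarity("awesome", "amazing"))
print(vectors.similarity("awesome", "good"))
print(vectors.similarity("awesome", "table"))

# Nearest neighbors of "awesome" in the embedding space.
print(vectors.most_similar("awesome", topn=3))
```

This is how a model can generalize from "good" and "excellent" to an unseen "awesome": in embedding space, the new word lands close to keywords the model already knows.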
