NLP: Word Embeddings - Word2Vec and GloVe

In NLP, understanding the intricate relationships between words and their context is a core challenge. By harnessing word embeddings, we can capture semantic connections between words and improve NLP models. This article presents an overview of word embeddings, with a focus on Word2Vec and GloVe.

Rahul S
10 min read · Aug 17


It is advisable to read about Bag-of-Words and TF-IDF before diving into word embeddings. Here are the links:

To get an overall view first, you should also read this:

But this article should also be useful as a standalone read, since it starts from the beginning.

1. CHALLENGES WITH TEXT-BASED MODELS

Suppose we are building a sentiment prediction model for text. Such a model typically relies on specific keywords to determine sentiment. A major challenge arises when a new sequence contains keywords that were not present in the training dataset.
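To make the failure mode concrete, here is a toy keyword-based scorer. This is a hypothetical illustration, not code from the article: it can only recognize words it was explicitly given, standing in for "words seen in training".

```python
# Toy keyword-based sentiment scorer (hypothetical illustration).
# It only knows the exact words listed below, i.e. "seen in training".
POSITIVE = {"good", "excellent", "great"}
NEGATIVE = {"bad", "terrible", "poor"}

def keyword_sentiment(text: str) -> str:
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "unknown"

print(keyword_sentiment("the movie was good"))       # positive
print(keyword_sentiment("the movie was fantastic"))  # unknown: word never seen
```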

For instance, consider the sequence “The picture is awesome.” While the word “awesome” indicates a positive sentiment, how does the model recognize this? How does it associate “awesome” with other similar words like “good” and “excellent”?
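To see how embeddings address this, here is a minimal sketch using gensim's pretrained-vector downloader. The library choice and the model name "glove-wiki-gigaword-50" (50-dimensional GloVe vectors distributed through gensim) are assumptions of this sketch, not something the article prescribes; exact similarity scores vary by model.

```python
# Minimal sketch, assuming gensim is installed (pip install gensim).
# The pretrained vectors are downloaded on first use.
import gensim.downloader as api

# 50-dimensional GloVe vectors trained on Wikipedia + Gigaword.
vectors = api.load("glove-wiki-gigaword-50")

# Embeddings place semantically related words close together in vector
# space, so "awesome" scores high against "good" and "excellent",
# and low against an unrelated word.
print(vectors.similarity("awesome", "good"))       # relatively high
print(vectors.similarity("awesome", "excellent"))  # relatively high
print(vectors.similarity("awesome", "umbrella"))   # much lower

# Nearest neighbors of "awesome" in the embedding space.
print(vectors.most_similar("awesome", topn=5))
```

This is exactly the association the keyword scorer above cannot make: even if "awesome" never appeared in training, its vector already sits near "good" and "excellent".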

Text-based models do not automatically capture the semantic relationships between words and their context. They…
