NLP: Word Embeddings - Word2Vec and GloVe
In NLP, capturing the relationships between words and their context is a core challenge. Word embeddings let us represent these semantic connections numerically and thereby improve NLP models. This article presents an overview of word embeddings, focusing on Word2Vec and GloVe.
It is advisable to read about Bag-of-Words and TF-IDF before going into word embeddings. Here are the links:
NLP: Explain Bag of Words.
The Bag of Words (BoW) model is a simple and widely used technique in natural language processing (NLP) for…
NLP: What is TF-IDF?
It is advisable to go through the basics of Bag of Words before delving into TF-IDF:
To get an overall view first, you should also read this:
NLP: Mathematizing Meaning and Context: Distributional Semantics and Contextualized Word…
This article explores the mathematization of meaning and context in NLP through distributional semantics and…
That said, this article should also work as a standalone read, since it starts from the basics.
1. CHALLENGES WITH TEXT-BASED MODELS
Suppose we are building a sentiment prediction model for text. Such a model typically relies on specific keywords to determine sentiment. A major challenge arises when a new sequence contains keywords that were not present in the training dataset.
For instance, consider the sequence “The picture is awesome.” The word “awesome” indicates a positive sentiment, but how does the model recognize this? How does it associate “awesome” with similar words like “good” and “excellent”?
Text-based models do not automatically capture semantic relationships between words and context. They…
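To make this gap concrete, here is a minimal sketch (using hypothetical, hand-set vectors rather than trained embeddings): under a one-hot, BoW-style representation, every word is orthogonal to every other, so “awesome” and “excellent” look completely unrelated, while dense embedding vectors can place similar words close together.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two vectors."""
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

# One-hot (BoW-style) vectors over a toy vocabulary:
# every word is orthogonal to every other word.
vocab = ["good", "excellent", "awesome"]
one_hot = {w: np.eye(len(vocab))[i] for i, w in enumerate(vocab)}
print(cosine_similarity(one_hot["awesome"], one_hot["excellent"]))  # 0.0

# Hypothetical dense embeddings (toy values, not trained):
# similar words can share directions in the vector space.
embedding = {
    "good":      np.array([0.90, 0.10, 0.00]),
    "excellent": np.array([0.85, 0.20, 0.05]),
    "awesome":   np.array([0.88, 0.15, 0.02]),
}
print(cosine_similarity(embedding["awesome"], embedding["excellent"]))  # ~0.997
```

In practice, such dense vectors are not set by hand but learned by models such as Word2Vec or GloVe, which is what the rest of this article discusses.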