From RNNs to GPT: A Comprehensive Guide To Recent Breakthroughs in Natural Language Processing — 1

Rahul S
15 min read · May 12, 2023

This guide is more conceptual than historical. I have also eschewed mathematical rigor to preserve readability.

credits: https://jalammar.github.io/illustrated-gpt2/

(1) NLP BEFORE RNN

Before Recurrent Neural Networks (RNNs) became prominent in Natural Language Processing (NLP), several techniques were used to process and analyze natural language data.

(1.1) Rule-Based Methods: In this approach, linguistic experts hand-crafted a set of rules based on the grammar, syntax, and semantics of the language. These rules were then used to identify and extract relevant information from text.

For example, consider the following rule: If a sentence contains a subject, a verb, and an object, then it is a declarative sentence.

This rule can be used to identify declarative sentences in a text.
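As a rough sketch of how such a rule might look in code, here is a toy implementation. The lexicon and tag names below are hypothetical stand-ins; real rule-based systems relied on large hand-built lexicons and far richer grammars.

```python
# Toy part-of-speech lexicon (hypothetical; real systems used
# large hand-built dictionaries covering the whole language).
LEXICON = {
    "the": "DET", "a": "DET",
    "cat": "NOUN", "dog": "NOUN", "ball": "NOUN",
    "chased": "VERB", "sat": "VERB",
}

def is_declarative(sentence):
    """Apply the rule: subject (noun) + verb + object (noun) => declarative."""
    tags = [LEXICON.get(word.lower(), "UNK") for word in sentence.split()]
    content = [t for t in tags if t != "DET"]  # ignore determiners
    return content == ["NOUN", "VERB", "NOUN"]

print(is_declarative("The dog chased the ball"))  # True
print(is_declarative("Sat the cat"))              # False
```

The brittleness is already visible: any word outside the lexicon, or any sentence pattern not anticipated by the rule writer, falls through the cracks.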

(1.2) Statistical Methods: Statistical methods were used to analyze large amounts of natural language data. These methods involved the use of probabilistic models and machine learning algorithms to identify patterns in the data. For example, the Naive Bayes algorithm was used for text classification tasks, such as spam filtering.
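To make the Naive Bayes idea concrete, here is a minimal from-scratch multinomial Naive Bayes classifier with add-one (Laplace) smoothing on a made-up four-message corpus. The training data is invented purely for illustration.

```python
import math
from collections import Counter, defaultdict

# Tiny labeled corpus (hypothetical examples).
train = [
    ("win money now", "spam"),
    ("free prize win", "spam"),
    ("meeting at noon", "ham"),
    ("lunch at noon tomorrow", "ham"),
]

# Count word frequencies per class and class priors.
word_counts = defaultdict(Counter)
class_counts = Counter()
for text, label in train:
    class_counts[label] += 1
    word_counts[label].update(text.split())

vocab = {w for counts in word_counts.values() for w in counts}

def classify(text):
    """Pick the class maximizing log P(class) + sum of log P(word | class),
    with add-one (Laplace) smoothing for unseen words."""
    best, best_score = None, float("-inf")
    for label in class_counts:
        score = math.log(class_counts[label] / sum(class_counts.values()))
        total = sum(word_counts[label].values())
        for w in text.split():
            score += math.log((word_counts[label][w] + 1) / (total + len(vocab)))
        if score > best_score:
            best, best_score = label, score
    return best

print(classify("win a free prize"))       # → spam
print(classify("lunch meeting at noon"))  # → ham
```

The model treats each message as an unordered bag of words, which is exactly the limitation the article returns to later: word order and long-range context are discarded entirely.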

(1.3) Information Retrieval (IR) Methods: IR methods were used to retrieve relevant information from large collections of text documents. These methods involved indexing the documents and then matching user queries against the index. For example, search engines like Google and Bing use IR methods to retrieve relevant web pages for a given query.
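The core data structure behind this indexing-then-matching step is the inverted index, which maps each term to the documents containing it. A minimal sketch, using an invented three-document collection and a boolean AND match (real engines add ranking, stemming, and much more):

```python
from collections import defaultdict

# Toy document collection (hypothetical examples).
docs = {
    1: "the cat sat on the mat",
    2: "the dog chased the cat",
    3: "dogs and cats make good pets",
}

def build_index(docs):
    """Map each term to the set of document IDs that contain it."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for term in text.lower().split():
            index[term].add(doc_id)
    return index

def search(index, query):
    """Return IDs of documents containing every query term (boolean AND)."""
    postings = [index.get(term, set()) for term in query.lower().split()]
    return set.intersection(*postings) if postings else set()

index = build_index(docs)
print(search(index, "the cat"))  # documents 1 and 2 contain both terms
```

Note that matching is purely lexical: "cats" in document 3 does not match the query term "cat", another symptom of these methods' inability to capture meaning.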

(1.4) Machine Translation: Machine translation is the task of automatically translating text from one language to another. Before the advent of RNNs, statistical machine translation (SMT) was the dominant approach. SMT used statistical models, estimated from parallel corpora, to learn how words and phrases in the source language map to the target language.
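In the spirit of early word-based SMT, here is a deliberately crude sketch: a table of translation probabilities, applied greedily word by word. The table entries and probabilities are invented, and real SMT systems also combined this with a target-language model and a reordering model, both omitted here.

```python
# Toy translation table: P(target word | source word).
# Probabilities are made up purely for illustration.
TRANSLATION_TABLE = {
    "le":   {"the": 1.0},
    "chat": {"cat": 0.9, "chat": 0.1},
    "noir": {"black": 0.8, "dark": 0.2},
}

def translate(sentence):
    """Greedy word-by-word translation: pick the most probable target
    for each source word, passing unknown words through unchanged."""
    out = []
    for word in sentence.lower().split():
        options = TRANSLATION_TABLE.get(word, {word: 1.0})
        out.append(max(options, key=options.get))
    return " ".join(out)

print(translate("le chat noir"))  # → "the cat black"
```

The output "the cat black" (rather than "the black cat") shows why word-by-word statistics alone were not enough: without a model of word order and context, even trivial sentences come out wrong.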

While these methods were effective for many tasks, they had some limitations, such as the inability to handle long-term dependencies in text and difficulty in capturing context and meaning. RNNs, with their ability to handle sequential data and capture context and meaning, addressed many of these limitations.
