Machine Learning: Cross Entropy and Cross-Entropy Loss

Rahul S
2 min read · Oct 2, 2023

Cross Entropy and Cross-Entropy Loss are closely related concepts, but they serve different purposes in the realm of probability theory and machine learning.

Cross Entropy:

Cross Entropy, at its core, is a measure of dissimilarity between two probability distributions: it quantifies how different one probability distribution is from another. For discrete distributions it is expressed as:

H(y, p) = -Σ_i y_i log(p_i)

In this expression:

  • y_i represents the true probability distribution (often one-hot encoded labels).
  • p_i represents the predicted probability distribution generated by a model.
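
As a rough illustration of the expression above, here is a minimal NumPy sketch; the helper name cross_entropy and the small clipping constant eps are illustrative choices, not something from the article:

```python
import numpy as np

def cross_entropy(y_true, y_pred, eps=1e-12):
    """Cross entropy H(y, p) = -sum_i y_i * log(p_i) between a true
    distribution y_true and a predicted distribution y_pred."""
    y_pred = np.clip(y_pred, eps, 1.0)  # guard against log(0)
    return -np.sum(y_true * np.log(y_pred))

# One-hot true label (class 1) and a model's predicted probabilities
y_true = np.array([0.0, 1.0, 0.0])
y_pred = np.array([0.1, 0.7, 0.2])

print(cross_entropy(y_true, y_pred))  # ~0.357, i.e. -log(0.7)
```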

Cross Entropy is a general concept from information theory and probability theory. It’s not exclusive to machine learning but has various applications in areas like information retrieval and statistics.

Cross Entropy captures how inefficiently the predicted distribution represents outcomes that actually follow the true distribution: the more the predicted probabilities diverge from the true probabilities, the larger the value.

Cross-Entropy Loss/Log Loss:

Cross-Entropy Loss, also known as “Log Loss,” is a specific application of Cross Entropy in machine learning: the same quantity, averaged over the training examples, is used as the objective that a classification model minimizes.
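
A minimal sketch of that batch-averaged form, assuming one-hot labels and predicted class probabilities (the function name and array shapes are illustrative, not from the article):

```python
import numpy as np

def cross_entropy_loss(y_true_onehot, y_pred_probs, eps=1e-12):
    """Mean cross entropy (log loss) over a batch.
    y_true_onehot: (n_samples, n_classes) one-hot labels
    y_pred_probs:  (n_samples, n_classes) predicted probabilities"""
    p = np.clip(y_pred_probs, eps, 1.0)  # guard against log(0)
    per_sample = -np.sum(y_true_onehot * np.log(p), axis=1)
    return per_sample.mean()

y_true = np.array([[1.0, 0.0, 0.0],
                   [0.0, 1.0, 0.0]])
y_pred = np.array([[0.8, 0.1, 0.1],
                   [0.2, 0.6, 0.2]])

print(cross_entropy_loss(y_true, y_pred))  # ~0.367
```

In practice, libraries provide numerically stable versions of the same idea, such as sklearn.metrics.log_loss for predicted probabilities or torch.nn.CrossEntropyLoss, which operates on raw logits rather than probabilities.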
