Entropy, cross entropy and KL divergence are commonly used quantities in deep learning. What is the relation among them? In this tutorial, we will discuss how they are related to help you understand them easily.
Entropy:
The entropy of a discrete random variable X with distribution p(x) is defined as:

H(X) = -Σ p(x) * log p(x)

It measures the average uncertainty of X: the more unpredictable X is, the larger H(X) is.

Note: H(X) ≥ 0
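To make the definition concrete, here is a minimal NumPy sketch. The function name entropy and the small eps smoothing constant are our own choices for illustration, not part of the original tutorial.

```python
import numpy as np

def entropy(p, eps=1e-12):
    """Shannon entropy H(X) = -sum p(x) * log p(x), computed in nats."""
    p = np.asarray(p, dtype=np.float64)
    return -np.sum(p * np.log(p + eps))

# The uniform distribution is the most uncertain one over four outcomes.
print(entropy([0.25, 0.25, 0.25, 0.25]))  # ~1.386 (= log 4)
print(entropy([0.9, 0.05, 0.03, 0.02]))   # ~0.43, much less uncertain
```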
Cross Entropy:
Cross entropy measures the difference between a true distribution p and an estimated distribution q:

H(p, q) = -Σ p(x) * log q(x)

Note: H(p, q) ≥ 0
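A minimal sketch of this formula in NumPy follows; treating p as a one-hot label and q as a model prediction is our own example, not from the original.

```python
import numpy as np

def cross_entropy(p, q, eps=1e-12):
    """Cross entropy H(p, q) = -sum p(x) * log q(x), computed in nats."""
    p = np.asarray(p, dtype=np.float64)
    q = np.asarray(q, dtype=np.float64)
    return -np.sum(p * np.log(q + eps))

p = [1.0, 0.0, 0.0]         # true (one-hot) label distribution
q = [0.7, 0.2, 0.1]         # model's predicted distribution
print(cross_entropy(p, q))  # ~0.357, i.e. -log(0.7)
```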
KL Divergence:

KL divergence (also called relative entropy) measures how an estimated distribution Q differs from a true distribution P:

DKL(P||Q) = Σ P(x) * log(P(x) / Q(x))

Note: DKL(P||Q) ≥ 0, and DKL(P||Q) = 0 only when P = Q
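Here is a small NumPy sketch of this definition; the function name kl_divergence and the example distributions are our own choices for illustration.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """KL divergence DKL(P||Q) = sum P(x) * log(P(x) / Q(x)), in nats."""
    p = np.asarray(p, dtype=np.float64)
    q = np.asarray(q, dtype=np.float64)
    return np.sum(p * np.log((p + eps) / (q + eps)))

p = [0.4, 0.6]
q = [0.5, 0.5]
print(kl_divergence(p, q))  # ~0.020; it is 0 only when p and q are identical
```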
The relation among them

For the same true distribution p and estimated distribution q, the three quantities are connected by:

H(p, q) = H(p) + DKL(p||q)

In other words, cross entropy equals the entropy of p plus the KL divergence from p to q. Since H(p) does not depend on q, minimizing cross entropy with respect to q is equivalent to minimizing the KL divergence, which is why cross entropy is used as a loss function in deep learning.
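The check below verifies this identity numerically on an arbitrary pair of distributions; the helper functions repeat the sketches above and use our own naming.

```python
import numpy as np

def entropy(p, eps=1e-12):
    p = np.asarray(p, dtype=np.float64)
    return -np.sum(p * np.log(p + eps))

def cross_entropy(p, q, eps=1e-12):
    p = np.asarray(p, dtype=np.float64)
    q = np.asarray(q, dtype=np.float64)
    return -np.sum(p * np.log(q + eps))

def kl_divergence(p, q, eps=1e-12):
    p = np.asarray(p, dtype=np.float64)
    q = np.asarray(q, dtype=np.float64)
    return np.sum(p * np.log((p + eps) / (q + eps)))

p = [0.1, 0.4, 0.5]   # true distribution
q = [0.8, 0.1, 0.1]   # estimated distribution
print(cross_entropy(p, q))               # ~2.095
print(entropy(p) + kl_divergence(p, q))  # same value: H(p) + DKL(p||q)
```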