Perplexity is a measure of uncertainty or unpredictability in a probability distribution. It can be understood as the exponentiation of entropy, which measures the average amount of information needed to describe the outcome of a random variable.

In summary:

  • Perplexity = 2^entropy, a measure of uncertainty.
  • Information gain = reduction in entropy (uncertainty) upon observing the variable.
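The relationship above can be sketched in a few lines of Python (the function names are illustrative, not from any particular library):

```python
import math

def entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def perplexity(probs):
    """Perplexity = 2 ** entropy (base 2, matching entropy in bits)."""
    return 2 ** entropy(probs)

# A uniform distribution over 4 outcomes has entropy 2 bits,
# so its perplexity is 4: as uncertain as 4 equally likely choices.
uniform = [0.25, 0.25, 0.25, 0.25]
skewed = [0.7, 0.1, 0.1, 0.1]
print(perplexity(uniform))  # 4.0
print(perplexity(skewed))   # lower, since the skewed distribution is more predictable
```

Perplexity can be read as the effective number of equally likely outcomes: observing the variable (the information-gain bullet above) shrinks entropy, and perplexity shrinks with it.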
