perplexity

Perplexity is a measure used in natural language processing to quantify how well a probabilistic language model predicts a given sequence of words. It reflects the model's uncertainty about its predictions: lower perplexity means the model assigns higher probability to the observed text, which indicates better predictive performance.
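
To make the idea concrete, here is a minimal sketch of how perplexity can be computed from the per-token probabilities a model assigns to a sequence: it is the exponentiated average negative log-likelihood. The function name and the example probability lists below are illustrative, not taken from any particular library.

```python
import math

def perplexity(token_probs):
    """Perplexity of a sequence given the model's probability for each token.

    token_probs: list of P(token_i | context) values assigned by the model.
    Perplexity = exp( -(1/N) * sum(log p_i) ), the exponentiated average
    negative log-likelihood of the sequence.
    """
    n = len(token_probs)
    avg_neg_log_likelihood = -sum(math.log(p) for p in token_probs) / n
    return math.exp(avg_neg_log_likelihood)

# Hypothetical per-token probabilities: a model that assigns high probability
# to each observed token is less "surprised", so its perplexity is lower.
confident = [0.9, 0.8, 0.95, 0.85]
uncertain = [0.2, 0.1, 0.3, 0.25]
print(perplexity(confident))   # ~1.15 (low perplexity: good predictions)
print(perplexity(uncertain))   # ~5.1  (high perplexity: poor predictions)
```

A perplexity of k can be read as the model being, on average, as uncertain as if it were choosing uniformly among k equally likely next tokens.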
