Shannon entropy

Shannon entropy is a measure of the uncertainty or information content of data or a message, computed from the probability distribution of its possible outcomes or symbols. It is named after the mathematician Claude Shannon, who developed the concept as a foundation of information theory. The higher the entropy, the less predictable the data is and the more bits are needed to represent it efficiently. Entropy is used to analyze the redundancy, compressibility, and security of data, and to quantify the randomness of a system or process.
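As a minimal sketch of how this is computed in practice (the function name and sample strings below are illustrative, not from the original text), the entropy in bits per symbol can be estimated from a sequence's empirical symbol frequencies using H = -sum over x of p(x) * log2 p(x):

```python
import math
from collections import Counter

def shannon_entropy(data):
    """Estimate Shannon entropy in bits per symbol: H = -sum(p * log2(p))."""
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A uniform distribution over 4 symbols reaches the maximum of 2 bits per symbol,
# while a highly repetitive string has much lower entropy.
print(shannon_entropy("abcd"))      # 2.0
print(shannon_entropy("aaaaaaab"))  # ~0.544
```

The repetitive string needs far fewer bits per symbol to encode, which is why entropy is a useful bound on how much a message can be compressed.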
