entropy

Entropy is a measure of the degree of disorder or randomness in a system. In thermodynamics, it quantifies the energy in a system that is unavailable to do useful work; in information theory, it quantifies the average uncertainty, or information content, of a source. The higher the entropy, the greater the disorder and the less useful work the system can perform.
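The information-theoretic form of this idea can be made concrete with a small sketch. The function below (a hypothetical helper, not from the original entry) computes Shannon entropy in bits for a discrete probability distribution: a uniform distribution is maximally disordered, while a skewed one is more predictable and so carries less entropy.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally disordered for two outcomes: 1 bit of entropy.
print(shannon_entropy([0.5, 0.5]))

# A heavily biased coin is far more predictable, so its entropy is lower.
print(shannon_entropy([0.9, 0.1]))
```

A fair coin yields exactly 1.0 bit, while the 90/10 coin yields roughly 0.47 bits, illustrating the link between randomness and entropy stated above.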
