entropy
Entropy is a measure of the degree of disorder or randomness in a system. In thermodynamics, it quantifies the portion of a system's energy that is unavailable for doing useful work; in information theory, it quantifies the average uncertainty, or missing information, about a system's state. The higher the entropy, the greater the disorder and the less useful work the system can perform.
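As a concrete illustration of the information-theoretic side (a minimal sketch, not part of the original entry), the Shannon entropy of a discrete distribution is H = -Σ p(x) log₂ p(x), measured in bits. The distributions below are hypothetical examples chosen to show that a more predictable system has lower entropy:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)) in bits for a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain for two outcomes: H = 1 bit.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, so its entropy is lower.
print(shannon_entropy([0.9, 0.1]))   # ~0.469
```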
Related Concepts (14)
- black hole entropy
- boltzmann's entropy formula
- decision trees
- entropic force
- entropy in information theory
- entropy in quantum field theory
- entropy in statistical mechanics
- gibbs entropy formula
- maximum entropy principle
- shannon entropy
- statistical mechanics
- thermodynamic entropy
- thermodynamics
- transition to disorder