Entropy is a concept applied across physics, information theory, mathematics and other branches of science and engineering. The following definition is shared across all these fields:

S = −k Σᵢ pᵢ ln pᵢ

where S is the conventional symbol for entropy. The sum runs over all microstates consistent with the given macrostate, and pᵢ is the probability of the ith microstate. The constant of proportionality k depends on what units are chosen to measure S. When SI units are chosen, we have k = kB = Boltzmann's constant = 1.38066×10⁻²³ J K⁻¹. If units of bits are chosen, then k = 1/ln(2) so that S = −Σᵢ pᵢ log₂ pᵢ.
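The definition above can be sketched as a short computation. This is an illustrative sketch, not from the original text; the function name `entropy` and the example distribution are assumptions.

```python
import math

def entropy(probs, k=1.0):
    """Entropy S = -k * sum(p_i * ln(p_i)) over microstate probabilities.

    probs: probabilities of the microstates (should sum to 1).
    k: constant of proportionality -- Boltzmann's constant for S in J/K,
       1/ln(2) for S in bits (illustrative parameter, not from the source).
    """
    # Terms with p = 0 contribute nothing (p ln p -> 0 as p -> 0).
    return -k * sum(p * math.log(p) for p in probs if p > 0)

# Four equally likely microstates: in bits, S = log2(4) = 2.
bits = entropy([0.25] * 4, k=1 / math.log(2))
print(bits)
```

A single certain microstate (p = 1) gives S = 0, the minimum; the uniform distribution maximizes S for a fixed number of microstates.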
Entropy is central to the second law of thermodynamics. The second law in conjunction with the fundamental thermodynamic relation places limits on a system's ability to do useful work.
The second law can also be used to predict whether a physical process will proceed spontaneously. Spontaneous changes in isolated systems occur with an increase in entropy.
The word "entropy" is derived from the Greek εντροπία "a turning towards" (εν- "in" + τροπή "a turning").