Entropy

From SklogWiki
Revision as of 15:32, 4 January 2012 by Carl McBride (talk | contribs) (Reverted edits by 70.135.118.126 (talk) to last revision by Carl McBride)
"Energy has to do with possibilities. Entropy has to do with the probabilities of those possibilities happening. It takes energy and performs a further epistemological step."
Constantino Tsallis [1]

Entropy was first described by Rudolf Julius Emanuel Clausius in 1865 [2]. The statistical mechanical description is due to Ludwig Eduard Boltzmann (Ref. ?).

Classical thermodynamics

In classical thermodynamics one has the entropy, $S$,

$$dS = \frac{\delta Q}{T}$$

where $\delta Q$ is the heat and $T$ is the temperature.
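As a quick illustration of the classical definition, for a reversible process at constant temperature the relation integrates to $\Delta S = Q/T$. The sketch below uses the melting of ice as a worked example; the numbers (latent heat of fusion, melting temperature) are standard textbook values, not taken from this article.

```python
# Entropy change for a reversible isothermal process:
# dS = delta Q / T integrates to Delta S = Q / T at constant T.
# Example numbers (melting 1 kg of ice) are illustrative.
Q = 334e3      # heat absorbed on melting 1 kg of ice, J (latent heat of fusion)
T = 273.15     # melting temperature of ice, K
delta_S = Q / T
print(f"Delta S = {delta_S:.1f} J/K")  # about 1222.8 J/K
```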

Statistical mechanics

In statistical mechanics entropy is defined by

$$S = -k_B \sum_m p_m \ln p_m$$

where $k_B$ is the Boltzmann constant, $m$ is the index for the microstates, and $p_m$ is the probability that microstate $m$ is occupied. In the microcanonical ensemble this gives:

$$S = k_B \ln \Omega$$

where $\Omega$ (sometimes written as $W$) is the number of microscopic configurations that result in the observed macroscopic description of the thermodynamic system. This equation provides a link between classical thermodynamics and statistical mechanics.
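The two statistical formulas above can be checked against each other numerically: when all $\Omega$ microstates are equally likely, $p_m = 1/\Omega$ and the Gibbs sum collapses to $k_B \ln \Omega$. A minimal sketch (the function name and example distributions are illustrative):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def gibbs_entropy(p):
    """S = -k_B * sum_m p_m ln p_m over microstate probabilities p_m."""
    return -k_B * sum(p_m * math.log(p_m) for p_m in p if p_m > 0.0)

# Microcanonical ensemble: Omega equally likely microstates, p_m = 1/Omega,
# so the sum reduces to S = k_B ln Omega.
Omega = 1000
p_uniform = [1.0 / Omega] * Omega
assert math.isclose(gibbs_entropy(p_uniform), k_B * math.log(Omega))

# Any non-uniform distribution over the same microstates has lower entropy.
p_biased = [0.5] + [0.5 / (Omega - 1)] * (Omega - 1)
assert gibbs_entropy(p_biased) < gibbs_entropy(p_uniform)
```

The check also illustrates why the microcanonical case is the maximum-entropy distribution over a fixed set of microstates.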

Arrow of time

Articles:

Books:

  • Steven F. Savitt (Ed.) "Time's Arrows Today: Recent Physical and Philosophical Work on the Direction of Time", Cambridge University Press (1997) ISBN 0521599458
  • Michael C. Mackey "Time's Arrow: The Origins of Thermodynamic Behavior" (1992) ISBN 0486432432
  • Huw Price "Time's Arrow and Archimedes' Point: New Directions for the Physics of Time" Oxford University Press (1997) ISBN 978-0-19-511798-1

See also:

References

Related reading

External links