Hi Eric ( fsegg_uaf_edu ),
To this comment of mine: <<
Linux, newsreaders, and physics are background topics, I say. >>
You replied: <<
Stupid relf continues to be willfully ignorant of accepted netiquette.
Why do you continue posting IRRELEVANT BULLSHIT to sci.physics? >>
If you'd just take the time to check out GuruNet,
you'd see that it has plenty of physics terms in it.
But if you do try GuruNet, before Alt-Clicking a word,
be sure to highlight the word you want the definition of
( or Ctrl-Middle-Click, as I've set it up, to avoid conflicts )
...They don't tell you to do that, but it is required.
GuruNet works from any WinXP application, including Visual Studio.NET.
It's a totally free on-line dictionary/encyclopedia/language-translator,
but you can get some extra physics-specific stuff if you pay them.
Here is one of their ( free ) definitions of entropy
( I put numbers on a few of its claims after the quote ): <<
entropy, quantity specifying the amount of disorder or randomness
in a system bearing energy or information.
Originally defined in thermodynamics in terms of heat and temperature,
entropy indicates the degree to which a given quantity of thermal energy
is available for doing useful work;
the greater the entropy, the less available the energy.
For example, consider a system composed of a hot body and a cold body;
this system is ordered because the faster, more energetic molecules
of the hot body are separated from
the less energetic molecules of the cold body.
If the bodies are placed in contact,
heat will flow from the hot body to the cold one.
This heat flow can be utilized by a heat engine
( device which turns thermal energy into mechanical energy, or work ),
but once the two bodies have reached the same temperature,
no more work can be done. Furthermore,
the combined lukewarm bodies cannot unmix themselves
into hot and cold parts in order to repeat the process.
Although no energy has been lost by the heat transfer,
the energy can no longer be used to do work.
Thus the entropy of the system has increased.
According to the second law of thermodynamics,
during any process the change in entropy of a system
and its surroundings is either zero or positive.
In other words the entropy of the universe as a whole
tends toward a maximum.
This means that although energy cannot vanish
because of the law of conservation of energy ( see conservation laws ),
it tends to be degraded from useful forms to useless ones.
It should be noted that
the second law of thermodynamics is statistical rather than exact;
thus there is nothing to prevent
the faster molecules from separating from the slow ones. However,
such an occurrence is so improbable as to be
impossible from a practical point of view.
In information theory the term entropy is used to represent
the sum of the predicted values of the data in a message. >>
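A couple of those claims are easy to put numbers on.
For the hot-body/cold-body example: when heat Q leaks from a body
at temperature Thot to one at Tcold, the hot body's entropy drops
by Q/Thot while the cold body's rises by the larger amount Q/Tcold,
so the total always goes up. A minimal Python sketch
( the function name, the 1000 J, and the 400 K / 300 K temperatures
are my own made-up illustrations, and I'm assuming both bodies are
big enough to stay at a fixed temperature ):

  # Net entropy change (J/K) when heat q_joules flows from a hot
  # reservoir at t_hot kelvin to a cold one at t_cold kelvin.
  # Simplification: both bodies stay at fixed temperature.
  def entropy_change(q_joules, t_hot, t_cold):
      ds_hot  = -q_joules / t_hot    # hot body loses entropy
      ds_cold = +q_joules / t_cold   # cold body gains more than that
      return ds_hot + ds_cold

  print(entropy_change(1000.0, 400.0, 300.0))  # ~ +0.83 J/K, positive

Once the bodies equalize ( Thot == Tcold ), the same function
returns zero: no further entropy gain, and no work left to extract.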
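On the << statistical rather than exact >> point: nothing forbids,
say, all the molecules of a gas drifting into one half of their box,
but for N molecules the probability is (1/2)^N, which collapses to
nothing at realistic N. Another sketch ( the molecule counts are
just illustrations ):

  import math

  # log10 of the probability that all n molecules sit in one half.
  def log10_prob_all_left(n):
      return n * math.log10(0.5)

  print(log10_prob_all_left(10))       # ~ -3: one chance in a thousand
  print(log10_prob_all_left(100))      # ~ -30
  print(log10_prob_all_left(6.02e23))  # ~ -1.8e23: never, in practice

That last number is why the quote calls such an unmixing
<< impossible from a practical point of view >>.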
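The quote's last sentence is its one loose bit: in Shannon's
information theory, entropy is the expected
( probability-weighted average ) information content per symbol,
H = -sum( p * log2(p) ), not a plain sum of << predicted values >>.
A sketch of that formula ( the probability lists are just examples ):

  import math

  # Average information content of a source, in bits per symbol.
  def shannon_entropy(probs):
      return -sum(p * math.log2(p) for p in probs if p > 0)

  print(shannon_entropy([0.5, 0.5]))   # 1.0 bit: a fair coin
  print(shannon_entropy([0.9, 0.1]))   # ~0.47 bits: a biased coin
  print(shannon_entropy([0.25] * 4))   # 2.0 bits: four equal choices

The lower the entropy of a source, the more predictable ( and more
compressible ) its messages, which hints at why the two uses of the
word share a name.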