Hi, in continuation of kenosis' comments (above): first, I'm going to add a short referenced sentence, as per kenosis' suggestion, about the recent incorporation of the "energy dispersal" perspective. Second, to help defuse this issue (noting, for example, that "order and disorder" are hot topics in the talk page archives, such as Talk:Entropy/Archive3#Disorder, and in current discussions), I'm going to start an order and disorder header section (with a link to its own article: entropy (order and disorder)). Third, I collected the following stats, via Google search results, to help clarify the prevalence of the terms:
I hope this clarifies things. I suggest that we work together to build the entropy (order and disorder) article (where we put most of the chaos discussion) over the next few weeks and put a short intro paragraph with a "see main" link on the entropy page. I'll take a chunk out of the statistical perspective section to give it a quick start and we can build on this. Is this fine with all? -- Sadi Carnot 10:32, 31 October 2006 (UTC)
In thermodynamics it is important to distinguish strictly between heat and work, even though the first law establishes their equivalence. This distinction is only possible if one appeals to the existence of adiabatic walls. These are supposed to have no thermodynamic properties of their own, and yet to prevent the equalization of temperature differences. We will later need similar fictions (e.g. ideal anti-catalysts). One cannot really appeal to experience for their existence, which is why Pauli rightly called them "magic devices" ("Zaubermittel"). Only in statistical mechanics are these magic devices no longer needed. There, heat is that portion of the energy which must be ascribed to the macroscopically unobserved degrees of freedom. What is observed macroscopically, however, depends essentially on the available means of observation. Strictly speaking, therefore, heat, and with it entropy (see below), is defined in statistical mechanics only relative to a macroscopic observer.
Phenomenological thermodynamics is rightly felt, time and again, by lecturers and students alike to be a discipline that is mathematically undemanding but conceptually quite subtle. Moreover, in its whole structure this theory stands apart from the other fields of theoretical physics. For most students it is sensible not to invest too much time in the study of thermodynamics and to turn as quickly as possible to statistical mechanics.
Hi Pjacobi, my German is in need of improvement (I hope to be semi-fluent before I go to Oktoberfest some day). Here's one of my favorite quotes, from the chemical thermodynamics page, which exemplifies my perspective. In the preface to the popular book Basic Chemical Thermodynamics by physical chemist Brian Smith, originally published in 1973 and now in its 5th edition, we find the following overview of the subject as it is perceived in college: [1]
“The first time I heard about chemical thermodynamics was when a second-year undergraduate brought me the news early in my freshman year. He told me a spine-chilling story of endless lectures with almost three-hundred numbered equations, all of which, it appeared, had to be committed to memory and reproduced in exactly the same form in subsequent examinations. Not only did these equations contain all the normal algebraic symbols but in addition they were liberally sprinkled with stars, daggers, and circles so as to stretch even the most powerful of minds.”
Also, I gave the new "order and disorder" header section a good start (and added seven new sources and uploaded two images); we can all now build it. For example, I'm pretty sure that Hermann von Helmholtz and possibly others had some ideas on order and disorder. Then later, when it gets too big, we can paste most of it to its own page. Out of time for today. Happy Halloween! -- Sadi Carnot 13:57, 31 October 2006 (UTC)
References
1. Smith, Brian. Basic Chemical Thermodynamics, 5th ed. (first published 1973).
I've revised this section a bit to clarify the process. However, the bit about the second law seems a bit dodgy as magnetic energy clearly goes through the insulated chamber – perhaps you could reconsider that bit? .. dave souza, talk 11:16, 17 November 2006 (UTC)
Although it wouldn't have the practical application of ultra-cooling, the relevance to entropy might be more clear if magneto-caloric effects were discussed in an isothermal context (temperature held constant) rather than an adiabatic context (insulated - no heat flow). In an isothermal context, heat will flow into or out of the sample to maintain the temperature as the magnetic field is changed. This heat flow can be related directly to the definition of entropy (heat flow divided by temperature). Compbiowes 22:35, 10 January 2007 (UTC)
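The relation Compbiowes describes (entropy change as heat flow divided by temperature in an isothermal process) can be sketched numerically. A minimal example, with invented numbers; the function name and values are illustrative, not taken from any source:

```python
# Entropy change in an isothermal process: dS = dQ_rev / T.

def isothermal_entropy_change(heat_into_sample_J, temperature_K):
    """Return dS = Q / T for heat Q exchanged reversibly at constant T."""
    if temperature_K <= 0:
        raise ValueError("absolute temperature must be positive")
    return heat_into_sample_J / temperature_K

# Hypothetical numbers: 0.3 J flows OUT of the sample at 4.2 K while the
# magnetic field is raised isothermally (the spins order, releasing heat).
dS = isothermal_entropy_change(-0.3, 4.2)
print(f"entropy change = {dS:.4f} J/K")  # negative: the sample became more ordered
```

The sign convention makes the point directly: heat leaving the sample at constant temperature means the sample's entropy drops, exactly the link between heat flow and entropy that the isothermal framing exposes.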
This section is giving too much prominence to the hobbyhorse of a few chemistry teachers. This Frank L. Lambert, in particular, is just someone with a bee in his bonnet. The entire section should just be removed. 98.109.238.95 ( talk) 19:32, 24 June 2013 (UTC)
Cf. Talk:Entropy/Archive4. 98.109.238.95 ( talk) 20:59, 24 June 2013 (UTC)
I would just like to know what proportion of standard textbooks published by reputable publishers (not alphascript, the only one that shows up on page one of google) use this definition. For various reasons, I was recently looking at Fermi, Pauli, Fowler, Jeans, Planck, von Plato, and Ford, and not one uses it. Neither does Thermodynamics Demystified, written by a chemist. NPOV requires that a point of view be given its due weight. If the proportion I was asking for is very small, this hobbyhorse should not be included at all. 98.109.238.95 ( talk) 22:34, 24 June 2013 (UTC)
This entire Wikipedia page seems to be devoted to a controversy rather than scientific exposition. Perhaps it should be retitled to reflect that? Or -- since it is pretty muddy -- just thrown away?
In the introductory paragraphs:
1) ...entropy is commonly associated with the amount of order, disorder, or chaos...
Not what it is, but what people think about it.
2) This stems from Rudolf Clausius' 1862 assertion...
For the next paragraph, we fly through a historical development rather than giving a definition.
3) In the 2002 encyclopedia Encarta, for example, entropy is defined as a thermodynamic property which serves as a measure of how close a system is to equilibrium, as well as a measure of the disorder in the system.
This is not a definition either.
4) In the context of entropy, "perfect internal disorder" is synonymous with "equilibrium", but since that definition is so far different from the usual definition implied in normal speech, the use of the term in science has caused a great deal of confusion and misunderstanding.
What definition? Do you mean the "synonymy" of perfect internal disorder with equilibrium? That is not a definition in a scientific sense; I cannot even tell which term is intended here to be the definiendum.
5) Locally, the entropy can be lowered by external action. This applies to machines, such as a refrigerator, where the entropy in the cold chamber is being reduced, and to living organisms. This local decrease in entropy is, however, only possible at the expense of an entropy increase in the surroundings.
This instructive commentary seems entirely out of place in the introduction. You have not even told the reader that entropy increases, and in any case, what does this have to do with entropy as disorder? Something, I'm sure, but it hasn't been said here.
The whole article continues in this way. Spongy.
The article may have one positive function: to serve as a dump site for these kinds of controversies while other articles get freed of them. 84.227.245.245 ( talk) 10:51, 15 September 2014 (UTC)
The overview states "If you have more than one particle, or define states as being further locational subdivisions of the box, the entropy is lower because the number of states is greater." Surely that lower should be higher (and maybe it should say microstates)? Terrycojones ( talk) 13:16, 14 February 2016 (UTC)
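For what it's worth, the suggested correction agrees with Boltzmann's formula S = k_B ln W: entropy grows with the number of accessible microstates, never shrinks. A quick sketch (the W values here are arbitrary):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def boltzmann_entropy(num_microstates):
    """S = k_B * ln(W) for a system with W equally likely microstates."""
    return K_B * math.log(num_microstates)

# More accessible microstates means higher entropy, never lower.
s_few = boltzmann_entropy(10)
s_many = boltzmann_entropy(1000)
print(f"W = 10   -> S = {s_few:.3e} J/K")
print(f"W = 1000 -> S = {s_many:.3e} J/K")
```

Since ln is strictly increasing, a greater number of states always yields greater entropy, which is why "lower" in the quoted sentence must be a mistake.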
Part of the confusion regarding entropy and disorder is that we frequently only consider structural or spatial ordering (phase space) when analyzing the entropy of a system. A good example of where this can lead to mistakes is the entropy of helium-3 in its liquid and solid phases. If one only considers the spatial degrees of freedom of a quantity of helium-3, one would be led to believe that the liquid phase was more disordered (greater entropy) than the solid phase, since the size of the spatial phase space of the liquid phase is greater than that of the solid phase.
But that is not always the case. Solid helium-3 can have greater entropy than liquid helium-3 due to the greater spin disorder (greater size of the spin phase space) of solid helium-3 than liquid helium-3. In other words, when you consider both the degree of disorder of the spins and the degree of disorder of the positions, the solid phase has more overall disorder than the liquid phase, even though the former has less positional/spatial disorder than the latter.
For a good discussion of this phenomenon see https://physics.stackexchange.com/questions/410550/why-is-the-latent-heat-of-fusion-of-helium-3-a-negative-quantity . For the source of the quote (made in a discussion of the same phenomenon), see https://books.google.com/books/about/First_Order_Phase_Transitions_of_Magneti.html?id=rgZADwAAQBAJ . Here is the context of the quote:
Entropy has contributions from all phase space, and disorder is not solely about structural disorder. All liquids are more [structurally NLG] disordered than their corresponding solid phases in terms of [the greater volume of their positional/spatial phase space determined by NLG] their positions in real space, and this governs the common observation that solids melt on heating. Nevertheless, liquid helium-3 is a quantum Fermi liquid and is highly ordered in momentum [phase NLG] space [which is independent of positional/spatial phase space NLG]. The entropy of the liquid [due to momentum phase space NLG] varies approximately linearly with T as [(T/Tf*)ln2], where Tf* is the effective Fermi temperature. On the other hand, the spins in solid helium-3 are not aligned until it is cooled to about 1 mK, and its [momentum space NLG] entropy is approximately constant at Rln2 [where R is the gas constant == Boltzmann's constant per mole NLG] above about 10 mK.
...
The statement "increasing the temperature causes a transition to the phase with higher entropy" has no known exception.
...
Entropy depends on the total disorder and not just on structural disorder.
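Plugging the quoted low-temperature forms into numbers makes the inversion concrete: liquid S ≈ R (T/Tf*) ln 2, solid S ≈ R ln 2 above about 10 mK. The value of Tf* below is my own illustrative assumption, not a figure from the source:

```python
import math

R = 8.314        # gas constant, J/(mol*K)
T_F_STAR = 0.3   # effective Fermi temperature Tf* in K (illustrative guess)

def entropy_liquid_he3(T):
    """Quoted Fermi-liquid form: S ~ R * (T / Tf*) * ln 2."""
    return R * (T / T_F_STAR) * math.log(2)

def entropy_solid_he3():
    """Quoted spin entropy of the solid above ~10 mK: S ~ R * ln 2."""
    return R * math.log(2)

# At 50 mK the spin-disordered solid carries MORE entropy than the
# momentum-ordered liquid, even though the liquid is more disordered spatially.
T = 0.05
print(f"liquid: {entropy_liquid_he3(T):.3f} J/(mol*K)")
print(f"solid:  {entropy_solid_he3():.3f} J/(mol*K)")
```

For any T below Tf* the liquid's entropy falls below R ln 2, so the solid wins on total disorder once spin degrees of freedom are counted.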
I think this example would help illuminate the critical point that there are different kinds of disorder/degrees of freedom/phase spaces, and all of them must be analyzed when measuring entropy. -- Nick ( talk) 17:44, 4 September 2019 (UTC)