
Talk:Entropy/Archive 13


deeply muddled

The lead of this article is deeply muddled in concept. It needs radical revision. A word or two here or there are as drops in the ocean. Chjoaygame (talk) 16:19, 26 June 2014 (UTC)

Agreed. W. P. Uzer (talk) 17:12, 26 June 2014 (UTC)

How about..

In thermodynamics, entropy (usual symbol S) is a measure of the disorder of an identified system of particles with a given total energy. It directly relates to the probability that the system is in a particular state compared with all possible states or arrangements. According to the second law of thermodynamics, the entropy of real isolated systems always increases; such systems spontaneously decay towards thermodynamic equilibrium, the configuration with maximum entropy or disorder. The result of increasing entropy is to dissipate energy evenly throughout the system, so pressure, temperature, chemical potential, mechanical energy (motion) etc. get smoothed out. The total energy of the system is not changed, but the quality, or ability of the system to do work, is reduced, ultimately to zero at equilibrium. Theoretical ideal systems with no energy conversion to heat (due, say, to friction) maintain constant entropy. Systems which are not isolated may decrease in entropy when subject to an input which has the effect of creating order, like the freezer box in a refrigerator. Since entropy is a state function, the change in the entropy of a system is the same for any process going from a given initial state to a given final state, whether the process is reversible (ideal) or irreversible (real). However, irreversible processes increase the combined entropy of the system and its environment.

Gyroman (talk) 05:07, 28 June 2014 (UTC)
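A compact formal restatement of the proposal's closing sentences, added here as a hedged sketch (standard notation: S entropy, δQ heat received by the system, T absolute temperature; none of this wording is Gyroman's):

```latex
% Entropy is a state function: its change depends only on the endpoints,
% evaluated along any reversible path joining them.
\Delta S \;=\; S_{\text{final}} - S_{\text{initial}}
        \;=\; \int \frac{\delta Q_{\mathrm{rev}}}{T}
% A reversible (ideal) process in an isolated system keeps S constant;
% an irreversible (real) process creates entropy, so for the system
% together with its environment
\Delta S_{\text{sys}} + \Delta S_{\text{env}} \;\geq\; 0 ,
% with equality only in the reversible limit.
```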

This is just as deeply muddled.Chjoaygame (talk) 08:45, 28 June 2014 (UTC)
I stated what is confusing and misleading about the use of "evolve"; the only answer given was that certain authors use it that way. I stated why it is confusing and misleading, and the same applies to those authors.. Well now it's your turn.. OK, tell us all.. exactly what is muddled, incorrect or missing from the above proposal? Gyroman (talk) 01:58, 1 July 2014 (UTC)
Thank you for this kind invitation. Sad to say, I have to decline it. Chjoaygame (talk) 06:07, 1 July 2014 (UTC)
For me, I think there needs to be a more considered explanation of what entropy actually is, before talking in detail about the second law of thermodynamics and so on. W. P. Uzer (talk) 07:32, 1 July 2014 (UTC)
It's interesting that the same comment (re entropy.. What is it?) appears on the talk page of the Introduction to Entropy article. You'll know most of this, but please bear with me.. This is what I think is the source of the confusion..
(1) There are two equations for entropy. The first, discovered by Rudolf Clausius in the mid 19th century, actually gives the change in entropy of a SYSTEM, calculated as the integral of heat flow divided by the absolute temperature at the boundary of a subsystem, to or from its surroundings, both contained within the whole SYSTEM. What is not so apparent is that differences in macroscopic properties such as temperature or pressure between subsystems within a larger system represent an ordered state, as opposed to a disordered state in which the energy is smoothly distributed (= equilibrium). The point is that only an ordered system, with differences in pressure or temperature between its subsystems, allows us to get any work from the system. As you know.. the amount of work can be calculated from a temperature-entropy diagram as an area ∑(∆T × ∆s) for a process, as for a steam engine.
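In symbols, the two claims of point (1) read roughly as follows (my sketch, not part of Gyroman's comment; δQ is heat crossing the subsystem boundary at absolute temperature T):

```latex
% Clausius: the *change* of entropy, defined through heat flow
% across the boundary along a reversible path.
\Delta S \;=\; \int \frac{\delta Q_{\mathrm{rev}}}{T}
% The work of a reversible cycle is the area enclosed on a
% temperature-entropy diagram, the continuum limit of the cell
% sum \sum (\Delta T \times \Delta s) mentioned above:
W \;=\; \oint T \, \mathrm{d}S .
```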
(2) The second equation, by Ludwig Boltzmann, discovered around the turn of the 20th century, calculates the absolute entropy of a SYSTEM, and this is the general or fundamental equation for entropy. It directly measures how disordered the system is. The equation S = k ln W should be as well known as E = mc^2.. this is a great failing in science education. The Clausius equation is easily derived from the Boltzmann equation by making just one simplifying assumption (ie the system is already at equilibrium, or W = Wmax); this derivation must never be done the other way round! The reason is that calculation of entropy change by heat transfer across a subsystem boundary is NOT a complete statement of the entropy of a system. This is clear from the Boltzmann equation: "W" is just a number, and "k" is the Boltzmann constant, which is required to give "S" some units. We MUST note that "order" includes logical states (not all of which can be accounted for by heat transfer), as well as physical states. Now "W" is the number of microstates (logical arrangements, or 'phase space' if every particle is identified and each is exchanged with every other) in a given macrostate (a physical arrangement, i.e. the set of particle properties available). Clearly "W" is incalculable for any real system.. just one litre of gas gives a logical phase space of microstates exceeding the size of the universe!
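A sketch of the derivation asserted in point (2), via the standard textbook route (k the Boltzmann constant, W the multiplicity, U the internal energy; the equilibrium assumption is the one simplifying step flagged above):

```latex
% Boltzmann: absolute entropy of a macrostate of multiplicity W.
S \;=\; k \ln W
% At equilibrium (W = W_max at fixed U), temperature enters via
\frac{1}{T} \;=\; \left( \frac{\partial S}{\partial U} \right)_{V} ,
% so a small reversible heat input dU = \delta Q at constant volume
% recovers the Clausius form:
\mathrm{d}S \;=\; \frac{\delta Q_{\mathrm{rev}}}{T} .
```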
(3) When the absolute entropy of a system and the term ORDER are inadequately defined or left out, you end up with a gaping hole in your definition of the second law. Just saying 'entropy increases' or 'never decreases' (which needs more definitions: 'real', 'ideal', 'reversible', 'irreversible') makes little sense if entropy has not been comprehensively defined. Entropy is not about heat or energy; it's all about ORDER. "W" is a probability term, and ordered means less probable. Energy dissipation is a RESULT of the decay in ORDER, or increase in PROBABILITY, and with this understanding a clearer definition of the second law results.. "Isolated systems tend to move to their most probable state".
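A minimal numerical illustration of point (3)'s claim that "ordered means less probable" (my sketch, with a toy model not taken from the comment: N independent particles, each either left or right of a partition):

```python
# Multiplicities of macrostates for N two-sided particles. The evenly
# spread ("disordered") macrostate dwarfs the ordered ones, so a
# uniformly sampled microstate is almost surely near n = N/2.
from math import comb, log

N = 100                       # particles; a real gas has ~10^23
for n in (0, 25, 50, 75, 100):
    W = comb(N, n)            # microstates in macrostate "n on the left"
    print(f"n_left={n:3d}  W={W:.3e}  ln W={log(W):6.2f}")
```

For N = 100 the even split already has about 10^29 microstates against exactly one for "all on the left"; for realistic N the imbalance is astronomically sharper.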
The emphasis on Rudolf Clausius to the exclusion of Ludwig Boltzmann is at the root of the problem. Gyroman (talk) 14:31, 1 July 2014 (UTC)
This is a good point. Perhaps the article would be clearer if it were re-organized into two distinct parts: (1) Entropy change, including Clausius and the Second Law, and (2) Absolute entropy, including Boltzmann, the Third Law, information theory, etc. Dirac66 (talk) 15:46, 1 July 2014 (UTC)
Agreed, that the difference between relative entropy as defined by Clausius and absolute entropy as fixed by the third law is important.
But I think it important also to distinguish explicitly between macroscopic entropy as defined by Clausius and the third law, and microscopic or statistical entropy as defined by probability and information theory. I think it unwise to try to elide this distinction, because such an elision hides from the student an important aspect of the logical structure of the situation. The distinction having been made, it is then in order to show the linkage.
I do not agree that the word "order" should be enshrined as the key to entropy. At least one expert tackles this oft used word in this context, Walter T. Grandy, in Entropy and the Time Evolution of Macroscopic Systems, Oxford University Press, Oxford UK, 2008, ISBN 978-0-19-954617-6, p. 56. It is ok to talk about order, but not to enshrine it.
I think that the here-relevant characteristic feature of thermodynamic equilibrium is its time invariance. It is true that Planck considers homogeneity to be characteristic. But I would say that such homogeneity should be time-invariant; I think Planck assumes this without saying so. A fluctuation from a non-equilibrium state could lead to instantaneous homogeneity. Repeated measurements on a state of thermodynamic equilibrium may show fluctuations, but they must be statistically time-invariant for the equilibrium state. Chjoaygame (talk) 18:07, 1 July 2014 (UTC)
So no comment from anyone else.. all 460+ watchers! You're all happy to have a "deeply muddled" degraded presentation..? I am still going through the references.. however I do not find any attempt to explain the intended meaning of "evolve" concerning isolated thermodynamic systems. Rather it seems to me it is being used as a tactical ploy to drag the second law into the controversy over the meaning of the word "evolution" (ie 'any change in gene frequency' as opposed to an 'average increase in complexity of genes'; and further still into the mire, we have the complete absence of the meaning of 'complexity' itself). Clearly there is a significant group of scientists fighting the claims of another group of scientists as to the real implications of the second law for evolution. This is the sort of thing which brings professional 'bias' into science. Please STOP there, it's NOT RELEVANT here.
What is relevant is that in the context of an encyclopedia there is no justification for using any word contrary to its dictionary definition, and such 'hidden' meanings will confuse and mislead. That is what needs addressing, and it is singularly unsatisfactory to claim certain scientists 'express' it that way.. no, it's bias! All this over a single word.. it is obviously of much higher importance to you editors than you are admitting in this discussion. Sorry, but that leads me to conclude you are more part of the problem than the solution. So I think we are heading for some sort of independent resolution, and I am left with no alternative but to change it and force the issue.
Done Gyroman (talk) 00:49, 23 July 2014 (UTC)
Dear Gyroman, sad to say you are mistaken. You tell us that you think that the use of the word 'evolve' "is being used as a tactical ploy to drag the second law into the controversy over the meaning of the word "evolution" (ie 'any change in gene frequency' as opposed to an 'average increase in complexity of genes' and further still into the mire we have the complete absence of the meaning of 'complexity' itself)." No, the use of the word 'evolve' here is reflective of common usage in this area of science. That's all.
Witness the title of a recent monograph on the subject: Entropy and the Time Evolution of Macroscopic Systems, Grandy, W.T, Jr (2008), Oxford University Press, Oxford UK, ISBN 978-0-19-954617-6. Other examples from a standard text on this very matter: "We take a metal rod prepared at time zero in such a way that its temperature varies linearly from one end to the other. We then let it evolve freely and measure the change of temperature as a function of time at a given point. It is a fact of experience that, if the experiment is repeated a large number of times under identical conditions, its result is reproducible." Also "Given a system, described at time t = 0 by an arbitrary ensemble distribution, it is possible to explain its subsequent time evolution by the exact laws of classical or quantum mechanics." Many more examples in this book.<Balescu, R. (1975), Equilibrium and Nonequilibrium Statistical Mechanics, John Wiley & Sons, New York, ISBN 0-471-04600-0, p. 371.> Editor Waleswatcher has above here supplied other examples.
While you are right in general to look to ordinary language as the first candidate for Wikipedia wording, in this particular case the local language is established. Moreover, the dictionary that you choose as criterion for ordinary language is only one of many. I take here the liberty of copying one of the eight meanings listed in the Oxford English Dictionary: "6. Of circumstances, conditions, or processes: To give rise to, produce by way of natural consequence."<Oxford English Dictionary, Oxford English Dictionary Second Edition on CD-ROM (v. 4.0.0.3) © Oxford University Press 2009>. Best to choose a good dictionary when you can.
You point to my comment that the article is deeply muddled. Better you study some thermodynamics texts to try to work out what is wrong with it than try to impose your choice of words based on one popularist author. The word 'decay' that you prefer also has its local meanings. They sometimes fit with the time course of natural processes, but not always. Greatly respect Atkins though we may, his book is not a reliable source for the present purpose. Judging from the one source you cite, you have not done nearly enough reading in this area to provide you with the background to identify a reliable source. Chjoaygame (talk) 02:16, 23 July 2014 (UTC)

The OED gives the Latin verb evolvere, to roll out or unroll, as the origin of the English verb "evolve", which was in use in the 17th century, long before anyone could conceive of species evolving. "Evolve" feels more neutral than "decay", which connotes a loss, such as of structural integrity (decay of a neighborhood, tooth, etc.) or of substance (decay of a particle, signal, etc.). I suppose one could argue that two systems that have drifted into a common thermal equilibrium have lost their heterogeneity, but to say that a previously segregated neighborhood that had drifted into homogeneity had "decayed" would seem like an uncalled-for value judgment.

Could common ground be found in the verbiage "drift into thermal equilibrium"?

As for explaining this drift tendency in the lead, any such explanation would open a can of worms unsuited to a lead and more appropriate for the body. A link should be ok though, say to the principle of maximum entropy or maximum entropy thermodynamics. Vaughan Pratt (talk) 16:44, 31 October 2014 (UTC)

How can entropy change if it is held constant?

According to the article, "in theory, the entropy of a system can be changed without changing its energy. That is done by keeping all variables constant, including temperature (isothermally) and entropy (adiabatically). That is easy to see, ..."

I'm finding it hard to see. Could someone kindly explain how entropy can be changed by keeping it constant? Vaughan Pratt (talk) 17:13, 31 October 2014 (UTC)

No worries, mate. This is easily explained by the law of the purple cow, that black is white. It seems I made a mistake above, when I limited my remark to the lead. I think the muddle is not limited to the lead. Chjoaygame (talk) 18:31, 31 October 2014 (UTC)
Progressing beyond the foregoing, I have deleted the section. As it stood, as pointed out by User:Vaughan Pratt, it verged on nonsense. If someone sees merit in the section, perhaps they will restore it in a form that makes good sense. There remains much in this article that also verges on nonsense. Chjoaygame (talk) 05:09, 1 November 2014 (UTC)

Yet more questionable material

Thanks, Chjoaygame. In the meantime the article seems to be rapidly filling up with yet more strange ideas. Examples:

"The early classical definition of the properties of the system assumed equilibrium."

How can the entropy of a system that is in equilibrium be anything but constant? This would make dS = dQ/T a vacuous equation since both sides would be zero.

"If heat is transferred out the sign would be reversed giving a decrease of entropy of the system."

Surely the second law of thermodynamics makes this impossible.

"The statistical mechanics description of the behavior of a system is necessary as the definition of the properties of a system using classical thermodynamics become an increasingly unreliable method of predicting the final state of a system that is subject to some process. "

What "increasing unreliability"? Increasing how? If the concern is with the temperature of the ground state of an atom (nanokelvins) or the derivation of Planck's law or the various distributions then say so. Otherwise this smacks of unsourced stream-of-consciousness writing.

I don't work on this article or I'd straighten this out myself. Hopefully one of the article's regular maintainers will do so. @Chjoaygame: or @PAR:, what do you think? Vaughan Pratt (talk) 08:26, 10 November 2014 (UTC)

Thank you for this ping. Like you, I don't work on this article, and am not a maintainer of it. I just loosely glance at it. Chjoaygame (talk) 11:20, 10 November 2014 (UTC)


"If heat is transferred out the sign would be reversed giving a decrease of entropy of the system."
'Surely the second law of thermodynamics makes this impossible.'
Actually no, it doesn't violate the second law. The sign convention is: entropy into the system is positive and entropy out is negative. The entropy of a system can be decreased provided it transfers at least as much entropy to another system. On the largest scale the entropy of the universe always increases. A system that converts heat into work takes in heat at a high temperature and converts some of that heat into work (zero entropy out as work), but it must dump the remaining heat to a low-temperature sink. As the entropy is dQ/T, the entropy out is greater than the entropy into the system (note the temperature in the denominator). I must admit that with this article there is a lot of questionable writing. I will keep reading and writing, but it will take years to get to the point that I am able to do the subject justice. Zedshort (talk) 01:29, 21 November 2014 (UTC)
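A small numeric sketch of the bookkeeping just described (illustrative numbers of my own choosing, not from the thread):

```python
# A heat engine takes Q_h in at T_h, delivers work W (work carries no
# entropy), and must reject Q_c = Q_h - W at the colder T_c.
T_h, T_c = 600.0, 300.0       # kelvin
Q_h = 1000.0                  # joules in at the hot boundary
W = 300.0                     # joules of work out (below the 500 J Carnot limit)
Q_c = Q_h - W                 # joules rejected to the cold reservoir

dS_in = Q_h / T_h             # entropy carried in
dS_out = Q_c / T_c            # entropy carried out (T in the denominator)
print(f"entropy in  = {dS_in:.3f} J/K")   # 1.667 J/K
print(f"entropy out = {dS_out:.3f} J/K")  # 2.333 J/K
print(f"created     = {dS_out - dS_in:.3f} J/K (>= 0, second law)")
```

The engine's own entropy is unchanged over a cycle; the surroundings gain more entropy at the cold boundary than they lose at the hot one.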

several articles with 'entropy' in their titles

There are quite a few Wikipedia articles that include the word entropy in their titles. We might even feel we are in tiger country. We are not the first to feel that. There is a rather large disambiguation article for the word.

The present article has a flag "This article is about entropy in thermodynamics." The first words of the article are "In thermodynamics, entropy ..." There are also articles entitled Entropy (classical thermodynamics) and Entropy (statistical thermodynamics). And there are others. It is relevant to observe here that part of the muddle of the present article arises from failure to adequately recognize and understand the distinction between thermodynamics and statistical mechanics, as indicated above by PAR.

As for sources, they should be properly reported, with due regard for their various points of view. I think the problem is more than 'which is the right point of view?'. Perhaps I can offer an initial stab at posing our problem: 'how should the several various points of view be identified and distributed into the several articles?'. Perhaps there is a better way to formulate the problem? Chjoaygame (talk) 21:58, 23 November 2014 (UTC)

It seems to me that this article should not claim to be about entropy "in thermodynamics". The article entitled simply "Entropy" (this one) should be about the concept generally. Ideally there should not need to be a separate article called "Introduction to entropy", since the first purpose of this article is to introduce entropy. W. P. Uzer (talk) 08:30, 24 November 2014 (UTC)
I agree with that, and it means this article should be substantially reworked. Entropy is a very general concept in that it is a fundamental idea in both physics and information theory, and the article should clearly reflect that. One approach would be to make this article much shorter and more general, which could eliminate the need for an "introduction to entropy" article, and put most of the specifics as applied to (say) information theory in a separate article. As for thermodynamics versus statistical mechanics, the stat mech approach is more fundamental and more modern, and should be treated as underlying and accounting for thermodynamic entropy. In other words, entropy is no longer a concept restricted to thermodynamics, and in fact the thermodynamic meaning should be regarded as secondary and more specific, not primary. Waleswatcher (talk) 15:56, 25 November 2014 (UTC)

I can not agree with the proposal to eliminate the "introduction to entropy" article, although that is a matter to discuss over on the talk page of that article. We have plenty of "introduction to .." articles, and if any topic needs one it is entropy. Entropy is needed by students in many disciplines. I particularly have had experience teaching it to chemists. They find it very difficult and the more the explanation moves over to information entropy or even statistical entropy, the more they find it difficult. The problem with many of the articles on entropy topics is that they can only be understood by the people who wrote them. --Bduke (Discussion) 19:57, 25 November 2014 (UTC)

I don't so much envisage "eliminating" the introduction, but rather have the introduction simply titled "Entropy", written in such a way that it doesn't require a deep background in any particular discipline to understand it. But if it turns out that an even simpler introduction is needed for people with virtually no math knowledge at all, say, then I don't necessarily completely object to its existence (though I'm not sure that the current "introduction" page does the job particularly well either). W. P. Uzer (talk) 11:21, 26 November 2014 (UTC)

"where T is the absolute temperature of the system the system dividing an incremental reversible transfer of heat into that system (dQ)"

I don't understand what this means, but I don't even know enough to know what (if anything) is wrong with it - Rumsfeld. — Preceding unsigned comment added by 82.35.30.227 (talk) 17:06, 2 December 2014 (UTC)

undid mistaken good faith edit

I undid some edits that changed 'chemical engineering' to 'non-equilibrium statistical mechanics'.

The article was talking about chemical engineering, not non-equilibrium statistical mechanics. Chjoaygame (talk) 14:59, 18 January 2015 (UTC)

definitions of entropy

I haven't been too active lately, but reading some of the recent additions, I really have to take issue with the paragraph beginning "There are two related definitions of entropy:...". There is essentially only one definition of thermodynamic entropy, the "classical" or "macroscopic" definition, which makes no mention of molecules, atoms, etc. The statistical mechanics "definition" of thermodynamic entropy is not a definition, it is an explanation of thermodynamic entropy. Its value lies in the fact that it is an excellent explanation. For example, you cannot define temperature in terms of some average energy per particle (per degree of freedom, actually) unless you measure the individual particle energies and calculate an average. I suppose this is possible in some experiments, but it is not generally the case. This statistical mechanical description of temperature thus serves only as an explanation of temperature, not a definition. The same general idea is true of any thermodynamic parameter, including entropy. Physical science is a science of measurement and "definitions" must be traceable back to raw measurements, not to some picture or model in your mind, no matter how repeatedly and accurately that picture or model explains your measurements. PAR (talk) 15:08, 21 November 2014 (UTC)
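For concreteness, the statistical description of temperature that PAR refers to is, in a hedged sketch (standard equipartition notation, k the Boltzmann constant):

```latex
% Equipartition: mean energy per quadratic degree of freedom at
% temperature T,
\langle E \rangle \;=\; \tfrac{1}{2}\, k T \quad \text{per degree of freedom},
% e.g. (3/2) kT per monatomic ideal-gas particle. The right-hand side
% is fixed by thermometry; the left-hand side would require measuring
% individual particle energies, hence "explanation, not definition".
```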

I would like to express my agreement with the just-above comment of PAR.Chjoaygame (talk) 20:24, 21 November 2014 (UTC)
As a non-scientist, I don't find that argument at all convincing (surely you need a picture or model of some sort to persuade you that "measurement" is happening at all? and anyway, why on earth not base a definition on a well-accepted model?), but what matters is not so much what we say here, but what respected scientists in the field say - do they regard the statistical description as a definition, or just an explanation? And what about other kinds of entropy than the thermodynamic kind? For me, this article is all still rather confused, and the lead section is possibly one of the worst of all - it's all over the place, without ever explaining what the first sentence is supposed to mean. W. P. Uzer (talk) 20:42, 21 November 2014 (UTC)
My reading of respected scientists supports the view posted by PAR. Apart from this, I can agree with W. P. Uzer that the article is confused. I think the comment of PAR offers a constructive contribution to sorting out the confusion. Chjoaygame (talk) 21:13, 21 November 2014 (UTC)
I don't like to quibble with those who likely know more of the matter than I do, but "statistical definition of entropy" gives no shortage of hits on Google Scholar, including several similar to this one from Physical Review Letters: "...just as the statistical definition of entropy is widely considered more general and fundamental than the original thermodynamic definition..." W. P. Uzer (talk) 21:52, 21 November 2014 (UTC)
I see your comment as reasonable, not as a quibble. Still I prefer the view of PAR.
I think the key here is that this article announces that it is about entropy in thermodynamics.
To answer your argument about a model. Yes, some sort of model is more or less a presupposition of a measurement. But PAR is pointing out that for measuring entropy, the model is most often, or by default, the macroscopic thermodynamic one, which is very well accepted. Indeed it would be hard to think of another way of actually making a measurement. The statistical mechanical model does not readily lend itself to direct measurement. The macroscopic thermodynamic way is really based on a model, with such concepts as thermodynamic equilibrium imagined as experimentally realized.
Part of the problem is that the word entropy was more or less stolen from its original thermodynamic definition by Shannon, for his statistical purpose, at the suggestion of von Neumann, partly on the ground that people often feel baffled by it, as you may read at History of entropy#Information theory. Not an auspicious move, I would say. Another part of the problem is that many people feel that explanation is more "fundamental" than is statement of fact. And that such greater "fundamentality" is better. You have quoted such a person. Part of the problem with the statistical definitions is that they are several and divers.
At the risk of provoking an indecorous range of responses, may I suggest for the present purpose, mainly for this article, that we might try to work out some agreement along the following lines? The primary or default meaning of the word entropy is the macroscopic thermodynamic one indicated by PAR. When the context does not more or less enforce other meanings, the plain word entropy is by default to be taken to have this primary meaning. Again, when the context does not more or less enforce other meanings, they are preferably primarily to be distinguished and explicitly indicated by up-front use of qualifying or modifying words; for example we would say "Shannon entropy" when that is what we mean. It would be reasonable, in this article, that a section about Shannon entropy could announce that within such section, for the sake of brevity, the plain word entropy is used to mean Shannon entropy unless otherwise explicitly indicated; yes, just within such section. That is not to try to say what other articles should do.
Perhaps that is putting my neck out. But at least it is more or less rational or systematically executable, considering that there is currently confusion in the absence of some such agreement. Chjoaygame (talk) 05:42, 22 November 2014 (UTC)
To Chjoaygame - I think that is a good suggestion, announce at the beginning of the article that the subject is thermodynamic entropy, and note that the statistical explanation constitutes a definition of a more general entropy, e.g. Shannon entropy.
To W. P. Uzer - Sure, you can have multiple definitions of entropy, just as you can have multiple definitions of a straight line in geometry. As long as the definitions have been proven by somebody somewhere to be consistent, and you accept that proof on faith, that is, as long as you don't want to worry yourself about the axiomatic, logical development of geometry, you can "do" geometry without running into problems. Same with thermodynamics: if you don't want to bother with the axiomatic, logical structure of thermodynamics, one, two, five definitions of entropy are fine, as long as somebody, somewhere has shown them to be equivalent, and you accept their work on faith. You can "do" thermodynamics just fine. I am interested in understanding thermodynamics, rather than just "doing" it, and if you come at it from that angle, multiple definitions don't cut it, just as multiple definitions of a straight line don't cut it in axiomatic geometry. However, it seems to me that there is no need to mislead someone who wishes to study geometry by declaring that there are multiple definitions of a straight line. It doesn't hurt to simply declare that there is one definition, from which the others can be shown to follow, and then you won't have to unlearn what you have learned if you ever wish to bother yourself with an axiomatic development. The situation is similar, but much more pronounced, in quantum mechanics. If you ever want to understand quantum mechanics rather than just "do" it, the distinction between measurement and the picture you have in your mind to explain those measurements must be kept brilliantly clear. Just because Phys Rev and Google make statements about two definitions of entropy does not constitute a mathematical proof that this is the way to go. When Phys Rev states that the stat mech definition of entropy is superior, I take that to really mean that it suggests definitions which can be measurement-based that are outside the scope of classical (pseudo-equilibrium) thermodynamics. If you wish to accept pronouncements by Phys Rev as ultimate truth, you will probably be ok as long as you don't become interested in the logical structure of thermodynamics. If you don't wish to accept their statements, but instead try to understand what is going on, then the mere presence of multiple definitions, some of which are unmeasurable, has to bother you. Again, it doesn't hurt to state that there is the phenomenological definition of thermodynamic entropy, and the statistical mechanical explanation, an explanation which opens an understanding of situations beyond classical thermodynamics, an understanding which can be used to develop an axiomatic, measurement-based treatment of these situations.
I had both the fortunate and unfortunate experience of being taught thermodynamics and statistical mechanics by Dr. Ta-You Wu. I say unfortunate because he was very strict about the theoretical foundations, which I could not grasp immediately, and it took me years to finally appreciate the truth of what he tried to beat into my relatively thick head. PAR (talk) 14:14, 22 November 2014 (UTC)
Maybe it would be more profitable if we simply got to work on the article than to continue this at the moment rather abstract discussion. I don't think you can object, though, if someone writes that there are two or several definitions, when reliable sources show us that this is indeed the case. A Wikipedia article is a bit different than a lecture course - we have to try to reflect what the generality of the sources have to say on the subject, rather than just pick the one approach that we personally find most illuminating. W. P. Uzer (talk) 14:18, 23 November 2014 (UTC)
I completely agree with your last sentence, but it's not that cut and dried. IMO, an editor has to have a certain amount of understanding of the subject, and that understanding has to be shared by other editors in order for a consensus to be reached. An editor who has little understanding and justifies edits on the basis of "I found a million hits on Google, I don't understand it, I can't defend it, but that doesn't matter" is not doing their job. Yes, I have an approach that I find personally illuminating, but what I am asking is for you to consider that approach and see if you don't also find it illuminating, so we can create an informed consensus. If there is a problem with my personally illuminating approach, then let's discuss it, but quoting sources doesn't illuminate anything. At the end of the day, we not only need to be backed up by sources, but we need to have, as much as possible, an informed consensus. PAR (talk) 15:36, 23 November 2014 (UTC)
  • I would like to suggest that entropy could be defined as "a state function which increases autonomously in an isolated system if and only if there remain unbalanced energy potentials" because this embraces the close association with the Second Law of Thermodynamics and the state of maximum entropy (namely thermodynamic equilibrium) in which all unbalanced energy potentials have dissipated. Noting that such energy potentials include gravitational potential energy, we observe thermodynamic equilibrium when there are isentropic conditions in, for example, a planetary troposphere. Such conditions, in order to have no unbalanced energy potentials, would have a homogeneous sum of mean molecular kinetic energy and gravitational potential energy, and thus have opposite gradients in potential energy and kinetic energy, the latter being represented by temperature. 121.216.37.160 (talk) 12:17, 13 March 2015 (UTC)
There are problems with the immediately foregoing new comment. First, the comment is made from a dynamically assigned IP address, and is not signed. It obviously comes from User:Douglas Cotton. He is a recidivist in posting more or less the immediately above comment in various places in Wikipedia. Second, the immediately above comment is hardly relevant to its location here. Thirdly, the comment is wrong in physics. The comment wrongly conflates "energy potentials" with thermodynamic potentials. The comment fails to recognise the import of the first law of thermodynamics, which establishes the meaning of internal energy. The point of internal energy is that it separates the overall potential energy of the system that it has by virtue of its location in an externally imposed force field, such as gravity, from the kinetic energy of the system as a whole moving in an external environment, and from the remaining energy, that is in a sense intrinsic to the system. Thermodynamic potentials relate specifically to the latter as different from the other two energies of the system. It is wrong in physics to muddle the thermodynamic potentials with the potential energy of the system as a whole by virtue of its location in an externally imposed force field, as the above comment proposes to do. Chjoaygame (talk) 13:46, 13 March 2015 (UTC)
Firstly "unbalanced energy potentials" are what drive the process of maximizing entropy that is described in the Second Law of Thermodynamics. If you read thermodynamic potentials you will see that they specifically ignore changes in gravitational potential energy because they assume that such are insignificant in an "engine" which they are discussing. Of course they are not insignificant in a planet's troposphere. Then you incorrectly assume that I am talking about the macro potential energy associated with a whole object. I am not. I am talking about the interchange of molecular gravitational potential energy and molecular kinetic energy during the free path motion of each and every molecule between collisions. The Second Law is all about the fact that there is an autonomous propensity to increase entropy but only whilst there are still unbalanced energy potentials. When thermodynamic equilibrium is attained that is the maximum entropy state that we have. At the molecular level it means that any pair of molecules about to collide have the same kinetic energy on average at that height. Otherwise there would still be heat transfers by conduction or diffusion. But because molecules gain KE in downward motion and lose it in upward motion, we can deduce that PE loss = -KE gain which amounts to m*g*dH = m*Cp*dT and so we get the temperature gradient at thermodynamic equilibrium as dT/dH = -g/Cp. This temperature gradient is seen in all planetary tropospheres and even in Earth's outer crust for example. It is overridden where there is wind or excessive heat absorption (as in the stratosphere) or excessive inter-molecular radiation as in water. But the inter-molecular radiation between water vapor molecules does have a temperature leveling effect that reduces the magnitude of the gradient by up to about a third, as also happens on Venus (due to CO2) and by about 5% to 10% in the nominal tropopshere of Uranus. The temperature gradient also evolves due to centrifugal force in a centrifuge and vortex tubes which rely on the same principle. There is overwhelming evidence of it everywhere, and over 850 experiments with sealed insulated cylinders have confirmed that isothermal conditions are not the state of thermodynamic equilibrium, simply because molecules have more PE at the top and yet the same KE. If you don't consider such PE in entropy calculations then you have no explanation as to why anything falls anywhere under gravity, thus increasing entropy. — Preceding unsigned comment added by 121.216.37.160 (talk) 00:19, 14 March 2015 (UTC)
My dear Douglas Cotton, I am sorry to find that this matter still troubles you. Your mistake is clearly revealed in the sentence that expresses your real thinking, but that you deleted when it became apparent to you that it gives your game away: "Firstly “unbalanced energy potentials” are the same as thermodynamic potentials." That you believe that is the flaw in your thinking. You have presented this flaw many times here and elsewhere. I do not wish to discuss it further. Chjoaygame (talk) 01:07, 14 March 2015 (UTC)
Internal energy U is a thermodynamic potential and when internal energy in one region of an isolated system exceeds that in another region we speak of there being unbalanced energy potentials, and these are what drive increases in entropy, but only until all such energy potentials are dissipated. That then is the state of thermodynamic equilibrium, it having maximum entropy. So, if that's the best argument you can put forward whilst leaving all the other matters without refutation, I rest my case. You have no evidence that water vapor raises the surface temperature, let alone by about 15 degrees for each 1%, making rain forests 45 degrees hotter than deserts. You have no explanation as to how the required thermal energy gets to the base of the Uranus troposphere to make it hotter than Earth there and you have no counter arguments that in any way refute the fact that the state of thermodynamic equilibrium has a stable density gradient and temperature gradient. The definition I suggested for entropy is spot on, as is the (refined) first statement in my second comment above. This is not the place to discuss the climate debate, although the state of maximum entropy is very relevant. You can discuss such on Roy Spencer's latest monthly temperature data thread, preferably after studying the new website linked therein and endorsed by our group of persons suitably qualified in physics.— Preceding unsigned comment added by 121.216.37.160 (talk) 06:08, 14 March 2015‎
For more information about the relatively recent developments in the understanding of entropy (and thus the thermodynamics of planetary tropospheres) I refer readers to this comment and the linked sites: http://www.drroyspencer.com/2015/03/uah-global-temperature-update-for-feb-2015-0-30-deg-c/#comment-185762 124.184.250.100 (talk) 22:26, 17 March 2015 (UTC)
  • Wikipedia is not competing with Google. Wikipedia seeks primarily to be reliable and intelligible. Google doesn't have the focus on reliability and intelligibility that Wikipedia needs. Editor PAR is concerned to achieve reliability and intelligibility, amongst the myriad sources. Chjoaygame (talk) 22:15, 23 November 2014 (UTC)

evolve

The Oxford English Dictionary lists eight main meanings for the word 'evolve'. It is hardly appropriate that I should try to copy or summarize them here, partly for copyright reasons, considering the length of the Dictionary's entry for the word.

The article currently uses the word as follows

"In thermodynamics, entropy (usual symbol S) is a measure of the number of specific ways in which a thermodynamic system may be arranged, commonly understood as a measure of disorder. According to the second law of thermodynamics the entropy of an isolated system never decreases; such a system will spontaneously evolve[further explanation needed] toward thermodynamic equilibrium, the configuration with maximum entropy."

The new tag asking for elucidation is a sequel to other related actions, such as for example, here, here, here, here, here, and here.

There has been discussion of this question on this page before now, for example here. Chjoaygame (talk) 03:01, 8 April 2015 (UTC)

tag removed

I removed the tag that was posted here. The tag appears to have been posted because its author felt that the then-standing word 'evolve' was wrong for the context. That word has now been replaced by the word 'proceed'. In removing the tag I do not intend to support the view that 'evolve' was wrong for the context, but merely to respond to the removal of that word. Chjoaygame (talk) 21:23, 24 April 2015 (UTC)

Sometimes we have to compromise. I personally think that either evolve or proceed is satisfactory, but proceed seems to be more acceptable to some editors so perhaps it is better. Dirac66 (talk) 21:42, 24 April 2015 (UTC)

The wikilink in the lead for maximum entropy links to Maximum Entropy Thermodynamics, which appears to be a page about a branch of thermodynamics. I feel like this is misleading... The linked article is not about the concept itself. There seems to be no page on the concept specifically. I recommend the link is removed altogether.

In context: "According to the second law of thermodynamics the entropy of an isolated system never decreases; such a system will spontaneously proceed towards thermodynamic equilibrium, the configuration with maximum entropy." A L T E R C A R I 02:22, 20 June 2015 (UTC)

entropy in economics doesn't belong in this article

I feel the new section headed Entropy#Approaches to understanding entropy#Economics should be somewhere else, not in this article. This article announces that it is about entropy in thermodynamics. I think the economics usage of the word 'entropy' has little to contribute to the understanding of entropy in thermodynamics. For the sake of politesse, I am not right now deleting the new section, but I strongly suggest to the nameless IP editor who posted it that he take it somewhere else where it will fit more naturally, perhaps in an article of its own headed Entropy (economics). Chjoaygame (talk) 21:37, 4 August 2015 (UTC)

Broadening article to cover entropy outside classical thermodynamics

The existing Entropy article focused on the thermodynamic meaning of entropy, referring the reader to the disambiguation page for other meanings. To broaden the Entropy article's scope, I have listed the most important of these other scientific meanings (with a brief explanation of their relation, and links to articles on them), at the beginning, noting that the remainder of the entropy article is devoted mainly to the thermodynamic meaning. Yesterday I put essentially this same material on the entropy disambiguation page, but editor Bkonrad removed it on the grounds that it deviated too far from the guidelines for a disambiguation page. The disambiguation page had seemed a logical interim solution, especially given the considerable overlap among several of the existing entropy articles, but I understand the advantages of keeping disambiguation pages brief and narrowly focused. I will now attempt a perhaps more logical solution of putting this explanation of the several kinds of entropy, and the relations among them, at the beginning of the main Entropy article.CharlesHBennett (talk) 20:05, 25 January 2016 (UTC)

Charles, it's an honour to have you editing this page.
But the top of an already pretty tricky article isn't really the best place for a list like this -- it really is better to have it on the separate disambiguation page.
The aim for the top 'lead' section of a page like this is to try to give somebody a summary of the whole idea in a nutshell, to give somebody just an instant answer, and then they can see whether they want to read on or not.
So I have done my best to try to restore a fairly comprehensive list on the disambiguation page. I hope it will stick. It was discussed pretty heavily in the past (on the disambiguation talk page), and in the past it was thought to make sense -- even if it made it somewhat out of the usual run of disambiguation pages.
There obviously is a case to try to overview all uses of entropy on one page, as the Scholarpedia article does. (That seems kind of ironic to be saying to you!!) But in the past the prevailing view has been that it probably makes sense to do a separate article on Thermodynamic Entropy (this one), and on Entropy in Information Theory, and then an article looking at the links between the two. Jheald (talk) 21:59, 25 January 2016 (UTC)
The latter (Entropy in thermodynamics and information theory) isn't as strong as it should be -- in fact none of the three probably are. It's not the easiest of subjects to write about (or even to agree about), and there are probably some important lines of thought missing, or important questions that should have even more articles discussing them. But as a basic structure, it doesn't seem too bad. Jheald (talk) 21:59, 25 January 2016 (UTC)
Perhaps I may add here that I think it is very naughty indeed that this article starts "In thermodynamics, entropy (usual symbol S) is a measure of the number of specific realizations or microstates that may realize a thermodynamic system in a defined state specified by macroscopic variables.". That is about statistical thermodynamics, not thermodynamics simple. First we use macroscopic methods to find the entropy. Then we use that result to guide us on how to count the microstates. Chjoaygame (talk) 04:09, 26 January 2016 (UTC) Chjoaygame (talk) 07:46, 26 January 2016 (UTC)
Chjoaygame - I totally agree. PAR (talk) 07:53, 26 January 2016 (UTC)
Something like the gloss that Dr Bennett gave on the disambiguation page for classical thermodynamic entropy,
a state function originally introduced by Rudolph Clausius to explain why part of a thermodynamic system's total energy is unavailable to do useful work
might be a better basis for the first line. Jheald (talk) 11:09, 26 January 2016 (UTC)
It's a pleasure to thank the two foregoing editors for their comments. I think it fair to observe that Rankine used the quantity very early (?1851, I am not clued up on the exact dates and priorities), but Rankine called it 'the thermodynamic function'. Clausius invented the word 'entropy' for it I think ?some decade later. Nowadays it is also used to account for the direction (sense) of changes from one thermodynamic equilibrium to another; I think this is important. Also I think the specifically relevant energy is the internal energy, not the total energy.Chjoaygame (talk) 11:47, 26 January 2016 (UTC)
Perhaps a further comment or two. In quantum mechanics, one can work with the Heisenberg matrix, which concerns transitions, or with the Schrödinger wave function, which concerns states. So I think it is in thermodynamics: one can work in the way of the originators, in terms of cycles, consisting of transfers, or one can work in the way of Gibbs, thinking of state variables and functions. So there are two ways of thinking of entropy: as associated with a process or with a state. There exist fictive processes in which entropy is conserved. There are also natural processes, which differ from those fictive (reversible) processes, in that in natural processes, entropy is created from nothing.
In the opinion of Denbigh,
Perhaps one of the most useful verbalisms is 'spread', as used by Guggenheim; an increase of entropy corresponds to a 'spreading' of the system over a larger number of possible quantum states. This interpretation is often more appropriate than the one in terms of mixing when the irreversible process in question is concerned less with configurational factors than with a change into a state where there is a greater density of energy levels.
Guggenheim, Research, 2 (1949), 450.[1]
A relevant quote from Guggenheim[2] is
To the question what in one word does entropy really mean, the author would have no hesitation in replying 'Accessibility' or 'Spread'. When this picture of entropy is adopted, all mystery concerning the increasing property of entropy vanishes. The question whether, how and to what extent the entropy of a system can decrease finds an immediate answer.
Another relevant quote from Guggenheim is
It is usually when a system is tampered with that changes take place.
  1. ^ Denbigh, K. (1954/1981). The Principles of Chemical Equilibrium. With Applications in Chemistry and Chemical Engineering, fourth edition, Cambridge University Press, Cambridge UK, ISBN 0-521-23682-7, p. 56.
  2. ^ Guggenheim, E.A. (1949). 'Statistical basis of thermodynamics', Research, 2(10): 450–455.
I would comment that tampering usually means making some of the walls less obstructive, thus allowing the components of the respective constituent subsystems easier access to the other constituent subsystems. That allows the spread. Chjoaygame (talk) 17:29, 27 January 2016 (UTC)

Disambiguation page

PLEASE IGNORE THIS SECTION - I didn't save it - Wikipedia did an auto save whilst I was working on it. The section as I wanted it is the one below. Aarghdvaark (talk) 00:13, 28 January 2016 (UTC)

Hi - I reverted back to include the list of other meanings in the article rather than the alternative meanings being in a disambiguation page. A disambiguation page is not the place for an explanatory list: you go to a disambiguation list if you don't know what you are looking for exactly, not to get more information. These explanations belong in this section as the lede says "This article is about entropy in physics and mathematics" and many of the explanations in question are exactly that. But I'm not that happy with the list and the way it is presented in this article either! Possibly this whole article should be put on a new page "Thermodynamic entropy" and this page "Entropy" becomes a discussion of the general concept of entropy with brief explanations of the various types and wiki links to the relevant pages? So my proposal is this page's content moves to "Thermodynamic entropy" and new content based on the list is put under this page's title "Entropy". What do you think? Aarghdvaark (talk) 23:40, 27 January 2016 (UTC)

I really do think the disambiguation page does this better -- it precisely is telling you where to go, to find the article discussing what you're looking for.
I am happy to discuss the different structures that are possible for our articles, but for the moment this is a topic that gets a lot of page views, so unless/until we definitely have something better that's ready to run, I think we should leave the article in the most usable state possible given the current set of articles.
Which is why I am reverting. As I wrote in the section above, throwing a list like this straight at the unprepared reader is not helpful. There may be better ways to organise our articles, which we may be able to get to; but a list like this, presented cold at the top of the article, is not. There are good reasons why WP:LEDE calls for as good an overview as can be given as soon as possible of the topic, in prose. Jheald (talk) 00:04, 28 January 2016 (UTC)
THIS SECTION IS CLOSED - PLEASE USE THE SECTION BELOW. The problem was I was trying to change the title of the new section as I was writing it. Wiki did change the title, but unbeknown to me it secretly saved it! Aarghdvaark (talk)

poor quality

I have just seen a disambiguation link edited into this article. I think it is misleading or wrong, but the quality of this article is so generally poor that I don't want to get involved in trying to fix it. Chjoaygame (talk) 19:38, 18 February 2016 (UTC)

Mathematical definitions of Negentropy

I think that the article should consider defining negentropy mathematically. It is considered that negentropy would literally be the negative of the entropy change associated with a process, but having an explicit expression for it would be good. ASavantDude (talk) 21:00, 9 April 2016 (UTC)
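For what it is worth, two conventions are in circulation (a hedged sketch, not currently sourced in the article): in the thermodynamic usage negentropy is simply entropy with reversed sign, while the information-theoretic usage defines it relative to a Gaussian of the same covariance:

```latex
% Thermodynamic convention (after Brillouin): negentropy J as
% negative entropy,
J \;=\; -\,S , \qquad \Delta J \;=\; -\,\Delta S .
% Information-theoretic convention: for a random vector x with
% differential entropy H(x), and x_G Gaussian with the same covariance,
J(x) \;=\; H(x_G) - H(x) \;\geq\; 0 .
```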

Disambiguation page is wrong place for explanations and this article had no lede

Hi - I reverted back to include the list of other meanings in the article, rather than the alternative meanings being in a disambiguation page. A disambiguation page is not the place for an explanatory list: you go to a disambiguation list if you don't know what you are looking for exactly, not to get more information. And these explanations belong in this article as the introduction specifically says "This article is about entropy in physics and mathematics" and many of the explanations in question are exactly that. But I'm not that happy with the list and the way it is presented in this article either!

The article, before I reverted it,[1] dived straight in and said "In thermodynamics, entropy (usual symbol S) is a measure of the number of specific realizations or microstates that may realize a thermodynamic system in a defined state specified by macroscopic variables." So this article would appear to be specifically about Entropy (classical thermodynamics), which has its own page - which would lead to the suggestion that the two pages Entropy (classical thermodynamics) and this page Entropy be merged, although currently Entropy (classical thermodynamics) already has a merger discussion going on with Entropy (statistical thermodynamics)!

The problem is that the old article[2] has no lede - diving straight in with a gobbledygook definition is not how an encyclopedia should do things (apologies to the editor who wrote the article's first sentence). This list (I'm not the author) is at least the start of an introduction we can work on. What do you think? Aarghdvaark (talk) 00:10, 28 January 2016 (UTC)

By WP:BRD please don't edit war. If somebody objects to your change, then discuss it here, and see what the consensus of other editors is, don't keep on trying to push your new version on the article page. Jheald (talk) 00:16, 28 January 2016 (UTC)
I'm not edit warring - but I do find it ironic how often people warn or complain about other people doing what they do! You've reverted this page twice now :) Aarghdvaark (talk) 00:26, 28 January 2016 (UTC)
(ec) As to the proper structure for our various articles on entropy, as I wrote above, Scholarpedia gives an example of what an article that gave a WP:SUMMARY style overview of all the different possible meanings of entropy might look like.
However, in the past, chemists in particular have objected that thermodynamic entropy is (they believe) the original, fundamental meaning of the word; also the one that students are most likely to be introduced to first; and, they believe, the only meaning that a large proportion of readers are coming to read about. So that's why entropy in the sense of physical systems has historically had "principal topic" status for the term.
I'm not saying that's right; but before changing it, one should have a RfC, and see what editors from the various relevant wikiprojects think -- Maths, Statistics, Physics, Chemistry, Engineering, Computer Science... -- rather than jump in and make a radical change.
Even with the scope that this article aims to cover entropy of physical systems (rather than e.g. mathematical entropy of abstract probability distributions), that's still a lot to present people with a coherent overview of -- the classical thermodynamic picture, the microscopic statistical mechanical picture, how that stretches to cover quantum systems; as well as how that understanding relates to closely related topics such as available work, the second law, etc. So that's what the article tries to do, reflecting that for many people this is the primary meaning of the word. Jheald (talk) 00:34, 28 January 2016 (UTC)
The article as is dives straight in with "In thermodynamics, entropy (usual symbol S) is a measure of the number of specific realizations or microstates that may realize a thermodynamic system in a defined state specified by macroscopic variables". I think many readers would simply go away at that stage even if they were looking for Entropy (classical thermodynamics)! Anyone who can make sense of the first sentence wouldn't need the rest of the article!
And amazingly the disambiguation page doesn't link back to this page (except at the top for Entropy, which the reader probably wouldn't follow, since if they are on that page they will be looking for the relevant page for them). So, assuming most readers will glaze over when they land on this page and read the first sentence, and it is not properly linked to in the disambiguation page, who is this page actually for? Can you put a proper link to this page on the disambiguation page please - to explain why a reader would want to look at it and to differentiate it from the other entropy entries, especially the thermodynamic entries? Aarghdvaark (talk) 01:08, 28 January 2016 (UTC)
I think this is a very knotty problem. Loosely and broadly speaking, I favor the approach of Editor Jheald here.Chjoaygame (talk) 05:25, 28 January 2016 (UTC)
@Aarghdvaark The thinking, I believe, is that most people won't get to the disambiguation page unless they have come here first; that's why, in general on dab pages, the "primary meaning" only gets mentioned once at the top. (See MOS:DABPRIMARY).
As for the first sentence, see discussion two sections up, where Chjoaygame has some thoughts on this. Jheald (talk) 10:09, 28 January 2016 (UTC)

--- THIS discussion is a good example of why editing Wikipedia is a mess and why I do not do it. You have Bennett, who is one of the top 2-3 experts in the world, trying to clarify things, and he gets deleted by a group of Yo-yos based on dumb technicalities. I do not know Bennett in person but have seen his work outside Wikipedia. As long as this type of behavior exists on Wikipedia, sane scientists are wasting their time doing things here. What a mess. — Preceding unsigned comment added by An UtterMess (talkcontribs) 21:04, 15 April 2016 (UTC)

New section on Economics

On 20 April 2016, user:174.3.155.181 reverted the section added by me on Economics. In the edit summary, the user offers two reasons for the deletion:

  • Large new sections in the article aren't welcome without discussion.
  • Entropy cannot be measured nominally in economics.

I think both of these reasons have to be put aside. Consider the following:

  • I disagree with the negative assertion that 'large new sections aren't welcome without discussion', as this is not in agreement with the WP BOLD, revert, discuss cycle. Nobody has to ask anybody for permission in advance before they add material anywhere. I think this assertion made by user:174.3.155.181 is unreasonable and somewhat aggressive.
  • It is true that entropy cannot be measured nominally in economics (that is, cardinally), only ordinally: whereas it is not possible to measure the exact magnitude of an economy's entropy at a point in time, it is possible to state that entropy steadily increases with time. Georgescu-Roegen himself was well aware of this issue.[1]: 353  However, this inadequacy is a poor excuse for deleting the section, I say. WP does not settle or disregard disputed issues; we document issues, whether they are disputed or not. Some of the mistakes made by Georgescu-Roegen are already documented here, although not the issue of the lack of a cardinal measure (this could be included, provided that proper sources are available). The fact remains that due to Georgescu-Roegen's work, the laws of thermodynamics now form an integral part of the ecological economics school, and a full chapter on the economics of Georgescu-Roegen has been approvingly included in an elementary physics textbook on the historical development of thermodynamics,[2]: 95–112  as already stated in the section text itself. Georgescu-Roegen's work is currently discussed and criticised by physicists and others, but this is merely another good reason for documenting everything, I say. If WP documented only undisputed issues, it would be a small encyclopedia indeed...
I would like to expand on this last point. There is no cardinal measure of entropy in economics, true; but there is no cardinal measure of entropy in cosmology, either! Hence, in the section before the one on Economics, namely Cosmology, one statement reads that 'The role of entropy in cosmology remains a controversial subject since the time of Ludwig Boltzmann'. In addition, the article on the heat death of the universe, linked to as the main article from the top of the Cosmology section, contains a section on Current status. In that section, the reader learns that Planck, Grandy and several other scholars in the field are (or have been) critical of the concept of 'The entropy of the universe'. Now, if user:174.3.155.181 had his (her) rash and destructive way, all of this text would have to be deleted at once due to the imprecise definition of the article subject matter. Fortunately, the text is still around for everybody to read.

References

  1. ^ Georgescu-Roegen, Nicholas (1975). "Energy and Economic Myths" (PDF). Southern Economic Journal. 41 (3). Tennessee: Southern Economic Association: 347. doi:10.2307/1056148.
  2. ^ Schmitz, John E.J. (2007). The Second Law of Life: Energy, Technology, and the Future of Earth As We Know It (Link to the author's science blog, based on his textbook). Norwich: William Andrew Publishing. ISBN 0815515375.

Consequently, I have undone the reversion made by user:174.3.155.181, and added a template to indicate that the content is disputed and currently being discussed on this talk page. In order to establish consensus, please DO NOT delete the section once again before the subject matter has been discussed, and my points made above have been considered by user:174.3.155.181 and other editors. Third opinions are welcome indeed. Thank you. Gaeanautes (talk) 15:20, 25 April 2016 (UTC)

It would be good if ip user 174.3.155.181 gave himself a user name. It is a pain to talk to an IP address.
I agree with ip user 174.3.155.181 that the section should be deleted. I also agree with Editor Gaeanautes that the section headed Cosmology is faulty. I think it should be deleted. Two wrongs don't make a right. The section headed Cosmology starts with the sentence "Since a finite universe is an isolated system, the Second Law of Thermodynamics states that its total entropy is constantly increasing." That is a good example of nonsense arising from a quick and would-be clever statement of the second law. I haven't studied the new edit closely, but at a glance I see it as drivel. Elaborate and grandiose, but still drivel. I can see that Editor Gaeanautes likes the material he has posted, and is likely to put it back if I remove it now. Since Editor Gaeanautes asks for reasons, I will say that it is pretentious nonsense to claim that the laws of thermodynamics have a place in economics as such. If there is anything of value in Editor Gaeanautes' post, it belongs somewhere else, perhaps in some article on economics, or on the ecological economics school, but not in this article. The post does not illuminate the concept of entropy. To avoid edit war, I hope someone else will comment here.Chjoaygame (talk) 17:08, 25 April 2016 (UTC)

Thanks to Chjoaygame for his swift response to my post. I would like other users to join in anytime. We all want to avoid an edit war, as Chjoaygame carefully points out. While I completely agree that it is a pain to talk to an IP address — that is, talking to some subtle entity identified by the number 174.3.155.181 — I have the following three critical remarks on Chjoaygame's post:

  • I am rather baffled by the fact that users working on this article are so bent on dismissing and deleting new sections right away. As one explanatory supplement to WP's NPOV policy has it, "... there is usually no need to immediately delete text that can instead be rewritten as necessary over time..." Righto. Without even having studied my new edit closely — so he frankly admits — Chjoaygame flatly dismisses it as 'elaborate and grandiose drivel' and 'pretentious nonsense'. Arh, come on! Why this negative talk? I think this subject matter merits a closer inspection by other users before they take a stance and a proper consensus can be established. Hence, everybody should take a look at the WP article sections on the methodology of ecological economics and especially the controversies generated by Georgescu-Roegen's work, before they carefully read the three following online articles on entropy and economics:[1]: 21–28  [2] [3] In addition, users should also familiarise themselves with the sources referenced from the section text itself before taking a stance, if that is not too much to ask for...

References

  1. ^ Cleveland, Cutler J. (1999). "Biophysical Economics: From Physiocracy to Ecological Economics and Industrial Ecology". In Mayumi, Kozo; Gowdy, John M. (eds.). Bioeconomics and Sustainability: Essays in Honor of Nicholas Georgescu-Roegen. Cheltenham: Edward Elgar. ISBN 1858986672.
  2. ^ Kåberger, Tomas; Månsson, Bengt (2001). "Entropy and economic processes — physics perspectives" (PDF). Ecological Economics. 36. Amsterdam: Elsevier: 165–179. doi:10.1016/s0921-8009(00)00225-1.
  3. ^ Hammond, Geoffrey P.; Winnett, Adrian B. (2009). "The Influence of Thermodynamic Ideas on Ecological Economics: An Interdisciplinary Critique" (PDF). Sustainability. 1. Basel: MDPI: 1195–1225. doi:10.3390/su1041195.
  • Users like Chjoaygame who believe that "... it is pretentious nonsense to claim that the laws of thermodynamics have a place in economics as such" will have to consider the simple fact that some scholars positively do make this claim, while other scholars seriously discuss and criticise it. Need I repeat myself and state once again that WP does not settle or disregard disputed issues; we document them, disputed or not? Well, now I have repeated myself, just in case!
  • Chjoaygame asserts that if there is anything of value in the section on Economics added by me, it "... belongs somewhere else, perhaps in some article on economics..."; but the section on Economics is already linking to 'somewhere else' in WP, namely to Ecological economics and to Georgescu-Roegen. More to the point, both sections on Cosmology and Economics — currently tagged by me — are subsections of the section on Interdisciplinary applications of entropy, a section that begins by stating that "Although the concept of entropy was originally a thermodynamic construct, it has been adapted in other fields of study..." So, the very purpose of this section — including the subsections — is to present these other interdisciplinary fields of study, and provide the relevant linking as well. This is the usual WP practice of 'building the web' by using hypertext, thereby adding to "... the cohesion and utility of WP, allowing readers to deepen their understanding of a topic by conveniently accessing other articles", as it goes. In effect, the section on Economics added by me is placed right where it belongs, I say.

I too hope that other users will comment here. Thank you. Gaeanautes (talk) 09:19, 28 April 2016 (UTC)

Thanks, Editor Gaeanautes, for your reply. Perhaps I should add a little. I have good reason for my summary dismissal. There are scads and scads of sloppy drivel in would-be physics writings about entropy. 'Entropy' is a hand-waving cover-all for sloppy pseudo-thinking on many of the occasions when the term is uttered. Von Neumann suggested that Shannon call his quantity of 'information' by the term 'entropy' because "it will give you the advantage in any argument because no one will know what you are talking about". It would be possible to swell Wikipedia with reams of chatter about whether von Neumann was right or whether Shannon was right to take his advice. And to argue on 'policy' grounds that this was valuable. At a glance, one can know that entropy is not a natural for economics. Yes, one can say 'Oh, that's prejudice in the extreme. How bigoted!' And so on. I have now glanced at Hammond & Winnett. It is drivel. Fancy, high-falutin', elaborate, pretentious drivel. Yes, it looks like rational thought, and one could argue that it should be taken seriously, but one would be building castles in the air. I don't quite know how to get this message across, but perhaps I can suggest that entropy proper describes bodies that are homogeneous and in permanent states of zero flux, while economics is about networks of non-stationary fluxes. It is a fond dream of many physicists that entropy can describe non-equilibrium systems, but sad to say, the real intellectual work for that dream has hardly begun. To think that economics can jump the gun on this is preposterous. I think what I say here will seem bigoted, ..., ..., whatever, whatever, until the reader comes to grips with the nitty-gritty of thermodynamics. I recall it was V.I. Arnol'd who wrote that 'every mathematician knows that it is impossible to understand an elementary course in thermodynamics'.
It is probably a good idea that I should now translate the foregoing paragraph into Wikiese. Where I used pejorative epithets such as 'drivel', good Wikiese would say 'material that is based on unreliable sources and is not notable'. It is possible to write endless reams based on muddled extrapolations from the second law of thermodynamics, but the reams are not notable, and the writers are not reliable. It is not the task of Wikipedia to document the vagaries of such adventures into waffle, because the waffle itself is not notable.
Editor Gaeanautes is of good faith and good will, but he/she does not advance basic arguments for the material of his/her post. He/she offers just second-hand pseudo-authority. One has in the past read much material of that kind, because there is lots of it. Wikipedia does not mistake quantity for quality. It is not the task of Wikipedia to examine, analyze, and report on unreliable and non-notable sources. Some judgment is needed to select reliable sources.
No amount that I write here will do the trick. A certain amount of real understanding is needed. Again, other commenters would help here.Chjoaygame (talk) 12:03, 28 April 2016 (UTC)

Indeed, user Chjoaygame — no amount you write here will do any trick. So, stop yelling 'Drivel!' at everybody and everything, and let other users have a say. Gooood. Thank you very much. Gaeanautes (talk) 12:42, 29 April 2016 (UTC)

At the time of writing, it seems 'other users' are not interested in joining the discussion. If nobody shows up, I'll remove the two {disputed content} templates from the article in a couple of weeks or so. Gaeanautes (talk) 14:57, 19 July 2016 (UTC)

The two {disputed content} templates have now been removed. If this removal is reverted, please state the reason for the reversal in a post below. Thank you. Gaeanautes (talk) 16:37, 15 August 2016 (UTC)

Hello fellow Wikipedians,

I have just modified 4 external links on Entropy. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ for additional information. I made the following changes:

When you have finished reviewing my changes, please set the checked parameter below to true or failed to let others know (documentation at {{Sourcecheck}}).


  • If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
  • If you found an error with any archives or the URLs themselves, you can fix them with this tool.

Cheers.—InternetArchiveBot (Report bug) 20:11, 11 September 2016 (UTC)

Put this article in layman's terms

This article is unnecessarily incomprehensible. The whole reason people look up things on wikipedia is because they don't have expertise on the subject and they want to have a general idea of what's going on with the thing they're looking up. Having an article written by experts is a good thing... unless they're completely incapable of communicating anything meaningful to the public. A physicist isn't going to look up entropy on this website. Your average Joe is. And when Joe comes to look up entropy, he's going to walk away from this article less informed than he came in, because he has no idea what's being said.

People come to wikipedia because they don't have a background in the topic; otherwise they'd be consulting academic papers directly. — Preceding unsigned comment added by CharlieBob (talkcontribs) 07:34, 18 November 2016 (UTC)

There is a very clear link at the very top of the article to introduction to entropy. Please review to see whether it meets your requirements for reading level. --Izno (talk) 13:59, 18 November 2016 (UTC)
I agree with this proposition by CharlieBob. I understand the article, but it is stupid to even have this introduction and article the way it is. This is an encyclopedia, not a forum for you to practice your poorly attained mental gymnastics in a show of arrogance by someone unqualified to speak on the topic. Start your own Experts-Wiki, so that you can go demonstrate your lack of understanding of the topic somewhere else. Delete this page and put it on a different forum.

Analogy with calculus

The slope of a straight line is both definable and explainable with nothing more than Book I of Euclid, c. 300 BC. As soon as you ask about the slope of the parabola x² you enter much deeper territory that took nearly two millennia to sort out. Yes, you can define it as 2x, but by itself that's just a formula, not an explanation.

The clearest elementary explanation I know is that 2a, the value of 2x at x = a, is the slope of the tangent to x² at x = a. Other explanations struggle with the meaning of dy/dx as what you get when you divide dy by dx. Do you talk about epsilon-delta, infinitesimals, or what?
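For concreteness (my own addition, not part of the original comment), the computation behind that tangent explanation can be done in one line, without any epsilon-delta machinery:

    \[
      \lim_{h \to 0} \frac{(a+h)^2 - a^2}{h}
      \;=\; \lim_{h \to 0} \frac{2ah + h^2}{h}
      \;=\; \lim_{h \to 0} (2a + h)
      \;=\; 2a ,
    \]

which is exactly the slope of the tangent to x² at x = a.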

Reading between the lines of this talk page, it seems to me that those who want to start the article by taking S = k ln(W) as the definition of entropy, in preference to dS/dQ = 1/T, are the counterparts of those who find dy/dx as the definition of slope too intimidating for the reader looking for a simple definition of the slope of a curve. Is the notion of W, the number of possible configurations of, say, a locomotive boiler, really all that much easier to grasp than the notion of dS/dQ? And when the notion of the boiler's absolute entropy entails reference to its ground state, there is something seriously wrong with the pedagogy of that explanation of entropy.

So why not take a leaf out of the book of those who define slope in terms of tangent?

The counterpart of a tangent at a point on a curve would involve the instant when a cooling body is at temperature T. How does this read?

When a body is exchanging heat with its environment at temperature T, the entropy it is gaining or losing is the fraction 1/T of the heat it is gaining or losing. It follows that when a cold body acquires heat from a hot one, more entropy is acquired than lost, whence there is a net gain in entropy. The SI unit of heat being the joule (J) and that of 1/T being K⁻¹, this makes the SI unit of entropy J·K⁻¹, or J/K.
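To make the sign of that net change concrete, here is a minimal numeric sketch (the temperatures and the amount of heat are illustrative values of my own, not taken from the comment above):

    # Heat Q flows from a hot body at T_hot to a cold body at T_cold.
    # Each body's entropy change is (heat gained) / (its temperature).
    Q = 100.0        # joules of heat transferred (illustrative)
    T_hot = 400.0    # kelvin
    T_cold = 300.0   # kelvin

    dS_hot = -Q / T_hot        # hot body loses entropy: -0.25 J/K
    dS_cold = Q / T_cold       # cold body gains entropy: +0.333... J/K
    dS_net = dS_hot + dS_cold  # net change: +0.0833... J/K, strictly positive
    print(dS_hot, dS_cold, dS_net)

Because T_cold < T_hot, the gain of 1/T_cold per joule always exceeds the loss of 1/T_hot per joule, so the net entropy change is positive for any Q > 0.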

For the benefit of those more advanced readers familiar with the concept dy/dx, one can follow up this slightly informal definition with something like the following. "This can be put more succinctly in the language of calculus as the infinitesimal ratio dS/dQ = 1/T, bearing in mind that losing heat typically entails losing temperature as well."

I'm not sure where best to fit it in here, but in trying to motivate the connection between entropy and information without a detour into the nature of W and ln(W), I like to divide top and bottom of dS/dQ by dt, giving (dS/dt)/(dQ/dt) = 1/T. Obviously dQ/dt is power, but what is dS/dt? If you take it to be data rate, then the formula is saying that for a given transmission power, data rate is inversely proportional to temperature. Conversely, if you write it as dQ/dt = T dS/dt, then it says that for any given data rate dS/dt, the hotter it gets the more power is needed to sustain that rate. (Which can lead to thermal runaway, to which throttling down the CPU is one viable response to overheating.) Then quantify the bit (or nat) by bringing in Boltzmann's constant, and perhaps give examples of how closely this upper limit on bit rate has been achieved in practice, e.g. in nerves. At that point the role played by W and ln(W) for nerves can be brought in. Vaughan Pratt (talk) 10:45, 13 May 2017 (UTC)
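Following the comment's own suggestion of quantifying the bit via Boltzmann's constant, here is a rough numeric sketch of the dQ/dt = T dS/dt relation; the assumption that each bit carries k_B ln 2 of entropy, and all numeric values, are mine:

    import math

    k_B = 1.380649e-23   # J/K, Boltzmann's constant (exact in SI since 2019)
    T = 300.0            # kelvin; roughly room temperature (illustrative)
    bit_rate = 1e9       # bits per second (illustrative)

    # dS/dt: entropy rate, assuming k_B * ln(2) per bit
    entropy_rate = k_B * math.log(2) * bit_rate   # J/(K*s)

    # dQ/dt = T * dS/dt: minimum power to sustain that entropy rate at T
    power = T * entropy_rate                      # about 2.9e-12 W
    print(power)

The tiny result (a few picowatts for a gigabit per second) illustrates why this thermodynamic floor sits far below the power real hardware actually dissipates.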

Switch to Probabilistic definition to start with

Hello, It's tough to start with the multiplicity definition of entropy, S = k_B ln W, instead of a calculus-based definition. Since the multiplicity definition is only valid for equiprobable states anyway, it might be a decent idea to move to the probabilistic definition, S = −N k_B Σ_i p_i ln p_i, where p_i is the probability of the ith state occurring in the system, and N is the number of identical elements (you could include a summation over N for more than one non-interacting element to make it more general). To my knowledge, this definition is valid for large numbers of elements (I'm not sure about small), and it is more general than the multiplicity definition since it takes into account systems whose states are not equiprobable. The probabilistic definition is also listed first as the definition of entropy (actually without the N out front) in another Wikipedia article on Shannon entropy, before it mentions the statistical/multiplicity definition. I would recommend following that model. See https://en.wikipedia.org/wiki/Entropy_(information_theory). Thanks, Evan N. PS I'm a fourth year physics Ph.D. grad student and this is a tough subject for anyone. — Preceding unsigned comment added by RevvinEvan (talkcontribs) 20:07, 20 May 2017 (UTC)
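A minimal sketch of how the probabilistic definition contains the multiplicity one as a special case (a single system, so the N prefactor above is omitted; the function name and example values are my own illustration):

    import math

    K_B = 1.380649e-23  # J/K, Boltzmann's constant

    def gibbs_entropy(probs, k=K_B):
        """S = -k * sum_i p_i ln p_i; terms with p_i == 0 contribute nothing."""
        return -k * sum(p * math.log(p) for p in probs if p > 0)

    # For W equiprobable microstates, p_i = 1/W and the sum collapses to k ln W:
    W = 4
    print(gibbs_entropy([1.0 / W] * W))  # equals K_B * ln(4)
    print(K_B * math.log(W))             # same number

    # A non-uniform distribution over the same four states has lower entropy:
    print(gibbs_entropy([0.7, 0.1, 0.1, 0.1]))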

Who is this page for?

As a non-scientist interested in science, I came here to get a general understanding of topics like entropy and the laws of thermodynamics. Instead I found physicists talking in jargon and incomprehensible equations shoved in my face almost immediately.

ENCYCLOPEDIAS are, by tradition, overviews of knowledge for non-specialists. Specialists wouldn't bother with encyclopedias, as their knowledge is already far beyond the "overview" stage. Telling me that there's a page on "introduction" to the topic doesn't help, because (1) I didn't notice such a link on the main page, only finding out about it here in the Talk section, which most people never visit; and (2) an encyclopedia is supposed to be an introduction, in general.

This page is pretty useless both for scientists (too general) and non-scientists (too incomprehensible). It's probably just a place for specialists to show off their knowledge to... nobody.

By the way, I'm a very experienced tech writer. I may not know much about physics, but I know a lot about communicating arcane knowledge to different levels of audience. This page... and the related pages on thermodynamics... are a complete fail. — Preceding unsigned comment added by 72.93.49.191 (talk) 22:02, 15 August 2017 (UTC)

Hello fellow Wikipedians,

I have just modified one external link on Entropy. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ for additional information. I made the following changes:

When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.


  • If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
  • If you found an error with any archives or the URLs themselves, you can fix them with this tool.

Cheers.—InternetArchiveBot (Report bug) 03:34, 12 January 2018 (UTC)

Confusing lede section

Maybe you science-y types can understand this, but a layman such as myself has absolutely no idea what you're talking about here. Truly baffling. The best way to explain a difficult concept is to start in terms that people know -- please use a simple metaphor or ideas to explain this difficult concept.--Tomwsulcer (talk) 12:20, 23 May 2017 (UTC)

True, entropy is a very abstract concept and understanding it relies on a prior understanding of some other concepts. For this reason our article on entropy is supported by another: Introduction to entropy. Remember Wikipedia is an encyclopaedia, not a text book. Dolphin (t) 23:01, 23 May 2017 (UTC)
I absolutely agree with Tomwsulcer. The section is nothing more than a bunch of physics undergrads using Wikipedia to 'intellectually masturbate' - that is, to stimulate oneself by spraying overly elaborate/technical knowledge on to others for no other reason than to gain gratification from one's displayed superior knowledge... Yes, Wikipedia is an encyclopaedia, so it should aim to easily impart knowledge to the 'everyman', as encyclopaedias are supposed to do, and not just to college physics students. As it stands, the opening section reads more like an advanced physics text book than a Wikipedia section - it tells the vast majority of readers absolutely nothing. Imagine if Wikipedia pages on Ancient Egypt were written in hieroglyphs... it would tell college students majoring in Ancient Egyptology heaps; everyone else with an interest, however, would come away learning f*** all. In such a case, it simply wouldn't be acceptable to be told - "Read our article: an introduction to hieroglyphics." M R G WIKI999 (talk) 12:24, 28 May 2017 (UTC)

Thanks to User:Waleswatcher for helping fix this. The body of the article is still too great in scope, but it's a lot better now, as the lede is so important for new readers on the topic. We should now focus on what else to cut in favour of a better article; for example, how deep do the Cosmology and Information Theory sections need to be if we have a disambiguation? Should there be an "Introduction to Entropy" article? Isn't that admitting failure? 82.2.57.14 (talk) 14:26, 27 April 2018 (UTC)

Thank you, and thanks for your own edits that inspired mine. Regarding the body, I haven't read it all carefully. Are there sections that look especially problematic to you? As for introduction to entropy, I think I agree with you on general grounds. Also, that article is not particularly good, and indeed if it's needed at all it makes more sense to me to have it as a section in the entropy article. What certainly does belong somewhere is a nice, intuitive example that helps people understand what entropy is, why it's so large numerically for macroscopic systems, and why it (almost) always increases. I could easily put something like that together if it isn't already done somewhere. Waleswatcher (talk) 15:46, 27 April 2018 (UTC)
If we submit a merge request with introduction to entropy and break up section 6, then I think that's the best start. Then this article is responsible for both audiences, so there's no cop out. 6.1 can join the definitions, and a lot of 6.2 onwards can be merged down into 7, because "Applications" and "Approaches to Understanding" mostly mean similar things as presented here. Do you have the authority to submit the merge? I think that should happen first. 82.2.57.14 (talk) 19:28, 27 April 2018 (UTC)
I don't have any "authority" (other than being a registered user) and I've never submitted a merge request - but anyway maybe we should see if other editors have comments on this idea first. I'll create a new section here and in that article with that suggestion and ask for comments. Waleswatcher (talk) 20:56, 27 April 2018 (UTC)

User:82.2.57.14 has the following suggestion: "If we submit a merge request with introduction to entropy and break up section 6, then I think that's the best start. Then this article is responsible for both audiences, so there's no cop out. 6.1 can join the definitions, and a lot of 6.2 onwards can be merged down into 7, because "Applications" and "Approaches to Understanding" mostly mean similar things as presented here." I think this is a good idea. Thoughts? Waleswatcher (talk) 20:58, 27 April 2018 (UTC)

I wish we had a better article for Introduction to entropy. Entropy is a complicated subject, and students and laymen are always looking for an interpretation and an intuition about it (comparable with relativity and quantum mechanics). In the best of worlds, we would TNT Intro to Entropy. Yet I won't oppose a merge if in that way we may boost the entropy article. --MaoGo (talk) 10:04, 28 April 2018 (UTC)
Oppose: I changed my mind due to the conversation below Talk:Entropy#Proposed merge with Entropy (energy dispersal). Introduction to entropy is fundamental to understand why we have so many PHYSICS articles under entropy (disambiguation). --MaoGo (talk) 08:54, 2 May 2018 (UTC)
Oppose: The proposal to eliminate the article Introduction to entropy, or merge with another article, has been seen at least twice before. I am opposed to it. Wikipedia has a number of Introduction articles supporting complex scientific concepts. These are valuable articles that serve a different purpose than providing a rigorous and comprehensive explanation. Dolphin (t) 21:53, 2 May 2018 (UTC)

Proposed merge with Entropy (energy dispersal)

Due to the factual-accuracy and contradiction dispute, merging with slight alteration might be a fix. Kirbanzo (talk) 04:00, 1 May 2018 (UTC)

Oppose: The Entropy (energy dispersal) article needs expert attention before any merge may be made. Also, as Entropy (disambiguation) shows, we are currently keeping every interpretation of entropy in a separated article. --MaoGo (talk) 08:51, 2 May 2018 (UTC)
Oppose. The article seems to be more about educational issues than entropy itself. That is, what is the right way to teach the subject? Perhaps it should be renamed to Teaching of entropy or something similar. SpinningSpark 09:10, 2 May 2018 (UTC)
Spinning: A good portion of Entropy (order and disorder), as it is currently written, is also about pedagogy. Maybe we could merge them both into that article. TStein (talk) 21:47, 4 May 2018 (UTC)
Good idea, but I'm not the one to do it. Education is not my thing. SpinningSpark 21:56, 4 May 2018 (UTC)

Law states vs. law dictates

I undid this edit by user GeorgeEatonIII (talk · contribs), as the literature seems to disagree. For instance:

  Phrase                                  Google Scholar   Books
  "newton's second law states that"       324              4040
  "newton's second law dictates that"     10               11

Leaving out the word "that" gives similar results. - DVdm (talk) 18:03, 18 June 2018 (UTC)

"caloric" (what is now known as heat)

whenever "caloric" (what is now known as heat) falls through a temperature difference . .

I would have thought a statement of this kind would need a recent source.--Damorbel (talk) 11:35, 28 May 2019 (UTC)

Extensive / Intensive confusion

Okay, so the article starts out by saying "In statistical mechanics, entropy is an extensive property of a thermodynamic system," but then five short paragraphs later it says "The entropy of a substance is usually given as an intensive property—". I'm not a scientist, so this could be my problem, but this stands out and is confusing. It seems like there is a mismatch, or something isn't being explicitly differentiated. — Preceding unsigned comment added by Anthonybakermpls (talkcontribs) 19:39, 19 April 2019 (UTC)

@Anthonybakermpls: I agree that, at first glance, it appears confusing. Properties of thermodynamic systems, such as entropy, internal energy, and enthalpy, are always extensive properties because they depend on the mass of the system, and that could be 1 kg, 100 kg, 10,000 kg or more. Five paragraphs later is mention of the entropy of a substance being an intensive property – measured per unit of mass. For example, if the substance is saturated steam at a given pressure the entropy can be specified in units of joules per kelvin per kilogram. (The same applies to internal energy and enthalpy of saturated steam at the given pressure.) So what is the extensive entropy of saturated steam at the given pressure? That depends on the mass of steam we care to nominate. Dolphin (t) 10:13, 9 August 2019 (UTC)
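A minimal numeric sketch of the distinction Dolphin describes (the specific-entropy figure is an illustrative round number for saturated steam near atmospheric pressure, not a steam-table citation):

    # Intensive: specific entropy of saturated steam, per kilogram
    s_specific = 7.35e3  # J/(K*kg), illustrative round figure

    # Extensive: total entropy scales with whatever mass we nominate
    for mass_kg in (1.0, 100.0, 10_000.0):
        S_total = s_specific * mass_kg  # J/K
        print(mass_kg, "kg ->", S_total, "J/K")

Doubling the mass doubles S_total but leaves s_specific unchanged, which is exactly the extensive/intensive split at issue.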

Issue: Definitions and descriptions > Statistical Mechanics

In the discussion of the definition of entropy from a statistical mechanics point of view, the definition of entropy as "the expected value of the logarithm of the probability that a microstate will be occupied" is given. The equation that is given is incomplete, as the right hand side of the equation is indexed while the left hand side is not. I am not aware of the correct form of the equation. This message is simply to bring attention to this issue.

Jcr137 (talk) 19:42, 19 August 2019 (UTC)
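For reference, and subject to checking against the article text in question, the complete indexed form usually intended by that description is (standard notation, my rendering):

    \[
      S \;=\; -k_{\mathrm{B}} \sum_i p_i \ln p_i
        \;=\; k_{\mathrm{B}} \left\langle -\ln p_i \right\rangle ,
    \]

where p_i is the probability that microstate i is occupied. The sum (the expectation) runs over the index i, so no free index remains on the right-hand side and both sides are index-free.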

Explain the lede

Not planning to review the article myself, but it would be good if the lede were written so that it's accessible to a broad public (criterion 1a, and specifically WP:EXPLAINLEAD). A lot of people who have never really opened a physics textbook have heard of this term and might come to Wikipedia to learn more. As far as possible, the first paragraph should not contain ANY words that are only known to physics undergraduates. Femke Nijsse (talk) 08:29, 12 September 2019 (UTC)

GA Review

This review is transcluded from Talk:Entropy/GA1. The edit link for this section can be used to add comments to the review.

Reviewer: David Eppstein (talk · contribs) 08:27, 29 November 2019 (UTC)

I don't think this is ready for GA nomination. Much of the lead is not a summary of later material (WP:GACR 1). This material is not sourced, almost all of the history section is unsourced, the starting subsection of the definitions section is unsourced, most of the Carnot cycle subsection is unsourced, much of the classical thermodynamics subsection is unsourced, several whole paragraphs of the statistical mechanics subsection are unsourced, and several paragraphs of the entropy of a system subsection are unsourced. The first two paragraphs of the second law of thermodynamics section are unsourced, etc., etc. (GACR 2). The article is long, and although it does use summary style it may possibly be overdetailed in parts; the section on hermeneutics seems completely off-topic (GACR 3). Given all this, I think this is sufficiently far from meeting the good article criteria to be a quick fail (WP:GAFAIL #2). Additionally, I don't think the Shannon-von Neumann quote meets the standards of clearly stating the attribution of the quote (was it Shannon or von Neumann?) or for formatting quotes, and the equation formatting in the "Interdisciplinary applications of entropy" section is just gross. No prejudice against re-nomination, but only after a serious effort to make sure that everything is at an appropriate level of detail and everything is sourced. —David Eppstein (talk) 08:27, 29 November 2019 (UTC)