Talk:Entropy/Archive 12


Reversible vs. irreversible

The article so far fails to distinguish clearly between entropy changes due to a reversible process and an irreversible process. The two are conceptually distinct and the article should strive to provide an example of each so that the distinction is clear to the reader.--212.3.132.195 (talk) 13:35, 27 January 2013 (UTC)

I agree. There seem to be some common misconceptions about what a reversible process is, and why most processes are irreversible. There is nothing in the 2nd law that says a closed system cannot spontaneously run in reverse. All that is needed is to add all of the work to the waste. However, the universe is supposed to be the only closed system. Therefore, it is entirely possible that the universe will, one day, stop expanding, collapse in on itself and --Bang-- form the universe all over again. However, the universe is simply not running in that direction right now. Within the universe, there are only open systems. For example, I can gather a certain amount of air and compress it. This will concentrate the heat energy, adiabatically raising its temperature. This is a reversal of entropy. However, it is only a small reversal in a very large river. In actuality, energy had to be added to the compressor through the motor, and the motor's electrical energy (no matter how it was made) came from the sun on its way to the universe. The compressor is simply a small reversal along its way. Some of the energy from the compressor cannot wait, and will disperse into the universe. Therefore, if you try to power the compressor by releasing the air into an air motor, you will never be able to pump the compressor up. Ultimately, the energy is on a one-way trip, and any reversal is simply like an eddy current in a river, being small and having little effect on the overall flow. Zaereth (talk) 21:47, 29 January 2013 (UTC)

Restructure

There is a serious structural problem with this article. Most of the material in this article should be moved to Entropy (classical thermodynamics) and this article should then be redirected to Entropy (disambiguation). That is the only way to make plain the other subject areas of entropy from information theory, quantum mechanics, etc. No wonder this article has been so confused and has failed to find focus.--190.204.70.243 (talk) 07:18, 9 January 2013 (UTC)

I second that motion. —siNkarma86 (talk) 18:01, 9 January 2013 (UTC)
For an article such as this, I would rather see it become a parent article rather than a simple DAB page. DABs are great for completely different subjects that simply share the same name, like Mercury (planet) and mercury (element). Here we have different aspects of the same subject. Personally, I think it's better to briefly summarize them in the parent article, and provide "main article links" to the subordinate articles. A similar example is the potential energy article, which is the parent article of gravitational potential energy, nuclear potential energy, and so on. I think this subject can benefit from having a similar parent article, provided we can make it much more readable. (Actually, I think this article could also use an introduction section, briefly summarizing the introduction to entropy article, as well.) Zaereth (talk) 18:29, 9 January 2013 (UTC)
A lot of other languages point to this page. Redirecting it might be disruptive. Almost all of the classical thermodynamics should be moved to Entropy (classical thermodynamics).--61.141.152.67 (talk) 05:03, 11 January 2013 (UTC)
I agree with Zaereth and above. Have this as an introductory article for entropy in its many forms, links to specific subjects. Then go back and fix the links to this page if appropriate. PAR (talk) 07:55, 11 January 2013 (UTC)
Scholarpedia's article on Entropy is a good example of what an overall article introducing the many meanings of entropy could look like. Note that SP is licensed under a strict non-commercial license though, which is not compatible with WP, so we can't just copy the SP article over here either in whole or part.
Note that when this has been raised in the past, there has been a vocal lobby protesting that the primary meaning of entropy is the thermodynamic one, as e.g. often first encountered in high-school chemistry, or through discussions of the Second Law in mass-audience physics material. That in the past is what has led to this article having the scope it has (i.e. entropy in thermodynamics, with non-thermodynamic understandings of the term entropy handed off to other articles).
I'm not against re-scoping this article, but we should perhaps raise this to a formal RfC, give links to past discussions, and advertise it widely, including especially at WT:PHYSICS and WT:CHEMISTRY. Jheald (talk) 14:06, 11 January 2013 (UTC)
I've been giving this a lot of thought over the last few months. Perhaps a top-down approach is not the best way to tackle this. Maybe it would be better to try reverse-engineering, from the bottom up. The more I think about it, the more it seems like we need a fresh place to start, rather than trying to reword or clarify what is already there. I don't have much spare-time to actually sit down and work it all out at once. However, I've been thinking that, within the coming weeks, I would begin an introduction section in my sandbox. (Anyone who has seen my work knows I'm fond of intros.) Intros tend to give a little more latitude, rather than trying to cram it all in the lede, giving some room in which to tie it all together.
I like a good challenge, and this should surely present one. I am also excited because I see great potential for some serious collaboration here, and am happy that we have people representing all of the various aspects of entropy, such as Jheald, PAR, Count Iblis, Dolphin51... (Forgive me if I missed anyone.) I can't promise quick turn-around but, once I get something written, I hope I can get some input from everyone. Thanks. Zaereth (talk) 22:00, 11 January 2013 (UTC)
I've started working on an intro-section, at User:Zaereth/sandbox. If anyone is interested in leaving some feedback, or helping to correct any mistakes, it would be appreciated. Zaereth (talk) 23:54, 18 January 2013 (UTC)
The initial paragraphs of your introduction confuse entropy with heat capacity. Entropy is not a rate: it is an integral. Heat capacity = dQ/dT but entropy = integral dQ/T . I have removed your first two paragraphs, which introduce the confusion.--212.3.132.195 (talk) 13:31, 27 January 2013 (UTC)
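In symbols (a sketch of the standard definitions, with δQ an increment of heat and the "rev" subscript marking the reversible path of the Clausius definition):
 <math>C = \frac{\delta Q}{dT}, \qquad \Delta S = \int \frac{\delta Q_{\mathrm{rev}}}{T}</math>
so heat capacity divides heat by the resulting temperature change, while entropy accumulates heat weighted by the absolute temperature at which it is transferred.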

(Undent)Well, I was trying to avoid that confusion. I think you're right, though, that it did introduce some confusion. Perhaps describing it as an interval variable rather than a ratio variable is better. (I was thinking that, because both joules and kelvin are ratio variables, entropy must be a ratio as well.) However, as the new addition is written, it doesn't make much sense, especially to a newcomer. As PAR mentioned above, saying that entropy is an "abstract function of state" doesn't really say anything, and I believe this only adds to the confusion. The first sentence there should concisely state exactly what the math says entropy is. The difficulty lies in making the correct translation.

Entropy is definitely not an abstract thing, but a measurable property of heat, which I was trying to define from the macroscopic, thermodynamic standpoint first, before getting into other forms. To see this, perhaps it would be helpful to point out the difference between entropy and heat capacity. Heat capacity is the amount of energy that needs to be added to a certain amount of something to change its entire temperature a single degree. For instance, it takes a certain amount of energy to raise the temperature of a gallon of water a single degree.

On the other hand, entropy is the amount of energy that must be added to something to change its temperature at the point of energy transfer only. Entropy does not deal with the heat capacity of the entire substance, but only with the energy needed to change (or "maintain" perhaps would be a better word) the temperature at the boundary where energy is being transferred.

In other words, as energy is added to the gallon of water, the temperature of the boundary does not change instantly. If it did, the energy and temperature would be equal, and the entropy would be nothing. Instead, if adding 1000 joules only increases the boundary temperature to 800 kelvin, then logic dictates that some of that energy is being used for something else. By dividing 1000 by 800, we get 1.25. If providing 800 degrees at the boundary is 100% of the energy needed to perform work (in this example, performing work is simply heating the entire gallon one degree), then you will actually need to add 125% of the needed amount. The rest of that energy will not be used for work (temperature change), and will only be released as waste once the gallon of water cools. I think the main thing to understand is that entropy is not just something that occurs in machinery, but it occurs anytime heat transfers. Zaereth (talk) 01:01, 29 January 2013 (UTC)

"entropy is the amount of energy that must be added to something to change its temperature at the point of energy transfer only."
What you wrote makes no sense. The correct relation is that temperature is (proportional to) the amount of energy that must be added to change the entropy by a given amount.
"If it did, the energy and temperature would be equal, and the entropy would be nothing."
????? Entropy is not a form of energy. Nor is temperature.
You seem quite confused. Jheald (talk) 22:17, 29 January 2013 (UTC)
Ok, so the definition you're giving me for entropy is T = dQ/dS. I don't doubt that your math skills are better than mine, but what confuses me is how a definition for temperature can be a definition for entropy. Zaereth (talk) 22:36, 29 January 2013 (UTC)
I've been trying to think about a good analogy when talking about things like temperature, entropy, or energy on the macroscopic scale. One that comes to mind is electrical energy (watt-seconds). Electrical energy is defined by the parameters "power (W)," "amps (I)," and "volts (E)." The definitions of each of these parameters are: Power = W=IE, Amps = I=W/E, and Volts = E=W/I. None of these parameters are the same things, but they are all necessary dimensions of something called watt-seconds or "electrical energy." Similarly, the relationship between entropy, temperature, and thermal energy are all necessary parameters of something we, as a matter of convenience, call "heat." A quick look at a TS diagram can easily show the relationship. Zaereth (talk) 19:19, 30 January 2013 (UTC)
I'm not sure that's helpful (as well as being WP:OR). You can have entropy (and energy and temperature, for that matter) without having heat.
In response to your earlier comment, it's a change people come to as their understanding of the subject develops. So when people get their first thorough introduction to entropy -- e.g. perhaps in the context of the thermodynamics lectures of a first year undergraduate chemistry course -- then what seem natural are energy, and temperature that one can measure on a thermometer. These are the familiar safe everyday touchstones that one works out from; whereas entropy can seem nebulous, abstract, artificial -- a state function (as the syllabus may require one to prove), but one for which the meaning seems rather remote.
On the other hand, by the time one has moved on to statistical mechanics, entropy is something one can calculate, something that comes to seem completely instinctive (and also very fundamental); and then it is temperature which starts to seem the derived quantity, and in extreme cases not so easy to measure, unless one comes to define it as 1/(dS/dQ). Jheald (talk) 19:41, 30 January 2013 (UTC)
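In symbols, the definition alluded to here (a standard statistical-mechanics form, with U the internal energy at fixed volume V and particle number N):
 <math>\frac{1}{T} = \left(\frac{\partial S}{\partial U}\right)_{V,N}</math>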
It seems like some of the later paragraphs of the introduction are not specifically about entropy. They mention entropy and talk about heat flow, but they do not prepare the reader to, for example, actually use or even become familiar with a TS diagram. Explaining entropy should go beyond just reiterating the second law of thermodynamics. It should familiarize the reader with what it is like to solve problems with real entropy data. Even a high school student can perform a calculation with a TS diagram if properly instructed.--86.96.65.146 (talk) 20:37, 30 January 2013 (UTC)
Yes, and you can have volts and amps without having power. Forgive me for the OR, but I wasn't aware that I needed to cite talk page discussions. For a reference, the book Kinetic Theory and Thermodynamics says, "The quantity Q/T is a definite thermal property of the working substance and is called Change in entropy.... The entropy of a substance is a real physical quantity like energy, pressure, temperature that can be measured in a laboratory.... From the above relation, we can say that the dimensions of heat energy are the same as that of the product of entropy and absolute temperature." For a ref about electrical energy, see the book Basic Electrical And Electronics Engineering. I agree with you on the statistical side of things, but I fear beginning with that approach will lead to a "cannot see the forest for the trees" syndrome. Zaereth (talk) 20:54, 30 January 2013 (UTC)
It is a real physical quantity, but when it is "measured in a laboratory", what happens is that heat and temperature are more directly observed and then entropy change is computed.--86.97.247.44 (talk) 00:36, 5 February 2013 (UTC)
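As a concrete illustration of that computation, a minimal Python sketch (the mass, temperatures, and constant specific heat are assumed values for illustration, not taken from any source above):

    import math

    # Entropy change of water heated from T1 to T2, computed from the more
    # directly observed quantities: heat added (dQ = m*c*dT) and temperature.
    m = 1.0                # mass in kg (assumed)
    c = 4184.0             # specific heat of water in J/(kg*K)
    T1, T2 = 290.0, 300.0  # initial and final temperatures in K (assumed)

    # Closed form of the integral of dQ/T with constant c:
    dS_exact = m * c * math.log(T2 / T1)

    # Numerical version, summing small heat increments over temperature:
    steps = 10000
    dT = (T2 - T1) / steps
    dS_numeric = sum(m * c * dT / (T1 + (i + 0.5) * dT) for i in range(steps))

    print(dS_exact, dS_numeric)  # both about 141.8 J/K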
I understand that. Those words are from R. K. Agrawal, the author of the book. Personally, however, I'm beginning to agree with Hans Fuchs (The Dynamics of Heat: A Unified Approach to Thermodynamics and Heat Transfer): The problem appears to be one of cognitive linguistics. On the one hand, there is the language of math, and I think we all have an intuitive understanding of that language, (whereas others may speak it quite fluently). On the other hand, there is English which is far more complex, containing a much broader and subtler variety of functions and relationships. The disconnect between math (thermodynamics in particular) and other forms of language seems to be due to the inability to translate an otherwise abstract math formula, and then project that translation metaphorically in standard cognitive imagery and standard cognitive dimensions. I understood this even before I read Fuchs' book, which is why I gave the above example about electricity (eg: amps and volts are easy to measure, but watts must be calculated). These same types of cognitive structures, expressions, and equations are found in all languages, and are almost always based on the relationship between at least three characteristics. (ie: This and that are those. These are those but not that, etc...) Personally, however, I've decided to just leave well enough alone until the linguistics is sorted out. My only suggestion is to include some real-world examples, which can be found in abundance, from cosmology to pattern recognition, if one really cares to look. Zaereth (talk) 04:10, 5 February 2013 (UTC)

(undent) Getting back to the topic of restructuring. It might be helpful to see what some of the other languages do on this topic.

But English currently has:

Perhaps we should restructure along the lines of the French and German.--190.73.248.92 (talk) 14:15, 2 February 2013 (UTC)

Most of the various other uses of "entropy" have been moved out of this article. The article is now focused better on just thermodynamic entropy. We may have to settle for that for now.--76.220.18.223 (talk) 22:54, 6 February 2013 (UTC)
Mostly very sound, I think.
But in my view the article should at least touch on the view that the entropy state function is simply the amount of information (in the Shannon sense) that would be needed to specify the full microstate of the system, that is left unspecified by the macroscopic description.
Many people find that this is the way they like to think about what entropy fundamentally is (although, equally, there are others that utterly dislike it). Jheald (talk) 23:10, 6 February 2013 (UTC)
On further inspection, that is still stated pretty much word-for-word in the lead; but perhaps there does need to be some more in the article to contextualise it. Jheald (talk) 23:13, 6 February 2013 (UTC)
Mostly restored the "information theory" section. Added your description as an introductory paragraph. It still seems a bit long.--200.246.13.126 (talk) 16:48, 7 February 2013 (UTC)

Specific entropy

I have removed the anchor for "specific entropy" and put a sentence defining it in the lead. I have also updated the redirect specific entropy to just point to this article with no anchor. The anchor I removed did not make sense and seems to have been added by someone just searching for the first occurrence of the phrase, which turned out to be inappropriate.--178.210.45.18 (talk) 18:11, 6 February 2013 (UTC)

Peer review?

So what else does this article need before it is ready for peer review? One thing that would be nice is if we could have just one section that deals with the classical and statistical descriptions.--200.165.161.254 (talk) 00:01, 8 February 2013 (UTC)

Merged the two sections after the "History" section. I also added a "Function of state" section under "definitions". Much better outline: now, each element of the material has a natural home. It is really how these concepts come together in one obscure function that makes entropy difficult to grasp.--189.51.14.154 (talk) 14:17, 8 February 2013 (UTC)
Another thing about the style of the article: it really should be about the math, i.e. geared for science and engineering students at university. That is the adult version of the article. Generalizations about entropy almost all belong rather in the section (or article) about the second law. So...what else does this article need before it might be considered ready for peer review?--77.122.164.30 (talk) 19:28, 11 February 2013 (UTC)
Um. Are you all the same person, who keeps changing IP (which seems to shift from continent to continent)? Or are you three different people? It would be a bit helpful if you could get a user-id (or user-ids plural, if you're more than one of you), so the rest of us can know which of the edits were all made by the same guiding mind...
That said, I think the idea of peer review is a really good one, because no question this should be one of the most important articles under WP:PHYSICS, and in my view it simply hasn't been up to scratch.
I have to confess I haven't read through your edits in detail. But one key question that anyone taking on to reshape this article needs to come to a view about is what relationship it is going to have to all our other articles on entropy -- in particular to Entropy (classical thermodynamics), which I think is in reasonably good shape, in terms of being relatively accessible and not over-complicating things too early, but also covering a lot of good material. So wherever you take this article, it needs to have a distinctive role, that is different from any of the other articles we have on entropy. In the past, I think, it has been to outline the idea of entropy in classical thermodynamics, and also the idea of entropy in statistical thermodynamics, and to discuss how the two are compatible, while leaving quite a lot of each subject to be addressed in more depth by the more specific articles. So this article becomes a coat-stand, if you like, to hang various more detailed, more specific articles off. I think that is how it has been in the past. But it might be objected that such a structure may not be perfect, because too many people may only get to this article, and not read things that would actually be valuable to them, because they never get to the sub-articles that would discuss them. But I think the real question is first to get clear to yourself what you think the scope of this article should ideally contain.
Assuming you get past that stage, something that almost all our Physics articles are very bad at compared to the rest of Wikipedia is referencing. (Perhaps because the ethos in Physics is so much to be able to work things through for yourself from first principles, if you really understand them.) For contrast, compare one of our few entries in Physics that is rated as having Featured Article status, namely Equipartition of energy, and see just how dense the referencing is. If you really want to take this article a step further, one of the things that could help would be to take some of the favourite classic textbooks on thermodynamics -- eg Reif, Kittel etc -- and add the relevant page references to each key idea or each section in this article. That also has the advantage of helping you to really scrutinise this article, and make sure that everything we say really does solidly reflect something presented in what we would consider an authoritative source. Working through the books may also have the flipside advantage of drawing your attention to perhaps anything we really ought to be discussing or pointing out in our text but maybe are not.
As for peer review, when you feel you've got the article into a state that you feel happy with, you could post at WT:PHYSICS or WT:CHEMISTRY or perhaps WT:AST asking for an informal internal review. Don't probably expect too much from this -- while there are some very high calibre people out there (particularly in the Maths project), most of us are just run of the mill editors, and it may be a long time since we studied thermo at Uni, or alternatively typical editors may be students who are still studying their way through their first degree -- either way typical editors probably don't have the grasp of detail that a really good peer review needs. But they can be very useful in telling you how your article reads to somebody not particularly initiated -- does it communicate the big ideas well, so they stand out? Does it flow? Does it avoid getting slowed down in unnecessary technicalities, that could be left until the reader has a better grasp of the overall shape of the topic? That's the sort of quality of input maybe to expect from such an informal internal review.
There is also WP's official assessment process -- how does the article compare with the criteria to be rated B, or A, or WP:GA, or WP:FA? From what I've seen this also may tend to be judged by generalists, who may be very hot on whether a particular sentence has a supporting reference link, or on whether the references are typeset just so, but who may be very weak on the actual science, and simply not qualified to spot whether the article may have glaring inaccuracies, glaring gaps, or material that simply isn't the best way to explain concepts.
So the scary one is asking for external peer review. This is probably best done when you have got very straight what intended scope you want the article to cover; got some informal internal reviewers to tell you whether they think it flows and develops reasonably, and when you've done quite a lot of work referencing it, and making sure it reasonably tracks the principal most authoritative sources. At that stage, it becomes a question of who you know (or who anybody at WT:PHYSICS knows) who actually teaches this stuff at a university, who could be persuaded to dig into it and give it a proper critique. It's a high bar to aim for, but if you pass that, you may be well on your way to getting it acclaimed as a featured article, and seeing your article as the big draw in the hot spot of the front page for a day. Jheald (talk) 22:46, 11 February 2013 (UTC)
If it could just get to be an A-class article or maybe a Good article, that would be nice. If it were in a state where a university student looking for answers such as: "What is it? How was it discovered?" would feel that they got those answers, at least, that would be nice. The main changes made recently were to remove any suggestion that this is some sort of disambiguation page. This article is probably best named entropy (thermodynamics) since that is how the categories (and the French and German articles) are structured. Note that the first line of the article has been "This article is about entropy in thermodynamics." The fate of the other two articles ( entropy (classical thermodynamics) and entropy (statistical thermodynamics) ) seems unclear since there is considerable overlap with this one. Look at the new section named "Entropy of a system" with material that is peculiar to neither definition. The choices seem to be: 1. Just give up and do nothing (except maybe just disclaim that the other two are specialized articles with a lot of overlap with this one.). 2. Merge the other two articles back into this one 3. Merge most of this article back into the other two. Any other viable options?--189.51.14.154 (talk) 23:29, 11 February 2013 (UTC)

(undent) The person who created the two sub-articles is this person:

The eoht.info web site (of which he is the owner and primary editor) is full of original research about trying to apply the math of thermodynamics to the human relationships between men and women and making babies, etc. There was technically nothing wrong with the sub-articles when he created them in 2006, but the whole idea of having both sub-articles was, I think, not well thought-out. I think that entropy (classical thermodynamics) should be merged back into this article because otherwise I do not think that we can provide any satisfactory explanation to the reader about what distinction we are trying to make in having both articles around.--200.109.197.133 (talk) 06:25, 12 February 2013 (UTC)

Merger proposal

The Entropy (classical thermodynamics) page is completely encompassed by this article. There is no sensible explanation about why we should have both pages.--200.109.197.133 (talk) 07:30, 12 February 2013 (UTC)

Proposal noted at WT:PHYSICS Jheald (talk) 12:53, 12 February 2013 (UTC)

*Support The reason seems apt. There may be other sections that can be factored out if the article gets too large.--Probeb217 (talk) 04:49, 13 February 2013 (UTC)

Vote struck out. Elockid (Talk) 04:58, 13 February 2013 (UTC)
  • Oppose. I've deliberately tried to give time to let other people come in here, but I'm disappointed that they haven't, because this proposal deserves a fair hearing and fair consideration. It's not clear that we do have the scopes of the various articles on entropy correctly set up at the moment -- for example, the proposal above that the Entropy article, as the main entry point for people typing the word into the search box, ought to be something more like the Scholarpedia article [1], presenting an overview of all notions of entropy, essentially replacing our current Entropy (disambiguation) article, rather than being focussed specifically on thermodynamic entropy.
I am not sure about that proposal, because in the past people have very strongly made the argument that it is thermodynamic entropy that people are most likely to be looking for, either from general interest about the nature of the universe, or because they're specifically meeting the concept in Physics, Chemistry or Engineering classes. As a result the argument is made that the WP:PRIMARYTOPIC for entropy should specifically be thermodynamic entropy. But that may or may not be right.
However, I am also uncomfortable about the current proposal. Because I think there is still a role for what was the original intention of this article, namely to be an entry-point for all ways of thinking about the entropy of physics, chemistry and thermodynamics -- so introducing the entropy of classical thermodynamics certainly, but also introducing statistical mechanical entropy, and the von Neumann entropy of quantum statistical mechanics, and showing how they (may) all marry up consistently with each other; as well as introducing how people think physical entropy should be talked about qualitatively -- particularly whether or not it's useful to talk about the log of the number of microstates as being predominantly determined by the extent of energy "dispersal", or the pros and cons of more traditional language involving "order" and "disorder".
I think that's a useful scope for the article to have, to give people as much of a handle as possible on the question "So, what is entropy?"; though it's not an easy scope to do well. It's a rather different scope to trying to write a fully detailed article just about entropy in classical thermodynamics, which is what Entropy (classical thermodynamics) sets out to do. In particular, I think what the merge proposer writes, that the scope of that page should be "completely encompassed by this article", is probably not true. For the scope above to be possible, I think this article has to be written WP:SUMMARY-style, knowing that not all the detail can be presented here, that this article itself can only present a "taster" of a full proper treatment of each domain of thinking and calculating about entropy. The approach also seems to me very much in line with WP:TECHNICAL -- to try to give people as much as you can that is accessible as an overview of the whole subject first (the skins of an onion approach), rather than expecting to march the reader in order through the full detail of each domain one after the next.
So that's what I think had been the rationale behind the article as it was (though I think it's proved a difficult brief to deliver). I still think it's a rationale that makes some sense. But if others want to argue why a different model for the article would be better, I'm not going to stand in anybody's way. (And sorry if I've been WP:TLDR for everybody). Jheald (talk) 11:46, 13 February 2013 (UTC)

Also oppose. The thermodynamic entropy article is, within its limitations, quite coherent, careful, and clear. The general entropy article has some nice attempts to include a more modern, general definition. However, the organization is a mess and various idiosyncratic and sloppy ideas dangle off various parts of it. Today I removed one section that was entirely wrong. Until this article is in better shape, I think it would be a shame to mess with the well-constructed thermodynamic article.Mbweissman (talk) 03:27, 8 March 2013 (UTC)

Delete first paragraph?

It seems to me the article might be better off without the first paragraph, which reads:

"Entropy is a measure of the number of specific ways in which a system may be arranged, often taken to be a measure of disorder. The entropy of an isolated system never decreases, because isolated systems spontaneously evolve towards thermodynamic equilibrium, which is the state of maximum entropy."

The article concerns thermodynamic, not specifically statistical mechanical, entropy, but the first sentence is more applicable to the statistical mechanical interpretation of thermodynamic entropy than to the thermodynamic concept itself, which is worth understanding independent of the statistical mechanical accounts that may be given of it. It is also a specific, Boltzmannian attempt to give a statistical mechanical interpretation to entropy, and may be at odds with more Gibbsian versions of entropy, so again, it is probably best not to lead with. The validity of the second sentence is highly dependent on your definition of entropy, and again, it is probably best not to lead with it, but to discuss it later in the article. It is a reasonable point of view that isolated systems do not evolve toward thermodynamic equilibrium (and many attempts to prove that this is always so have failed), but rather that thermodynamic equilibrium tends to be reached through interaction with an environment.

If there's no strong objection within the next couple of weeks, I may give this a try, also checking to make sure that the points made in these sentences are addressed, with more nuance, later in the article.

The next paragraph introduces a thermodynamic definition of entropy, which seems a better starting point.

MorphismOfDoom (talk) 12:30, 5 June 2013 (UTC)

P vs. p: Power, pressure, probability

Today Koen Van de moortel changed one P to a p with the edit summary "P=Power, p=pressure". So I skimmed the whole article with the thought of making the notation uniform and found that

  1. power actually does not occur in this article
  2. pressure is sometimes P and sometimes p; and also
  3. probability is sometimes P and sometimes p.

So to be consistent we can choose Option 1 P = probability, p = pressure everywhere, or Option 2 P = pressure, p = probability everywhere. Opinions? Dirac66 (talk) 19:00, 21 May 2013 (UTC)

Yes, P for pressure and p for probability (Option 2) seems best to me but either seems reasonable. Go for it! MorphismOfDoom (talk) 12:32, 5 June 2013 (UTC)

I have now counted the uses of each symbol at the moment in this article. Pressure is P 9 times and p 4 times, and probability (by coincidence) is p 9 times and P 4 times. So option 2 recommended by MorphismOfDoom corresponds to what previous editors decided most often and I will revise the article to conform to that. As for other articles, their usage also seems to vary between P and p in an unsystematic way. Dirac66 (talk) 02:23, 10 June 2013 (UTC)
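For anyone repeating this kind of audit, a minimal Python sketch of counting stand-alone symbols with regular expressions (the sample text is invented for illustration):

    import re

    # Hypothetical stand-in for the article's wikitext.
    text = "At pressure P and volume V, the probability p of a microstate ..."

    # Word-boundary matches only, so words such as "pressure" are not counted.
    count_P = len(re.findall(r"\bP\b", text))
    count_p = len(re.findall(r"\bp\b", text))
    print(count_P, count_p)  # prints: 1 1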
OK, now P is pressure and p is probability. Except that the section Entropy and other forms of energy beyond work contains two equations where P is pressure and p is ... momentum. So we have another reason to use P for pressure, to avoid confusion with momentum within the same equation. Dirac66 (talk) 02:45, 10 June 2013 (UTC)
Many classic texts use W for probability; this goes back to Boltzmann himself, since it stands for "Wahrscheinlichkeit". — Preceding unsigned comment added by 98.109.238.95 (talk) 22:08, 28 June 2013 (UTC)

biassed advocacy of "energy dispersal" point of view

The entire section on energy dispersal should be deleted. All but one of the sources it references are inappropriate for a wikipedia article since they are not authoritative or representative of a widely accepted approach. A retired professor's personal website should not be relied on. One stray article he wrote for an education journal, even though a respectable peer-reviewed publication, also is insignificant. An unsupported, unsubstantiated and, I suspect from perusing google books, false statement about shifting trends in Chemistry textbooks is just too remote from the purposes of this article. A direct quote from Atkins might be useful. This section is way too big to be in proportion to the importance of this hobby-horse point of view.98.109.238.95 (talk) 22:05, 28 June 2013 (UTC)

Keep that section. Even though it's a minority viewpoint, it should be covered, on the grounds that in Wikipedia, minority viewpoints get some coverage. See Neutral point of view for information on such policies.--Solomonfromfinland (talk) 00:24, 11 September 2013 (UTC)

Mystified

Hi,

I am mystified by this formulation (which appears to be often used in most unpredictable ways): "It is often said that entropy is an expression of the disorder, or randomness of a system, or of our lack of information about it." Are not order and disorder purely psychological (not physical) phenomena? - 92.100.165.149 (talk) 16:48, 13 December 2013 (UTC)

The word disorder has many meanings as listed on the disambiguation (dab) page Disorder. Entropy is related to the meanings discussed in the articles on Randomness and Order and disorder (physics), not to the various disorders in psychology and medicine. Dirac66 (talk) 03:27, 18 December 2013 (UTC)

edit about heat, work, and transfer of energy with matter

The edit summary, and references given, state the well established reasons for the edit that said "In the model of this present account, as shown in the diagram, it is important that the work and heat transfers are by paths physically distinct from the paths of entry and exit of matter from the system.<Born, M. (1949). Natural Philosophy of Cause and Chance, Oxford University Press, London, pp. 44, 146–147.><Haase, R. (1971). Survey of Fundamental Laws, chapter 1 of Thermodynamics, pages 1–97 of volume 1, ed. W. Jost, of Physical Chemistry. An Advanced Treatise, ed. H. Eyring, D. Henderson, W. Jost, Academic Press, New York, p. 35.>"Chjoaygame (talk) 23:28, 24 May 2014 (UTC)

Why is it important to have paths for heat and work different from those for matter transfer? Paradoctor (talk) 23:38, 24 May 2014 (UTC)
The edit summary, and references given, state the well established reasons for the edit.Chjoaygame (talk) 00:54, 25 May 2014 (UTC)
a) Why isn't this in the article?
b) If this is "well established", then there should be secondary sources instead of the primary sources provided. Considering the level of generality we're talking about, this should mentioned in the major textbooks, shouldn't it? Paradoctor (talk) 09:10, 25 May 2014 (UTC)
c) The Born reference does not seem to support the statement, could this be an instance of WP:SYN? Please provide a quote of the Haase reference supporting the claim, so we can verify. Paradoctor (talk) 10:52, 25 May 2014 (UTC)
Thank you for drawing attention to this.
The usual introductory presentations of definitions of work and heat for thermodynamics start simply by considering closed systems, that is to say systems for which matter transfer is not allowed. For this, Born was a main leader of the move to insist that heat be defined as a residual transfer of energy after mechanical transfer as work is calculated. No question is considered then of what would be so if matter transfer were allowed. Many texts are vague or uninformative about it. The question is considered at some length in the Wikipedia article on the first law of thermodynamics. Perhaps that may help. I am not sure that it is appropriate for me to try to summarize that here.
Born on page 44 writes "... thermodynamics is definitely connected with walls or enclosures. ... in free contact with its neighbourhood ... the flux of energy and material constituents through its boundary, which themselves cannot be reduced to mechanics." In an appendix, on pages 146–149, Born provides more detail. He presents a model in which transfers of matter and of energy as work pass through physically separate portals. Such a separation is also shown in the diagram for the model in the present Wikipedia article.
The present problem is to analyze a change of the entropy content of a system due to a thermodynamic process. It is due not only to entropy production within the system during the process, but also to transfer of entropy. The present article tackles the problem by considering transfers of energy as heat and work. In general, those quantities are defined only in paths in which there is no transfer of matter, as shown in the diagram in the article. In paths in which there is transfer of matter, they are in general not definable. That is what Born means when he writes "which themselves cannot be reduced to mechanics". Haase on page 34 writes: "On the other hand, work and heat are not functions of state; they essentially relate to interactions of the system with its surroundings. Hence these quantities as defined heretofore have no definite meaning for open systems. (cf. Defay (1929). See also Haase (1956a))." Haase goes on to point out "There are however, important exceptions: the external work Wa and the dissipated work Wdiss can always be calculated from external actions. If, for instance, there is flow of electricity .... ". These exceptions are for quantities of work that pass by paths separate from matter transfer. Haase then summarizes "But the total work done on the open system remains indefinite."
Born and Haase are not alone. For example, Münster writes on page 50 "As explained in §14 the classical point of view (equivalence of heat and work) as well as Carathéodory's point of view (definition of heat) are meaningless for open systems."<Münster, A. (1970), Classical Thermodynamics, translated by E.S. Halberstadt, Wiley–Interscience, London, ISBN 0-471-62430-6.> Further references about this are in the Wikipedia article about the first law.
There are those who feel that this is not a happy situation, and they offer arbitrary definitions that are not tied to the basic physics of heat and work. Such arbitrary definitions are not appropriate here, where the paths are shown to be separate in the diagram.
One needs to be careful in reading texts on non-equilibrium thermodynamics, where the word heat is used in the vague senses of the nineteenth century, when it was still ok to speak as if heat were found within a body like internal energy, rather than in the strict sense nowadays used that insists that heat is energy only in transfer. I will not here go into detail about this, beyond commenting that it does not override the above comments.Chjoaygame (talk) 23:22, 25 May 2014 (UTC)
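For orientation, the entropy balance usually written in engineering treatments of open systems (a standard textbook form, not a quotation from the sources cited above) keeps the three contributions under discussion separate: entropy carried by heat across boundary segments at temperature <math>T_j</math>, entropy carried in or out with matter streams of specific entropy <math>s_k</math>, and entropy produced inside the system:
 <math>\frac{dS}{dt} = \sum_j \frac{\dot{Q}_j}{T_j} + \sum_k \dot{m}_k s_k + \dot{S}_{\mathrm{gen}}, \qquad \dot{S}_{\mathrm{gen}} \ge 0</math>
Note that this form presumes exactly the separation of paths the edit insists on: the <math>\dot{Q}_j</math> terms are defined only on paths that carry no matter.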
I see no response to the immediately foregoing. I think it justifies removal of the tag [failed verification]. As for the tag [why?], I am not persuaded that a detailed reason for the tagged sentence is needed in the article. The basic reason why it is needed is that the formula that is offered applies only to a special model as set out in the section and illustrated in the diagram in the article. Without that special model, the formula has no meaning.
I am writing in the reference to the special nature of the model, which was deleted, I think inappropriately.Chjoaygame (talk) 04:55, 30 May 2014 (UTC)

The side bar formula on work and entropy change

I agree. But the assumption of maintaining equilibrium should have started on the first equation, not the second. Otherwise it looks to the casual reader like you missed the chain rule in calculus. — Preceding unsigned comment added by 2601:C:8D80:249:250:8DFF:FEB5:7FA4 (talk) 22:24, 31 May 2014 (UTC)

Thermodynamic Systems Decay

In the opening paragraphs the article refers to isolated thermodynamic systems "evolving toward equilibrium" and further down "progressing toward equilibrium"; these statements are misleading in that they convey some idea of improvement. The correct term is decay (ref Peter Atkins - Creation Revisited). So I would like to suggest the revised wording.. "decay toward equilibrium" in all cases where this is the intended meaning. Vh mby (talk) 12:44, 18 May 2014 (UTC)

Does that not, in turn, convey some idea of deterioration? Wouldn't a more neutral term be "moving"? Although for me, "evolving" and "progressing" don't imply improvement necessarily, so I'm not sure there's a problem. W. P. Uzer (talk) 13:33, 18 May 2014 (UTC)
Sorry to burst your bubble here but of course decay means "some kind of deterioration".. That is the precise understanding of the Second Law which the article is supposed to convey! By declaring your personal ideas.. "evolving" and "progressing" don't necessarily imply improving, you make the very point I raise.. (confusion about the meaning of the Second Law.. which is certainly not 'neutral') I suggest you READ Peter Atkins before you air your personal ideas as encyclopedic. Vh mby (talk) 03:03, 19 May 2014 (UTC)
Technical terms sometimes can carry unfortunate semantics when interpreted using their more everyday sense, and it is difficult to steer clear of this. I think it is clear enough from the context that such semantics do not apply, and would suggest that no attention be given to it from this sort of perspective. OTOH, in a technical sense, "decay" does carry with it the meaning of a natural unidirectional tendency or relaxation to a "final state" (even if only in the macroscopic sense), as with radioactive decay. In this sense, "decay" seems to me to be a better fit. —Quondum 15:50, 18 May 2014 (UTC)
The meaning of the word decay as commonly understood does not colour or confound the technical meaning in any way (unlike the current terms). The processes that produce observable decay are the very same the mathematics attempts to explain. That I would suggest is why P Atkins uses it so emphatically. The entry by Mr/s W.P.Uzer demonstrates, I would suggest the need for such a change. Vh mby (talk) 03:03, 19 May 2014 (UTC)
If (roughly speaking) most scientists say "decay", then I'm quite happy for it to be used, but the fact that one author uses it is not conclusive in my opinion. As I understand your objection to "evolve" and "progress", it is also just your personal view (that these words imply some kind of improvement) - my personal view that decay implies some kind of worsening is then equally valid. W. P. Uzer (talk) 18:19, 19 May 2014 (UTC)

Well now we know why the article got de-listed from "good", It has suffered some decay [1]. If budding contributors don't understand it is their responsibility to find appropriate references to support their opinions and at the very least read and understand the validity, in the scientific community, of those given, I would suggest they don't belong here. So I shall include the reference..(Done) Vh mby (talk) 01:35, 22 May 2014 (UTC)

Reference

  1. ^ dictionary.com "decay" verb "to decline in excellence"

It seems Waleswatcher considers himself above Wiki protocols and above published authorities on the subject. My ref: Atkins, Peter W. (1992). Creation Revisited: The Origin of Space, Time and the Universe. Penguin Books. ISBN 0140174257. So I would request Waleswatcher to state here in this discussion page an answer to the question; To what (state) do isolated thermodynamic systems evolve..? Please include your reference. Mike 01:47, 31 May 2014 (UTC)

I'm not sure who the above comment is from, but if you look at the "evolve" versus "decay" section of this page, you will see I gave many references, with quotes, all to references used in PhD level physics courses on thermal and statistical physics (and hence substantially more authoritative than a pop science book written by a chemist). To answer your question, thermodynamics systems evolve to the equilibrium state. Waleswatcher (talk) 00:17, 1 June 2014 (UTC)
Well I did not realize you created a whole new topic, sorry about that.. but it appears to me you are just further confusing the issue.. see below. vh mby 03:31, 16 June 2014 (UTC) — Preceding unsigned comment added by Vh mby (talkcontribs)

"evolve" versus "decay"

Regarding this choice of terms - the problem with "decay" is that it implies that something changes form, disappears, or otherwise gets reduced or eliminated. But in many circumstances that is not what happens. For example when two systems are put into thermal contact, heat flows between them until their temperatures are equal. Nothing is decaying (well I suppose you could say the temperature difference is decaying, but that's convoluted and unclear). Similarly, drop some ink into a glass of water and the ink will mix into the water, increasing the entropy, but again nothing is "decaying". So decay is not necessary, "evolve" is more neutral and more accurate. That point of view is shared both by references (for instance Kittel and Kroemer never use the term decay in the many places they discuss the second law - their statement of it is simply that the entropy will most probably increase if the system is not in equilibrium) and other editors (W. P. Uzer above, and Chjoaygame here https://en.wikipedia.org/wiki/Talk:Second_law_of_thermodynamics ). Waleswatcher (talk) 16:06, 30 May 2014 (UTC)
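For the thermal-contact example, a minimal Python sketch (equal masses of water and a constant specific heat are assumed for illustration) showing that the total entropy increases even though nothing "decays":

    import math

    # Two equal masses of water at different temperatures are placed in
    # thermal contact and relax to a common final temperature.
    m = 1.0          # mass of each body in kg (assumed)
    c = 4184.0       # specific heat of water in J/(kg*K)
    T1, T2 = 280.0, 320.0   # initial temperatures in K (assumed)
    Tf = (T1 + T2) / 2      # final temperature for equal heat capacities

    # Each body's entropy change is m*c*ln(Tf/Ti): the hot body's entropy
    # falls, the cold body's rises by more, so the total is positive.
    dS_total = m * c * (math.log(Tf / T1) + math.log(Tf / T2))
    print(dS_total)  # about +18.6 J/K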

If you don't understand the meaning of entropy or deliberately want to confuse people (as the article so admirably does) then you won't know what decays.. and from what you say this seems to be the case..
Entropy is a measure of DISORDER and in every example you cite ORDER is what decays Mike 03:22, 16 June 2014 (UTC)

Here's another reference that uses evolve in exactly the same way as in the article, and never decay (this is just one I can easily link to - there are many more). P. 122: "The second law, as expressed in (4.13), is responsible for the observed arrow of time in the macroscopic world. Isolated systems can only evolve to states of equal or higher entropy." http://www.damtp.cam.ac.uk/user/tong/statphys/sp.pdf Waleswatcher (talk) 16:13, 30 May 2014 (UTC)

More references, just for fun.

People don't come here for FUN.. this is an encyclopedia not your playground Mike 03:22, 16 June 2014 (UTC)

From Kardar, Statistical Physics of Particles: "The time evolution of systems toward equilibrium is governed by the second law of thermodynamics.... Do all systems naturally evolve towards an equilibrium state?...What is the time evolution of a system that is not quite in equilibrium?... In contrast to kinetic theory, equilibrium statistical mechanics leaves out the question of how various systems evolve to a state of equilibrium." The term "decay" is never used in the context of the second law or entropy anywhere in the book; "evolve" is used throughout.

From Landau and Lifshitz, Statistical Physics: "...its macroscopic state will vary with time...the system continually passes from states of lower to those of higher entropy until finally the system reaches...". Again, no "decay".

From Huang, Introduction to Statistical Physics, explaining the second law in some examples of distributions of gas molecules: "...almost any state that looks like (a) will evolve into a uniform state like (b). But..."

Need I go on? Waleswatcher (talk) 17:03, 30 May 2014 (UTC)

You seem to think this site is the sole property of your favourite physicists and the place to champion your pet theory.. well it's not. Genuine scientific enquiry doesn't care about your favourite theory or the battle you are having with whoever (creationists in this case no doubt)
The dictionary usage and understanding of the word "evolve" is unequivocally "develop"..
"The Science Dictionary
To undergo biological evolution, as in the development of new species or new traits within a species.
To develop a characteristic through the process of evolution.
To undergo change and development, as the structures of the universe.
Example sentences
The world-and the employment marketplace-evolve and progress. Superbugs evolve when common bacterial infections develop resistance to the
The letter urges regulators to help each firm develop a plan that would evolve. Our culture evolves, sometimes rapidly, and teaching styles with it--but"
The usage is common across every dictionary I have checked.. The point is that in the encyclopedia, articles must use words consistent with their most commonly understood dictionary definition. You have absolutely no special 'rights' here and it appears to me you not only don't understand entropy.. you don't know what an encyclopedia is..
There are over 450 people watching this site so can I get some educated comments here please.. It is precisely this sort of misinformation and distortion which destroys the clarity of such important articles. Peter Atkins uses 'decay' because it is exactly what thermodynamic systems do. Mike 03:22, 16 June 2014 (UTC)
vh mby 03:57, 16 June 2014 (UTC)
Words often have somewhat different meanings in scientific contexts than in "ordinary" contexts - this applies equally to "decay" (which in commonly understood usage means to deteriorate or decompose) as it does to "evolve". Waleswatcher has given several examples (presumably there are many more) of the scientific use of "evolve" in this context; you seem to be relying on one single author who uses "decay". That being the case, at the present time, "evolve" seems to be the more appropriate choice. W. P. Uzer (talk) 07:52, 16 June 2014 (UTC)
Any one else..?? vh mby 08:42, 16 June 2014 (UTC) — Preceding unsigned comment added by Vh mby (talkcontribs) Sorry about the identity crisis I am having {vh mby = Mike = Gyroman} hope this works Gyroman (talk) 02:14, 17 June 2014 (UTC)
In physics, both "decay" and "evolve" get used with specific meanings. For example, the term "exponential decay" is well-established, and a web search on "decays towards" gives many hits, many of which relate to thermodynamics, but also in subjects such as engineering ("the voltage slowly decays to zero") and even biology (referring to population densities). One speaks of the time evolution of a system; this says nothing about where it is headed, merely that it may be changing as a function of time governed by physical laws. Thus the state of a system might be said to evolve according to the Schrödinger equation. The term "decay" is used to mean something more specific: that a process of relaxation is occurring. Systems evolve over time, but properties like entropy may decay over time, so I suggest we choose a term according to the subject of the sentence. In short, I suggest saying "systems evolve [in time]" and "entropy decays [towards thermodynamic equilibrium]". I see a sentence that is in need of attention: "Historically, the concept of entropy evolved in order to explain". I'd suggest "the concept of entropy arose". I also see the use of the word "progress" in "systems tend to progress", which could be replaced. —Quondum 14:48, 16 June 2014 (UTC)
"properties like entropy may decay over time" and "entropy decays [toward thermodynamic equilibrium]" This, I am very sorry to say, reveals an astounding level of IGNORANCE of the very term we are trying to define. The fact of your confusion (as a knowledgeable contributor?) only serves to confirm my objections to the confusion caused by the muddled meaning of "evolve" in this context. Gyroman (talk) 02:14, 17 June 2014 (UTC)
I think there is a sign problem involved with using the word "decay". In ordinary language, the word "decay" suggests a quantity which decreases with time, which might be your examples of voltage or population density under appropriately defined circumstances. But of course the entropy of a system plus its surroundings increases with time, and I think this is the root of many people's reluctance to use the word decay in relation to entropy. Dirac66 (talk) 15:30, 16 June 2014 (UTC)
And so the confusion continues.. Gyroman (talk) 02:14, 17 June 2014 (UTC)
I think we should stick to language as used in the discipline (by which I'm implying that I disagree that your point applies). My suggestion should however break the deadlock: the actual examples in the article refer to a system, not to the entropy, wherever the term "evolves" occurs, and thus "evolves" should be retained there. So while I feel that the term "decay" is the correct one to use when one is referring to what happens to the entropy (i.e. one should not say that the entropy of a system evolves), I see no examples in the article to apply it to. —Quondum 17:23, 16 June 2014 (UTC)
Physicists' publications are by definition biased.. (everyone is, actually). Encyclopedic content tries to overcome these biases by conferring with a broad cross-section of the relevant knowledgeable community. Those who write school text books (like P Atkins) are going to be scrutinized to a higher degree than those publishing books to make their own particular point to a limited audience. A lot of science is argued out between authors and positions change as a result. Ok, more than one author is needed on my part.. Let me thus put to you all a consensus of a highly reputable scientific community which has already been done for us. I refer to the Cambridge Encyclopedia (CE) fourth edition 2000.. In which it is stated..

"The main aim of the CE is to provide a succinct, systematic and readable guide to the facts, events, issues, beliefs, and achievements which make up the sum of human knowledge". Here of course "events, issues, beliefs etc" are simply factually reported. The CE is not misrepresenting beliefs as intrinsic statements of fact. Please don't get sidetracked on that.

It states "Entropy: In thermodynamics, a numerical measure of disorder; ... As a system becomes increasingly disordered its entropy increases. For example, the entropy of a system comprising a drop of ink and a tank of water increases when the drop of ink is added to the water and disperses through it, since dispersed ink is highly disordered. Entropy can never decrease, which in the ink-in-water example amounts to the observation that the particles of ink never spontaneously gather themselves back into a single drop." Before you start on the purist 'isolated system' requirement it is basically covered by the word "spontaneously" meaning 'of itself' or 'without external influence'. Perfectly clear to any reader.

Wikipedia should of course go further and cover history, people, mathematics, etc., but must on no account cloud or confuse the meaning already accepted by the relevant scientific community, which I would suggest is given in the CE. Your problem here is the meaning of the word "evolve". If you would not write "thermodynamic systems develop towards equilibrium", and you claim different meanings for the word 'evolve' depending upon the context, then you are faced with the unavoidable consequence of defining 'evolve' in this context. Which implies equating the 'evolve' used here to "increase in disorder", which is by no means obvious. For that reason I suggest the better and more appropriate word is the one used in the textbooks, "decay", meaning decay in state of order. The term 'decay' is, moreover, more closely related to the mathematical usage of decay as expressed for any single value, which in this case is a probability term, that a particular state exists. Gyroman (talk) 02:14, 17 June 2014 (UTC)

Ok done Gyroman (talk) 12:11, 18 June 2014 (UTC)

Would Mr W P Uzer care to define "evolve" in a manner clearly synonymous with what is known and accepted concerning any isolated system, i.e. that it will irreversibly undergo an "increase in disorder"! Gyroman (talk) 01:10, 19 June 2014 (UTC)

Gyroman (or Vh mby, or whatever your name is), "evolve" in this context simply means to change or develop from one form to another. That's how it's used in the references I gave, and that by the way is the dictionary definition in three places I just checked (not that the dictionary definition really matters - what matters here is the language reliable sources for this topic use). However I do agree with you that it's not an ideal term in that it might give the wrong impression to some people, but in my opinion (and it seems that of every other editor that has commented on the issue) "decay" is worse. Perhaps there is a third term that is better than both, but I can't think of one. Waleswatcher (talk) 05:59, 19 June 2014 (UTC)
My point exactly.. 'evolve' = change or develop (same as I said), is completely misleading in the context of trying to convey to ALL READERS the relentless downward trend in order which is the essential characteristic of every isolated thermodynamic system. Any reader checking the meaning of evolve, with its synonym 'develop', is going to get the wrong idea (may I say, just like you seem to have).. and so it seems our Mr Uzer is content with a consensus of ignorance.. Well, this is not a matter of individual, clearly erroneous opinion, and to leave it without clarification is unacceptable in this context. Either define evolve as the opposite of develop or get rid of it.. Gyroman (talk) 12:45, 23 June 2014 (UTC)
Would you care to say why you consider 'decay' worse than 'evolve'..? Gyroman (talk) 12:52, 23 June 2014 (UTC)
Would you also care to state how your "language reliable sources" define 'evolve'..? Gyroman (talk) 12:56, 23 June 2014 (UTC)
"Evolve" just seems to be the word that most scientists use in this context, whereas "decay" isn't, nor does "decay" seem a particularly apt metaphor (for what the system does, that is; it may be a good word for what the order does, as I think has been pointed out). I would be quite happy if the whole thing could be rewritten in such a way that more people might understand it, and this might well be done without using the word "evolve", but simply replacing that word with "decay" doesn't seem to me to be helping: it makes it more confusing for most people who have some familiarity with other scientific writing on the topic, and at best no less confusing for the rest. Like I said at the start, if there really is a problem here, I would suggest using a simple word like "move" or "go". W. P. Uzer (talk) 16:02, 23 June 2014 (UTC)
Let me understand you clearly.. 'order' decays.. but the 'system' doesn't?! There are two substantial problems with that..
(1) The absolute entropy of any system is based on one and only one variable: a number. It doesn't even have units.. it's a probability! (the thermodynamic probability that the particular state it is in exists, out of all possible states; cf. the reference note below). As the single descriptor it can hardly be used to account for two different results?
(2) How can you describe what is happening to the 'system' as different to the primary effect of order decay in the system without even mentioning it?
You did say you wanted more people to actually understand this.. Well, I would suggest confusion over the direction energy takes, as measured by its entropy, is at the very heart of the problem here. Equilibrium is the lowest state of any system by any descriptor you choose, and "move" or "go" fail miserably to convey what is actually happening! But 'evolve = develop' is manifestly its inverse!! Please explain? Gyroman (talk) 08:15, 25 June 2014 (UTC)
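For orientation, the statistical relation alluded to in point (1) above is Boltzmann's standard formula
<math>S = k_\text{B} \ln W,</math>
where W is the number of microstates compatible with the macrostate. Strictly, W is a count rather than a probability, and the factor k_B gives S the units joule per kelvin; the second law then reads as a tendency toward macrostates of larger W.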
You seem to want to make (metaphorical) value judgements about different physical states - ordered is "good", unordered is "bad", so to describe changing to a "worse" state you insist on using a term that implies worsening, and can't accept a term that (to you) implies improvement. But others, I suspect, have no such judgements in mind - they are happy to follow scientific convention in using the word "evolve" just to denote a spontaneous change from one state to another state, and might use "decay" to denote a decrease in some quantity (a system is not a quantity, though its order might be) or decomposition of some object (which is not what happens to the system here). I don't think this usage of "evolve" can lead to any particular confusion, as it is anyway stated in the same sentence that the "evolution" (change) is in the direction of thermal equilibrium, but using more everyday wording might make it more accessible to some readers. W. P. Uzer (talk) 11:20, 25 June 2014 (UTC)

We are not even on the same planet here.. HELLO! Is there anyone out there who understands
(a) The purpose of Dictionaries
(b) The meaning of words
(c) What happens to thermodynamic systems?
"good" and "bad" are not my words, "worse" is Waleswatchers word, mine was "improvement", as conveyed by the word "evolve", (almost universally given the synonym "develop" with clear examples showing development of new logical or structural elements), THAT IS NOT JUST MY "OPINION", and it is the OPPOSITE OF WHAT OCCURS to isolated thermodynamic systems! ORDER is not one of a number of "quantities" of the system it is the ONLY quantity of relevance to the word entropy.. and it goes in ONE direction, and that is most emphatically the opposite indicated by evolve. If you can't define evolve as you would like (its not your word to do that with), different to the dictionary then I say it is deceptive to persist with it and a major reason for the downgrade of clarity. (reader understanding) Gyroman (talk) 13:41, 26 June 2014 (UTC)
It's not my preferred meaning of evolve, it's what appears to be established scientific usage (as per evidence given a long time ago). W. P. Uzer (talk) 17:12, 26 June 2014 (UTC)

Note "system evolves" "system decays" The same holds on Google Web and Books, and also when adding any of the prepositions "to"/"into"/"towards". Paradoctor (talk) 01:38, 23 July 2014 (UTC)

asking for reliable source

I have asked for reliable sourcing for a statement in the article: "It has more recently been extended in the area of non-equilibrium thermodynamics."

An opinion is expressed by Lieb, E.H., Yngvason, J. (2003), "The Entropy of Classical Thermodynamics", chapter 8, pp. 147–195 of Entropy, edited by Greven, A., Keller, G., Warnecke, G. (2003), Princeton University Press, Princeton NJ, ISBN 0-691-11338-6. They write on page 190:

"Despite the fact that most physicists believe in such a nonequilibrium entropy, it has so far proved impossible to define it a clearly satisfactory way. (For example, Boltzmann's famous H-Theorem shows the steady increase of certain function called H. This, however is not the whole story, as Boltzmann himself knew; for one thing, in equilibrium (except for ideal gases), and for another, no one has so far proved the increase without making severe assumptions, and then only for a short time interval (cf. Lanford 1975) ...)
"It is not clear if entropy can be consistently extended to non-equilibrium situations in the desired way. ..."

An opinion is expressed by Grandy, W.T., Jr. (2008), Entropy and the Time Evolution of Macroscopic Systems, Oxford University Press, Oxford UK, ISBN 978-0-19-954617-6, on page 153. He writes:

"A century later Jaynes (1971) demonstrated that the H-theorem, therefore the Boltzmann equation upon which it is based, cannot be generally valid, even for a dilute gas. ..."Chjoaygame (talk) 11:33, 15 July 2014 (UTC)
Mathematical treatment of physical systems quite often results in sets of equations for which no analytical solution is possible.. 3D aerodynamics and aero-elasticity problems often end up that way. But that does not mean the physical situation cannot be described or measured in such a way as to give reliable conclusions about macro behaviour. The problem with your references, which do admirable in-depth mathematical treatment, may be a bit like not seeing the forest for the trees. If probability is properly considered as the sole driving force behind the second law or entropy increase, it is not too difficult to answer the question "What drives the direction of entropy change in non-equilibrium, isolated thermodynamic systems?" The answer simply falls out of the question! Since by definition it is only average behaviour in macroscopic terms being sought.. the system must tend toward a more probable state. Heat must still move from hot to cold.. etc. (a worked instance is sketched in the note below)
And by the way, the reference you are really missing is one that states that isolated thermodynamic systems evolve and defines evolve as the opposite of its synonym 'develop'! Gyroman (talk) 01:24, 23 July 2014 (UTC)
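A minimal worked instance of the "hot to cold" bookkeeping above, assuming two bodies at effectively fixed temperatures, one hotter and one colder, exchanging a small quantity of heat δQ > 0:
<math>\Delta S_\text{total} = -\frac{\delta Q}{T_\text{h}} + \frac{\delta Q}{T_\text{c}} = \delta Q\left(\frac{1}{T_\text{c}} - \frac{1}{T_\text{h}}\right) > 0 \quad\text{when } T_\text{h} > T_\text{c}.</math>
The total is positive precisely when heat passes from the hotter body to the colder; the reverse passage would make it negative, hence the observed one-way direction.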
Dear Editor Gyroman, one can see that you mean well here, but you have not come near to offering an adequately reliable source for that general claim about entropy.Chjoaygame (talk) 02:30, 23 July 2014 (UTC)

So what exactly is the quantity which increases according to the second law?

The above discussion is confusing to non-experts, because the second law was originally classical and is in fact frequently used to study non-equilibrium states. If Clausius and Kelvin said that the entropy of the (system plus surroundings) always increases for non-equilibrium processes, then they must have provided some definition of the quantity which increases. The two sentences questioned were "Historically, the classical thermodynamics definition developed first. It has more recently been extended in the area of non-equilibrium thermodynamics." The second sentence has now been deleted by Chjoaygame, and perhaps it was not quite accurate or reliably sourced. But I think it would be useful to readers to provide a more accurate explanation of why entropy can be used to describe non-equilibrium states and processes. I think it has to do with considering reversible processes which are infinitesimally removed from true equilibrium states, but I won't try to write a complete (and sourced) statement because I am certain that Chjoaygame can write a more accurate explanation of this point than I can. Please. Dirac66 (talk) 20:42, 23 July 2014 (UTC)

"disorder" Gyroman (talk) 01:33, 6 August 2014 (UTC)

I thought you would never ask!

The entropy of classical thermodynamics is a state variable for the energy picture U = U(S,V,{Nj}), and a state function for the entropy picture S = S(U,V,{Nj}), of a thermodynamic system in its own state of internal thermodynamic equilibrium. For physical systems in general, the classical entropy is not defined.
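For reference, the two pictures correspond to the standard Gibbs fundamental relation for such an equilibrium system,
<math>\mathrm{d}U = T\,\mathrm{d}S - p\,\mathrm{d}V + \sum_j \mu_j\,\mathrm{d}N_j, \qquad \frac{1}{T} = \left(\frac{\partial S}{\partial U}\right)_{V,\{N_j\}}.</math>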

A straightforward but perhaps long-winded, though very safe, statement of the second law might go as follows:

One may consider an initial set {i } of several thermodynamic systems, each in its own state of internal thermodynamic equilibrium, with entropies {Si }. There may then occur a thermodynamic operation by which the walls between those systems are changed in permeability or otherwise altered, so that there results a new and final set {f } of physical systems, at first not in thermodynamic equilibrium. Eventually they will settle into their own states of internal thermodynamic equilibrium, having entropies {Sf }. The second law asserts that

<math>\sum_f S_f \;\ge\; \sum_i S_i.</math>

Every natural thermodynamic process is irreversible. 'Reversible processes' in thermodynamics are virtual or fictional mathematical artifices, valuable, indeed practically indispensable, devices for equilibrium thermodynamic studies. Mathematical artifices nevertheless.

Thermodynamic operations have been implicitly recognized, though not so named, since the early days. Kelvin spoke of "inanimate agency": "It is impossible, by means of inanimate material agency, to derive mechanical effect from any portion of matter by cooling it below the temperature of the coldest of the surrounding objects." Logically this implies that he contemplates "animate agency"; that means, in modern language, 'thermodynamic operation'. They are essential for thermodynamic reasoning.

A step to dealing with non-equilibrium problems is to consider physical systems near enough to being in their own states of internal thermodynamic equilibrium that one can take the entropy to be the same function of the same state variables as for equilibrium. This is an approximation that works very well for many problems, and much valuable work has been done with it. But for an article on entropy to allow it, without specific notice that it is an approximation, I think is loose.
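Schematically, that local-equilibrium step amounts to assuming (and it is an assumption, not an exact result) that the equilibrium entropy function holds pointwise, so that, with ρ the mass density and s, u, v the specific entropy, internal energy, and volume,
<math>S(t) = \int_V \rho(\mathbf{x},t)\, s\bigl(u(\mathbf{x},t), v(\mathbf{x},t)\bigr)\, \mathrm{d}V.</math>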

A further step towards non-equilibrium thermodynamics is to try to work with a scalar entropy that is a function of an extended set of state variables, that for example may include fluxes and time rates of change or spatial gradients of classical state variables. This works for a wider range of problems. Again I think an article on entropy that intends to include this should say so explicitly, not just loosely imply it.

A further step towards non-equilibrium thermodynamics is to use a thoroughly non-classical extension of the concept of entropy, the multiple-time hierarchy of entropies. The two-time entropy is a function of a time interval, between the initial state and the final state. It provides a criterion for the expected direction of non-equilibrium processes, when the one-time entropy is not an adequate guide. But I would say that for Wikipedia it is to be regarded as research matter, and reliable sources for it are not too many. I think in this article it calls for explicit, rather than just vaguely or loosely implied, mention.

Loose statements that refer to changes in entropy in an isolated system I think are indeed loose statements, in an article on entropy. The classical entropy of a physical system not in its own state of internal thermodynamic equilibrium is not defined. The entropy of an isolated thermodynamic system in its own state of internal thermodynamic equilibrium does not change. Statements that refer to changes have many implicit but tacit presuppositions that are not likely to be apparent to readers not familiar with the subject.

Loose statements in this article can easily be used to support wild speculation, such as about the entropy of the universe, something that is hardly definable, and certainly not classically defined. How far do we want this article to supply such support?

Loose statements may be argued for because they 'help' readers who want a 'quick and efficient' glimpse of entropy. Perhaps. But are they really well served by inviting them to accept loose statements? And what about those who want to learn something factual and reliable?

It would not be easy to change the article to say this kind of thing.Chjoaygame (talk) 10:31, 24 July 2014 (UTC)Chjoaygame (talk) 10:42, 24 July 2014 (UTC)

Hm. I notice that some of these points are discussed in the article on nonequilibrium thermodynamics, so we could just add a link to that article for supplementary information instead of repeating everything.
However, I think there is a simpler version suitable for this article. I have now consulted several undergraduate textbooks to refresh my memory, and they say that entropy is a state function which is defined for a reversible path as the line integral of dQ/T, and for an irreversible path as having the same value as for the reversible path. So ΔS is operationally defined for any system as

<math>\Delta S = \int_\text{initial}^\text{final} \frac{\delta Q_\text{rev}}{T}.</math>
We do of course have this equation in the intro already, but I believe we should add that since S is a state function, it can also be used to evaluate ΔS for irreversible (nonequilibrium) changes if the integral is evaluated for a reversible path. We could also point out that for both equilibrium and nonequilibrium, this classical definition only gives ΔS and not an absolute S. Dirac66 (talk) 16:26, 24 July 2014 (UTC)
I am not proposing to try to say these things in the present article, and I am not suggesting that we should link this article to the one on non-equilibrium thermodynamics, which is rather messy. I am just saying that the present article might benefit from more caution in what it suggests of the universality of entropy as a guide to physics. I don't have any particular proposed changes to the article in mind right now.
The worry is not about irreversible processes. Physically, thermodynamic processes are all irreversible. Only mathematical virtual "processes" can be reversible. The worry is about processes that do not start and finish in states of thermodynamic equilibrium. Processes that exactly start and finish in states of thermodynamic equilibrium are found in laboratories, but not so often outside them.Chjoaygame (talk) 17:51, 24 July 2014 (UTC)
Yes, I realize that reversible processes and also equilibrium initial and final states are only idealizations which can approximate reality. This is of course true of many scientific concepts - for example pure substances do not exist as any analytical chemist will tell you. But all of these idealizations are still useful approximations to reality in favorable cases. Dirac66 (talk) 02:40, 26 July 2014 (UTC)
Of course you are right there.
I was just worried that the reader might get the impression that the way from classical entropy to a hoped-for thermodynamics of general physical processes was all plain sailing, guided by the Admiralty charts.Chjoaygame (talk) 02:56, 26 July 2014 (UTC)
So what is the answer to the original question..? Gyroman (talk) 01:43, 6 August 2014 (UTC)
  • In my opinion, the essential point is what I have described above as a simpler version suitable for this article. So I added the following words a few days ago, with sources given in the article: To find the entropy difference between any two states of a system, the integral must be evaluated for some reversible path between the initial and final states. Since entropy is a state function, the entropy change of the system for an irreversible path will be the same as for a reversible path between the same two states. However the entropy change of the surroundings will be different. I followed Chjoaygame's advice and did NOT link to the article on Nonequilibrium thermodynamics. As for the point about ΔS versus S, I realized that this is already in the article elsewhere. Dirac66 (talk) 02:04, 6 August 2014 (UTC)
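A standard worked example of the added sentences, for the free (Joule) expansion of n moles of an ideal gas to a larger volume, an irreversible change with no heat or work exchanged: evaluating the integral along a reversible isothermal path between the same two states gives
<math>\Delta S_\text{sys} = nR \ln\frac{V_\text{f}}{V_\text{i}} > 0,</math>
while the surroundings of the actual free expansion are untouched, so their entropy change is zero and the total increases. Had the same change of state been carried out reversibly (isothermally, against a piston), the surroundings would instead lose exactly nR ln(Vf/Vi) and the total would vanish; the system's ΔS is the same either way, which is the point of the added text.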

Classical vs Quantum

Why exactly is there a need for a citation for the statement - in the definitions section - that quantum statistical thermodynamics came after classical statistical thermodynamics? This is a well known historical fact. — Preceding unsigned comment added by 80.45.182.9 (talk) 14:02, 11 August 2014 (UTC)

Edit summary. Anyway, since it is "well known", it should be easy to state the WP:OBVIOUS. It's not as if a WP:CHALLENGE needs more than "I don't see it" to be valid. Paradoctor (talk) 18:04, 11 August 2014 (UTC)
Unsigned IP editor 80.45.182.9 refers to a request for a reference to the sentence "Later, thermodynamic entropy was more generally defined from a statistical thermodynamics viewpoint, in which the detailed constituents — modeled at first classically, e.g. Newtonian particles constituting a gas, and later quantum-mechanically (photons, phonons, spins, etc.) — were explicitly considered."
The problem with that sentence is that it is chatty to the point of near meaninglessness.
Editor 80.45.182.9 reads it as meaning "that quantum statistical mechanics came after classical thermodynamics". Editor 80.45.182.9 says it is a well known historical fact, and Editor Paradoctor comes near to saying it is obvious. If that were its meaning, it might be useful to change it to say just that in so many words. Or to delete it because it is superfluous or hardly relevant.
But it introduces other ideas as well, (1) "more generally defined" (not only due to quantum theory), and (2) the change from the macroscopic to the microscopic definition.
As to the notion that the microscopic definition is "more general" than the macroscopic definition, one may wonder what that notion really means.
I think the sentence is of poor quality and should be deleted as verging on the meaningless. Perhaps something better could be put in its place.Chjoaygame (talk) 02:09, 12 August 2014 (UTC)

notation

An unsigned edit has adverted to the notation for differentials here.

Sometimes a notation đQ has been used to denote an incomplete differential (e.g. Kirkwood & Oppenheim 1961, Tisza 1966, Callen 1960/1985, Adkins 1968/1983). Some authors use the notation q (Pippard 1957/1966, Guggenheim 1949/1967). Landsberg uses the symbol d'Q, perhaps because of the fonts available to him. I do not know how to put đQ into LaTeX; perhaps someone will kindly enlighten me.

Some authors do not mark the incompleteness by a special symbol (Born 1949, Buchdahl 1966, ter Haar & Wergeland 1966, Münster 1970, Kondepudi & Prigogine 1998, Bailyn 1994, Tschoegl 2000).

Often in Wikipedia the same object seems, I think (subject to correction), to be denoted δQ; in such notation that object is not a finite difference but an infinitesimal. Perhaps someone will check this out.

The quantity on the left-hand side of the equation in question is, on the other hand, a finite difference, not a differential. It is customarily, as in the lead of the article, denoted by a non-italic capital delta, ΔS, not an italic lower-case δS.

The word infinitesimal is so spelt, not as "infinitessimal". The word an is so spelt, not as "and".Chjoaygame (talk) 05:09, 12 August 2014 (UTC)

đQ is not given at WP:MATH and may be unavailable in Wikipedia LaTeX. I checked the External Link named The Comprehensive LaTeX Symbol List and found on p. 12 that the required markup is \textcrd. However, when I tested it here, it failed to parse, so presumably it is unsupported on Wikipedia. Dirac66 (talk) 12:21, 12 August 2014 (UTC)

Failed to parse (unknown function "\textcrd"): {\displaystyle \textcrd}

Thank you for that. It is as I suspected. I am guessing that Wikipedia uses the notation δQ because it is desired to explicitly mark the incompleteness, but that cannot be done with đQ in LaTeX in Wikipedia? So an alternative is used?Chjoaygame (talk) 01:15, 13 August 2014 (UTC)Chjoaygame (talk) 01:17, 13 August 2014 (UTC)
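For what it is worth, outside the Wikipedia parser one common construction in full LaTeX builds the crossed d with the same overlay trick plain TeX uses for \hbar. A sketch, assuming the default Computer Modern math fonts; the macro name \dbar is my own choice, not an established command:

 \documentclass{article}
 % Crossed d for inexact differentials, built like plain TeX's \hbar:
 % print the bar accent, then kern backwards and print d beneath it.
 \newcommand{\dbar}{{\mathchar'26\mkern-12mu d}}
 \begin{document}
 For a reversible change, $\dbar Q = T\,\mathrm{d}S$; more generally,
 $\mathrm{d}U = \dbar Q - \dbar W$.
 \end{document}

Whether the MediaWiki math parser accepts \mathchar I have not tested; if it does not, δQ remains the practical fallback here.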