Talk:Entropy/Archive 9
This is an archive of past discussions about Entropy. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page.
GA Sweeps (on hold)
This article has been reviewed as part of Wikipedia:WikiProject Good articles/Project quality task force in an effort to ensure all listed Good articles continue to meet the Good article criteria. In reviewing the article, I have found there are some issues that may need to be addressed, mainly whether having a separate "Introduction to" article is a sufficient replacement for writing at a general level. In my view it is not, but I'm happy to be convinced otherwise, or post the article at Good Article Reassessment for other opinions.
There are a few other fixes listed below that are needed to keep Entropy at a good article standard.
GA review – see WP:WIAGA for criteria
I was initially of two minds about using an "introduction to" article as a general-audience workaround. On the one hand, yes, it is a very good way to avoid the difficulties inherent in explaining advanced concepts at a basic level, but on the other hand, it bears similarities to a "criticisms of" POV fork. After reading through the lead of the article, I think there's a good reason to merge the "introduction to" article into this one: when writing for a general audience, one tends to focus more on clarity than absolute precision, and it is this clarity that is crucial for the lead of this sort of highly technical article. Also, most of the sections in this article are summaries of other articles, but still focused at a very high level. It would be better if the summaries were geared to a general audience and the high-level material left for the main articles.
- Is it reasonably well written?
- A. Prose quality:
- The lead could be made clearer and more focussed, especially the first paragraph. See comment after review.
- B. MoS compliance:
- There are some Manual of Style problems in addition to the general audience discussion above. They're only small errors, but there are quite a few of them. For example, remember to avoid slipping into textbook style and using "we" in derivations, and that punctuation after a math tag must go inside. See Wikipedia:Manual of Style (mathematics).
- Is it factually accurate and verifiable?
- A. References to sources:
- B. Citation of reliable sources where necessary:
- There are a number of unsourced facts, especially in the history section. The GA criteria specify that, at a minimum, every statement that could be contested must be sourced with an inline citation.
- C. No original research:
- The ice melting example has to count as original research unless it is sourced. As it's apparently used in many textbooks, this won't be hard to fix.
- Is it broad in its coverage?
- A. Major aspects:
- B. Focused:
- Is it neutral?
- Fair representation without bias:
- Is it stable?
- No edit wars, etc:
- Does it contain images to illustrate the topic?
- A. Images are copyright tagged, and non-free images have fair use rationales:
- B. Images are provided where possible and appropriate, with suitable captions:
- Overall:
- Pass or Fail:
Regarding the lead, it contains 7 different definitions of entropy:
- [it] is a measure of the unavailability of a system’s energy to do work.
- is a measure of the randomness of molecules in a system.
- entropy is a function of a quantity of heat which shows the possibility of conversion of that heat into work.
- [is] the transformation-content, i.e. dissipative energy use, of a thermodynamic system or working body of chemical species during a change of state
- [it has] often been defined as a change to a more disordered state at a molecular level.
- has been interpreted in terms of the "dispersal" of energy.
- is defined by the differential quantity dS = δQ / T
While I'm sure they're all true, it makes the lead seem very cluttered, and I come away with a cluttered idea of what entropy is. I think that if the lead were refactored such that each paragraph had a clear, single focus, it would improve the article dramatically.
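As a concrete instance of the last definition in the list (dS = δQ/T), the ice-melting example mentioned in the review can be worked numerically. This is only a sketch: it assumes the standard textbook value for the latent heat of fusion of water (about 334 J/g) and a mass of 1 kg chosen purely for illustration.

```python
# Entropy change for melting 1 kg of ice at 0 °C, using dS = δQ/T.
# Melting is isothermal, so the integral collapses to ΔS = Q/T.
L_FUSION = 334.0        # J/g, latent heat of fusion of water (textbook value)
mass_g = 1000.0         # 1 kg of ice, an illustrative amount
T = 273.15              # K, melting point of ice at 1 atm

Q = L_FUSION * mass_g   # heat absorbed by the ice, in joules
delta_S = Q / T         # entropy gained by the ice, in J/K

print(f"Q = {Q:.0f} J, ΔS = {delta_S:.1f} J/K")  # ΔS ≈ 1222.8 J/K
```

Because the process is isothermal, no integration is needed; for a process whose temperature changes, δQ/T would have to be integrated along the path.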
Feel free to drop a message here if you have any questions, and many thanks for all the hard work that has gone into this article thus far! --jwandersTalk 03:58, 19 February 2008 (UTC)
Delisted from GA. See review directly above. --jwandersTalk 22:28, 20 February 2008 (UTC)
other sections on entropy: economics, and as metaphor
There should be some other sections on entropy as it appears in macroeconomic theory, and as it has been used outside of science as a metaphor.
The first: I do not mean the technical (information theory) use of entropy as a measure of this or that quality of information generated about one or another economic variable. I mean: there's a revolution happening in economics, as the old Newtonian mechanism is being replaced by a macro perspective that understands that economic processes are one-way flows that happen in time, in which scarce (low entropy) matter and energy is taken up into the economy and degraded (high entropy) matter and energy is extruded/exhausted/discarded. I wrote a 'graph on this that seems to have disappeared. Maybe I didn't follow directions or rules?
The second: the idea of entropy has been widely used by poets, historians, fiction writers, thinkers of all kinds, some of whom understood it, some of whom didn't. Still, a comprehensive article on the subject could give a brief survey of these uses. I wrote some text on this, too, and it isn't there today. Problem? —Preceding unsigned comment added by 128.252.254.30 (talk) 19:24, 1 March 2008 (UTC)
Lede context
Hello,
A few questions, which I post without any
- is thermodynamics a branch of physics? I always thought of it as a branch of chemistry - particularly the statistical mechanics considerations, though I can see how it could go either way.
- Is entropy really a purely thermodynamic property? I would have thought that entropy is a statistical property which finds use in fields such as thermodynamics, information theory, etc.
Maybe I am just favouring statistical mechanics... Thanks User A1 (talk) 22:59, 25 March 2008 (UTC)
- Ad 1. Our Thermodynamics article starts like this: "Thermodynamics (...) is a branch of physics that studies the effects of changes...". The article History of entropy described how the notion was already well developed (see also Classical thermodynamics and Entropy (classical thermodynamics)) before the statistical explanation was developed (see Statistical thermodynamics and Entropy (statistical thermodynamics)).
- Ad 2. Entropy is not a purely thermodynamic concept, although it originally was, and the statistical definition used in thermodynamics is specific to that field. However, as it is, it is the thermodynamic concept that is described by this article. I am in favour of renaming this article Entropy (thermodynamics), a name that currently redirects here, as does Thermodynamic entropy. See also the discussion raging at Talk:Entropy (disambiguation). --Lambiam 21:40, 26 March 2008 (UTC)
Requested move
Entropy → Entropy (thermodynamics) — The article appears to discuss thermodynamics only, and fails to review entropy in other branches of physics, information science and mathematics. —linas (talk) 04:14, 27 March 2008 (UTC)
Once again, the stupidity of masses rears its ugly head, as the above exhibits in spades. At the risk of being uncivil, I say "fuck wikipedia". If this is what the cornhole editors with their heads stuck up their asses want, this is what they get. linas (talk) 02:16, 8 April 2008 (UTC)
Error in Explanation
'then entropy may be (most concretely) visualized as the "scrap" or "useless" energy'
Usually in an article discussing a useful combination of more basic physical quantities, the units of the item are given. In this article they are not explicitly covered. Big mistake. And it leads to incorrect statements like the one above. Entropy is not energy. The term energy has a whole lot of baggage that comes with it, and to suggest that entropy carries the same baggage (say like conservation) contributes to a gross misunderstanding of what is going on. I hope authors/editors will be much more careful. Properly presenting the ideas of physical chemistry requires much more rigor than present in this article. blackcloak (talk) 05:32, 7 June 2008 (UTC)
- Thanks for the comment, this article has been subject to a sort of tug of war between various perceptions of how to explain a difficult concept involving advanced mathematics in a simple way accessible to the layman. This earlier version was edited by an educator, and may be nearer what you were looking for. The article's gone through numerous intermediate stages, as in this version, and the lead has been stripped down to the point where it's probably missing out on essentials while still including misleading cruft. Rather beyond me, but your assistance in a rewrite will be greatly appreciated. Note, of course, that thermodynamic entropy applies to more than physical chemistry. . dave souza, talk 08:14, 7 June 2008 (UTC)
- Well, today the average lay person is much more familiar with information theoretical concepts because many people have a computer these days (certainly those people who visit this page :) ). So we can explain the rigorous formulation much more easily than, say, Landauer could half a century ago. Why can't Maxwell's demon be effective? Today that's almost a no-brainer to a ten-year-old. Count Iblis (talk) 13:29, 7 June 2008 (UTC)
I rarely have time these days to think about this article, but I want to make a comment in response to blackcloak. I suggest that the urge to follow "the ideas of physical chemistry requires much more rigor than present in this article" does more harm than good and probably explains why students in physical chemistry never really understand entropy. What is needed at least at first is not rigor but clarity, so the reader can see what entropy is actually about and why they need to learn about it. Rigor can follow later. I am not of course suggesting that the "clarity" phase should be false, but it does not need to be rigorous. It also needs to take into account that many students have a poor background in mathematics. --Bduke (talk) 22:36, 7 June 2008 (UTC)
- On the other hand, claiming entropy is "scrap or useless energy" is not clear, and is not good. It does not help understanding if entropy is confused with energy. The unusable energy is TR S, where TR is the temperature of the coldest accessible reservoir. Jheald (talk) 20:08, 9 June 2008 (UTC)
- I entirely agree. My concern is the general urge for total rigour that can make the article totally unclear. --Bduke (talk) 02:29, 10 June 2008 (UTC)
- Why don't we say that the entropy of a system is the maximum amount of information you could theoretically store in the system without affecting its macroscopic state? This definition is perfectly understandable to most lay people. Count Iblis (talk) 02:45, 10 June 2008 (UTC)
- It is totally confusing to someone who comes across the term "entropy" in connection with chemistry. But even wider, where on earth do you get the idea that it "is perfectly understandable to most lay people"? Can you back that up with a source.--Bduke (talk) 23:01, 10 June 2008 (UTC)
- Well, the problem is that people are taught thermal physics in the wrong way in high school. At university part of our work is to let the students unlearn what they learned in high school. Entropy is fundamentally an information theoretical or statistical concept, just like heat, temperature etc. are. If we just say that entropy is related to heat and temperature, we aren't explaining anything.
- I'm not saying that we should explain everything mathematically, that's not necessary. Compare e.g. how Chaitin explains Gödel's theorem to lay people. The information theoretical approach clearly works very well here, precisely because lay people are familiar with computers, computer memory etc. etc.. Count Iblis (talk) 21:23, 11 June 2008 (UTC)
- I do not know where you come from, but most of the students I have taught physical chemistry to in the last few decades had not even studied physics at High School and had not come across entropy before. I see you are a physicist. It seems to me that what is familiar and obvious to your students is very far from being so to everyone else. I give up. I just do not see a way forward. This article will continue to be dominated by physicists and it will continue to be totally unclear and unhelpful to anybody else. --Bduke (talk) 23:17, 11 June 2008 (UTC)
- We do have to explain everything from the start to the students. I don't really believe that genuinely interested people who are willing to learn can fail to understand something as simple as entropy. But they do have to be open to the idea that their intuitive ideas about entropy, heat and temperature may be wrong.
- The reason why people find physics difficult is because we don't teach it properly until the students go to university. Just think about how well you would read and write English if you were not taught to read and write until you were 18 years old. Now, if our reaction to this problem is to dumb things down even more, we are only going to make the problem worse. We have to keep in mind that Wikipedia is also read by many children in primary and high school. They would benefit from being exposed to real physics instead of the dumbed-down physics stuff they are taught in school. Count Iblis (talk) 02:40, 12 June 2008 (UTC)
- BDuke, just thought I would wade in here. If you want to make progress, one suggestion would be to create a page in your own user namespace, e.g. User:Bduke/entropy, and then use that to construct what you believe to be a good modification - that way you can actually point at something and say "this is a better explanation; what do you think?" rather than "currently the way the article does it is wrong; we should do it a better way". More likely you will get a more enthusiastic response from other editors. See WP:BOLD. On the other hand, this is more work :( - Can't win 'em all, huh? User A1 (talk) 00:40, 12 June 2008 (UTC)
- I actually did that long ago, but under a different title which I forget. I deleted it. It led to a rewrite of the intro para as Dave Souza mentions in the second para above in this section. I had other things to do and it just reverted back to where it is now. It is just too hard unless others recognise that we do have a real problem with this article and many others. I just do not have the time to fight this alone. --Bduke (talk) 01:01, 12 June 2008 (UTC)
Defining entropy as the maximum amount of information you could theoretically store in the system without affecting its macroscopic state is not understandable to most lay people IMO. That definition only makes sense if the reader is acquainted with a quite technical meaning of "information", which takes the reader who doesn't know it in a nearly circular path of confusion. It is also counterintuitive to suggest that a gas "holds" more information than a solid, for example. What do you mean by "hold"? Why are hard drives not gaseous then? ;-) Like I suggested above already, I think it is best to start with a clear and unambiguous definition such as [5], even if it doesn't explain what entropy is good for or what it "is". The analogies and examples can come later. --Itub (talk) 08:54, 12 June 2008 (UTC)
Once more into the lead, dear friends
As a layman, my opinion is that the current lead section has some problems. In the opening sentence – "In thermodynamics (a branch of physics), entropy is a measure of the unavailability of a system’s energy to do work." – "(a branch of physics)" is superfluous and misleading as it's also a branch of chemistry and mechanical engineering. Best explained in more depth later.
"It is a measure of the randomness of molecules in a system" is completely baffling to me. The jump from availability of energy to do work to "randomness" of molecules makes no sense. I'd be much happier with modified borrowings from the introduction to entropy article, explaining what it is and how it's measured in statistical terms. Thus, a proposal, concluding with the statistical and information meanings. The derivation of the term adds nothing to basic understanding, and so should be moved to the body of the text. –
- In thermodynamics, entropy is a measure of the unavailability of a system’s energy to do work. It provides a measure of certain aspects of energy in relation to absolute temperature, and is one of the three basic thermodynamic potentials: U (internal energy), S (entropy) and A (Helmholtz energy). Entropy is a measure of the uniformity of the distribution of energy. In calculations, entropy is symbolised by S and is a measure at a particular instant, a state function. Thus entropy as energy Q in relation to absolute temperature T is expressed as S = Q/T. Often change in entropy, symbolised by ΔS, is referred to in relation to change in energy, δQ.
- It is central to the second law of thermodynamics and the fundamental thermodynamic relation, which deal with physical processes and whether they occur spontaneously. Spontaneous changes, in isolated systems, occur with an increase in entropy. Spontaneous changes tend to smooth out differences in temperature, pressure, density, and chemical potential that may exist in a system, and entropy is thus a measure of how far this smoothing-out process has progressed.
- Statistical mechanics introduces calculation of entropy using probability theory to find the number of possible microstates at an instant, any one of which will contain all the energy of the system at that instant. The calculation shows the probability, which is enabled by the energy: in terms of heat, by the motional energy of molecules. Statistical mechanical entropy is mathematically similar to Shannon entropy which is part of information theory, where energy is not involved. This similarity means that some probabilistic aspects of thermodynamics are replicated in information theory.
The question of use in physics, chemistry and engineering is covered by the mention of temperature, pressure, density, and chemical potential, in my opinion. To refer to the previous discussion, saying that "the entropy of a system is the maximum amount of information you could theoretically store in the system without affecting its macroscopic state" is meaningless to me. If it has some deep meaning not covered by the last paragraph, perhaps that explanation could be simplified and added. . . dave souza, talk 09:30, 24 June 2008 (UTC)
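The mathematical similarity between statistical-mechanical entropy and Shannon entropy mentioned in the proposed third paragraph can be shown in a few lines: both are the same sum over microstate probabilities, differing only by a constant factor (Boltzmann's constant) and the base of the logarithm. The four-microstate distribution below is purely illustrative, and the function names are ours.

```python
import math

# Gibbs (statistical-mechanical) entropy:  S = -k_B * Σ p ln p   (J/K)
# Shannon (information) entropy:           H = -Σ p log2 p       (bits)
K_B = 1.380649e-23  # J/K, Boltzmann constant (exact by the 2019 SI definition)

def gibbs_entropy(probs):
    """Thermodynamic entropy of a microstate distribution, in J/K."""
    return -K_B * sum(p * math.log(p) for p in probs if p > 0)

def shannon_entropy(probs):
    """Information entropy of the same distribution, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Four equally likely microstates: H = 2 bits, S = k_B * ln 4
probs = [0.25, 0.25, 0.25, 0.25]
print(shannon_entropy(probs))        # 2.0 bits
print(gibbs_entropy(probs) / K_B)    # ln 4 ≈ 1.386 (dimensionless, in nats)
```

The factor k_B is the only thing that makes thermodynamic entropy carry units of J/K; divided out, the two formulas coincide up to the choice of logarithm base.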
- Not so bad as a starting point, but I do have some issues:
- a measure of certain aspects of energy isn't good. It's vague to the point of being confusing, rather than clarifying. (1) It's too abstract. Entropy is a particular property of particular physical systems. On the other hand certain aspects of energy suggests (to me) certain characteristics of energy such as eg that energy can't be created or destroyed. But what would be a measure of that? So I find the phrasing unintuitive and confusing. (2) It's not just energy that entropy is associated with. You can know things about the system that change its entropy, without changing its energy distribution.
- Thus entropy as energy Q in relation to absolute temperature T is expressed as S = Q/T. Entropy isn't energy. And the relationship is a differential one, because adding heat to a system will generally change its temperature. Furthermore, Q is not a state function. (Incidentally, a measure at a particular instant is a curious way to define a state function; though I think I see what you're getting at).
- the number of possible microstates at an instant, any one of which will contain all the energy of the system at that instant. What you mean, I think, is that a microstate contains all the information of the whole system; so defines exactly how all the energy is distributed across the whole system at that instant. But what's written is very unintuitive. I fear anyone who doesn't already know what a microstate is will inevitably think of all the energy piled up in one place.
- The calculation shows the probability, which is enabled by the energy: in terms of heat, by the motional energy of molecules. This is gibberish. What is it trying to say?
- As for entropy as a measure of the freedom/randomness/uncertainty/information of a system. This isn't meant to follow intuitively from the "unavailability of a system's energy to do work". Rather, it's a deeper, contrasting, microscopic view of what entropy fundamentally is. The laws of physics are deterministic, and they preserve uncertainty. So if you're pushing heat through a heat engine, there has to be at least as much uncertainty (entropy) associated with it at the exhaust stage as at the input stage. If the exhaust is at a temperature TR, that means an energy of at least TR S must be dumped as waste heat, if S was the entropy put in at the input stage.
- That's why saying the freedom/randomness/uncertainty/information can't decrease is equivalent to there being a limit on the availability of the system's energy to do work. Jheald (talk) 10:41, 24 June 2008 (UTC)
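Jheald's point that the unusable energy is T_R · S can be checked with a short calculation: for heat Q drawn from a hot reservoir at T_H, the entropy carried in is S = Q/T_H, at least T_R · S must be dumped into the coldest accessible reservoir at T_R, and the remainder reproduces the Carnot efficiency. The reservoir temperatures below are illustrative, not from the discussion.

```python
# Heat drawn from a hot reservoir carries entropy S = Q / T_H with it.
# At least T_R * S must leave as waste heat at the cold reservoir, so the
# maximum work is Q - T_R * S = Q * (1 - T_R / T_H), the Carnot bound.
Q = 1000.0    # J, heat drawn from the hot reservoir (illustrative)
T_H = 500.0   # K, hot reservoir temperature (illustrative)
T_R = 300.0   # K, coldest accessible reservoir (illustrative)

S = Q / T_H                 # entropy carried by the incoming heat, J/K
waste = T_R * S             # minimum unusable ("unavailable") energy, J
work_max = Q - waste        # maximum extractable work, J

carnot = 1 - T_R / T_H
assert abs(work_max / Q - carnot) < 1e-12

print(f"S = {S} J/K, waste >= {waste} J, max work = {work_max} J")
# S = 2.0 J/K, waste >= 600.0 J, max work = 400.0 J
```

This is exactly the link between the two views in the thread: the uncertainty (entropy) that cannot decrease is the same S that fixes how much of the energy is unavailable for work.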
- Perhaps it would help if we don't stick to the wiki-convention of defining the term in the first sentence. We could just say in the first sentence that entropy is an important quantity in thermal physics, and then give a general introduction of thermal physics. We should not mention entropy again until we are ready to give a precise definition that is understandable to the reader who has read the text till that point. Count Iblis (talk) 13:32, 24 June 2008 (UTC)
User Comment
I have a B.S. in physics, and I am studying for the subject test, so I use Wikipedia a lot. I really appreciate how useful Wikipedia is. I have a collection of texts that I have either saved from undergraduate classes or purchased here and there, but the articles on Wikipedia usually clarify some otherwise mystifying points about certain topics. While I deeply appreciate all the efforts that everyone has made to present a valid, understandable presentation, there are two things that concern me. One is about the contributors to Wikipedia in general, and the other is about this article in particular.
First, I know that one of the specified conditions is that people be courteous to each other, but they are obviously not. This disturbs me on several levels. It disturbs me on a casual level, just because it is so distracting when I'm trying to study. When I run into all this rude stuff that people are writing to and about each other, it makes me want to just shut it down. On a more professional level, it seems to reflect the general lack of respect that scientific types seem to have for each other. I don't know if it's because of competitiveness, or because we spend too little time interacting with people, that we don't learn to treat others the way we want them to treat us. I think it's some of both. Regardless, I wish you people would be nice.
Now, about the article. Somebody break out the smelling salts. I'm interested in the concept of entropy as it pertains to physics. It is distracting to include the informational aspects of the concept mixed in with an article about physics. The two should be separate. At the beginning, there should be a link for people who are interested in the informational aspects of the concept. Just because both uses of the term are equally valid and important doesn't mean that they have to be mixed together in the same article. I have seen countless other articles where this problem has been addressed very effectively. The reader is directed to the article about the usage that he or she is interested in. If one so desires, he or she can go back and read the other article. Putting the information in separate articles doesn't slight one discipline or the other. Putting them together is not efficient for the reader. I don't know why there needs to be so much debate about it. It seems clearly disorganized to mix it all together. Thank you for all your hard work. Is there enough bread around my compliment, critique, compliment sandwich? —Preceding unsigned comment added by 98.163.102.226 (talk) 17:35, 8 July 2008 (UTC)
Entropy and Information theory
On 9 July 2008 the following text was added by Kissnmakeup. These comments are not appropriate for the article Entropy, but are most appropriate for this Talk page. The text was deleted by Jheald on 9 July.
- This does not belong in this article. This article is about thermodynamics. There is a link at the top of this article for readers who want to know about information theory. The person or people who insist that this should be here need to reconsider out of courtesy and common sense. Both uses of the term should have their own articles. There is no reason why an article about thermodynamics should include a discussion about information theory.
I am restoring the text, this time to Talk:Entropy, for the benefit of those who regularly work on Entropy. Dolphin51 (talk) 03:00, 10 July 2008 (UTC)
- Fair enough. I think the section at the moment clearly isn't explaining well enough what I also wrote in my edit summary, viz. that many people do find this interpretation very useful as a way of understanding *thermodynamic* entropy.
- Specifically, (as the Maximum entropy thermodynamics page puts it) that statistical mechanics should be seen as an inference process: a specific application of inference techniques rooted in information theory. And that thermodynamic entropy is exactly the same thing as information entropy - it is simply an application of information entropy to a particular situation, and a particular framework of questions of interest.
- So thermodynamic entropy represents the amount of information you do not have about the exact state of a system, if all you know are the values of the macroscopic variables.
- That explains why learning information about the system (eg as in the Szilard engine thought experiment) can change its entropy, i.e. allow you to extract additional useful work, even if nothing has changed about the system itself.
- It also, for many physicists, answers the metaphysical questions like "So what is entropy? Where does it come from?" Answer: it is exactly the same thing as information entropy, and it comes from where information entropy comes from.
- The article would be better if it presented this much more directly. At the moment, what physicists who take this line actually think, and why they think it, is very much buried. It's not surprising that readers who haven't met the idea before are being confused. In addition, the last paragraph of the current #Entropy and Information theory simply isn't true. There's a very straightforward interpretation of δQ = T dS: it's the definition of temperature -- how much energy you have to put in to increase the information entropy of the system by one nat. In systems of natural units, temperature is measured in energy units per nat.
- Secondly, consider the second law of thermodynamics. The information interpretation is what reconciles the Second Law with determinism and Liouville's theorem. The laws of the universe preserve information (Liouville's theorem) - but as time goes on, they make it less useful, as more of the information you had becomes related to microscopic correlations rather than macroscopic properties. So the effective amount of information you can use about the system has gone down, corresponding to an increase in its classical thermodynamic entropy.
- Most theoretical physicists think of entropy in this way, I would claim. So, for example, Seth Lloyd, Programming the Universe, just because I happen to have it to hand. Page 65 (and following): "In particular, the physical quantity known as entropy came to be seen as a measure of information registered by the individual atoms that make up matter."
- Information entropy has everything to do with thermodynamic entropy. Jheald (talk) 08:10, 10 July 2008 (UTC)
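Jheald's reading of δQ = T dS as "energy per nat" can be made concrete in SI units: one nat of information corresponds to k_B of thermodynamic entropy, so the energy cost of one nat at temperature T is k_B·T, and one bit (ln 2 nats) costs k_B·T·ln 2, which is the Landauer limit. The sketch below only restates that arithmetic; the room temperature chosen is illustrative.

```python
import math

K_B = 1.380649e-23   # J/K, Boltzmann constant (exact by the 2019 SI definition)

def energy_per_nat(T):
    """Heat needed to raise a system's information entropy by one nat, at temperature T (K)."""
    return K_B * T          # one nat of information = k_B of thermodynamic entropy

def energy_per_bit(T):
    """Landauer limit: heat associated with one bit (ln 2 nats) at temperature T (K)."""
    return K_B * T * math.log(2)

T_room = 300.0  # K, illustrative "room temperature"
print(energy_per_nat(T_room))   # ≈ 4.14e-21 J per nat
print(energy_per_bit(T_room))   # ≈ 2.87e-21 J per bit
```

In natural units where k_B = 1, entropy is dimensionless (measured directly in nats) and temperature carries the units of energy per nat, exactly as described above.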
Further User Comment
I will concede that the yes or no question analogy illuminates the similarity between information entropy and thermodynamic entropy, but it doesn't explain thermodynamic entropy, nor does it pertain to thermodynamic entropy. To say that thermodynamic entropy is information entropy does not explain thermodynamic entropy unless one already understands information entropy which is A DIFFERENT FIELD OF STUDY. At least the vegetable soup part has been moved out of the way of the part that matters. —Preceding unsigned comment added by Kissnmakeup (talk • contribs) 23:32, 14 July 2008 (UTC)
- The information theoretical definition is the fundamental definition of entropy. Most modern textbooks on thermodynamics and statistical physics teach it that way. The old fashioned way of introducing thermal physics in terms of heat engines etc. does not explain what entropy is at all. In that approach you simply have to postulate the existence of a quantity called entropy which is then related to heat and temperature. Neither heat, nor entropy can be rigorously defined in this approach. Count Iblis (talk) 00:21, 15 July 2008 (UTC)
Actually, the statistical mechanical definition makes perfect sense to me. There are x number of possible quantum states with an equal probability of the system being in any one of those states. It's like a digital computer. You have x number of possible yeses or nos, 0's or 1's, trues or falses, or blacks or whites. Whatever you call it, to say that is informational is like saying that two electrons can't have the same quantum numbers because Pauli said so. Two electrons couldn't have the same quantum numbers before Pauli came along. To say that it is because Pauli said so is silly and certainly doesn't explain physically why they can't be the same. Just like information entropy doesn't explain thermodynamics unless you already know it as information entropy. Entropy existed before information science. I think that you should include a discussion about Boolean algebra, the binomial distribution, and machine language for those who already understand the same concepts from still other points of view.
This is the biggest problem in education. To teach, one must explain things in terms that the student can understand. By student, I mean someone who doesn't already know the concepts or the terminology. That is why they are students. The more one learns about something, the less connected he or she becomes with the novice. The concepts and terminology become so entrenched in the brain that it becomes impossible for an "expert" to look at it from the point of view of someone who doesn't know it yet. People who are emotionally needy try to make another feel stupid for not understanding it. Such people are incapable of actually teaching, even though they may be employed as teachers. To teach, one must be able to put himself in the shoes of someone who has never heard of it, and respect the effort to learn. Treat the student with dignity, and put it in terms that can be understood. I know that is what you all are trying to do. You're doing a tremendous work in providing this information. I have a great deal of reverence for it, being from a background that is not exactly "two commas". Knowledge should not be for sale.
Now, back to entropy.
Another thing that is confusing about discussions of entropy is the lack of stress on the idea that when a system is disturbed from equilibrium, the entropy is decreased by the constraints or whatever causes the departure from equilibrium. Granted, the definitions state that the entropy of "spontaneous" (meaning "undisturbed"?) processes always increases. It seems that the idea of entropy increasing is stressed so much more than the spontaneous part that it leads a novice to miss or forget the reverse process in which entropy decreases, which is very misleading. Thank you. Kissnmakeup (talk) 11:57, 18 July 2008 (UTC)
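As an aside on the "x equally probable states, like a digital computer" analogy in the comment above, the bookkeeping can be sketched with Boltzmann's formula S = k_B ln Ω. This is only an illustrative sketch (the function name is my own, not from the article): N independent two-state "bits" have Ω = 2^N microstates, so the thermodynamic entropy grows linearly with the number of bits.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(num_microstates):
    """S = k_B * ln(Omega), for equally probable microstates."""
    return K_B * math.log(num_microstates)

# N independent two-state particles ("bits") have Omega = 2**N microstates,
# so S = N * k_B * ln 2: one "bit" of missing information per particle.
N = 10
S = boltzmann_entropy(2 ** N)
```

So on this picture the connection to binary information is not an extra theory to learn first; it is already sitting inside the ln Ω.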
Entropy and Information/Communication Theory
This article seems to be an overview of entropy as the term has been used in many disciplines, with the main emphasis on its use in thermodynamics. The section on information theory omits a split in meaning that, while not very consequential, needs to be pointed out. I believe that Shannon used the term for the information rate of a channel. If we define the information (in bits) of a symbol as $\log_2(1/p)$, where p is the probability of that symbol being emitted by the source, then the entropy is the mean information per symbol emitted, i.e. $\sum_i p_i \log_2(1/p_i)$. Yet the article contains this snippet: "and the entropy of the message system was a measure of how much information was in the message", which is treating entropy as information.
So one definition makes entropy be a synonym for information, and the other makes entropy be an information rate.
Finally, there is this sentence: "For the case of equal probabilities (i.e. each message is equally probable), the Shannon entropy (in bits) is just the number of yes/no questions needed to determine the content of the message." This leads the reader to a psychological idea of entropy, which is very far from the notion of entropy in information theory, which has to do with probabilities of symbols. And the qualifier about equally probable messages is really confusing. I don't know why it is needed, nor what the author was trying to get at. DMJ001 (talk) 04:29, 6 August 2008 (UTC)
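For readers following this exchange, the per-symbol sum discussed above can be made concrete in a few lines. This is a sketch only (function names are mine, not from the article); it shows why the "equally probable" qualifier matters: with 8 equally likely messages the entropy is exactly log2(8) = 3 bits, i.e. 3 yes/no questions, while an unequal distribution gives less.

```python
import math

def shannon_entropy(probs):
    """H = sum_i p_i * log2(1/p_i), in bits per symbol."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

# Equal probabilities: 8 equally likely messages need log2(8) = 3 bits,
# i.e. 3 yes/no questions to pin down which message was sent.
H_uniform = shannon_entropy([1 / 8] * 8)

# Unequal probabilities: entropy falls below log2(n) because likely
# messages carry less surprise than rare ones.
H_skewed = shannon_entropy([0.5, 0.25, 0.125, 0.125])
```

The yes/no-question reading is exact only in the uniform case; in general H is an average, not a literal question count.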
- No, this article is specifically about thermodynamic entropy. There is another article, information entropy for the (more general) concept of entropy in information theory. Information entropy appears in this article only because many people find it the most useful way to think about what thermodynamic entropy fundamentally is.
- Entropy in information theory is defined for any probability distribution, and Shannon's original papers do this; that is the most fundamental form of the information entropy concept. Entropy rate, as you correctly explain, has a slightly different meaning. It's true that in information theory people sometimes talk about the entropy of a source, meaning its entropy rate, and compare this to the channel capacity. But questioned on the point, they would freely accept that this is a derived idea, the more fundamental idea being the entropy (or information) of any general probability distribution.
- Not so fast there. Take a look at Shannon's original 1948 paper. He very clearly considers entropy to be information per symbol. Here is one quote: "H or H' measures the amount of information generated by the source per symbol or per second." Then he discusses messages. He gets to the probability p that some message of length N has been generated. He says that the entropy is given by H = log(1/p) / N. (The = has a dot over it to indicate, I think, that this is an estimate based on a sample.)
- DMJ001 (talk) 03:07, 9 August 2008 (UTC)
- Fair enough, though actually Shannon uses both meanings of entropy. In section 20 of the paper (page 35 of the Lucent pdf), he's defining entropy for general pdfs. But for a source, what's more useful is the (average) entropy per symbol - ie the average of the entropy of the probability distribution for the next symbol. It's still an entropy, it's just the entropy of a slightly more specific thing; we can call it the entropy rate if we want to think of it that way, or we can call it the entropy per symbol. Jheald (talk) 18:48, 13 August 2008 (UTC)
- As to the "psychological" view of entropy, I agree this may be a stumbling block to people in the article, because entropy in information theory is just as "real" as entropy in thermodynamics. (In fact User:linas has told me he finds it rather more real [6]). On the other hand, some subjectivity of both information entropy and thermodynamic entropy isn't entirely wrong. (See the maximum entropy thermodynamics article for more discussion). Entropy in information theory is reflective of a state of knowledge; different states of knowledge give rise to different entropies. In thermodynamics too, the Szilard engine thought experiment is an illustration of this - knowing particular microscopic information can sometimes let you extract more macroscopic useful work. It's also a useful idea when thinking about entropy increase in the context of Loschmidt's paradox. So the subjective element may not be completely out of place. But as I've said before, the whole section on the connection between information entropy and thermodynamic entropy could probably stand a re-write. Jheald (talk) 10:35, 6 August 2008 (UTC)
To clarify, are you saying that the sum total of all the possible microscopic energy states, along with any constraints on these particles, is the equivalent of the information in the so-called "message" of the information entropy definition? If so, I still don't understand why someone trying to learn about thermodynamics must go off on a tangent and learn all the nuances of information entropy just to get an illustration of "entropy", when all that person really wants to know is: what is thermodynamic entropy? Kissnmakeup (talk) 14:19, 6 August 2008 (UTC)
- The "message", in the information theory perspective, would be the identification of the single 'true' microscopic state, from out of the set of all the potentially possible microscopic states and their probability distribution, that are compatible with the macroscopic variables and any other constraints on the system.
- I think you are trying to extend the similarities of the entropy of statistical thermodynamics and the entropy of information theory too far.
- But back to my original point about entropy as information vs. entropy as information rate. If you want entropy to be information (and I actually prefer this, even if it was not the way Shannon used the term), then the entropy of a symbol, message, or anything else, is -log of the conditional probability of the symbol, message, or whatever.
- And the "condition" is whatever the observer knows at that point. Thus entropy is relative to an observer. For example, the entropy rate of an AES-encrypted counter is 128 bits per chunk to an observer that does not know the key (it looks like 128 bit random blocks), but is zero to an observer who does (she can predict each output with probability 1).
- DMJ001 (talk) 03:07, 9 August 2008 (UTC)
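The observer-relative point in the comment above can be sketched numerically. This is only an illustration of the -log formula under the stated assumption of ideal encryption (the function name is mine): to an observer without the key each 128-bit block is assigned probability 2^-128, while the key-holder assigns the next block probability 1.

```python
import math

def surprisal_bits(p):
    """-log2(p): information, in bits, of observing an event of probability p."""
    return -math.log2(p)

# Observer without the key: each 128-bit block looks uniformly random,
# so p = 2**-128 and each chunk carries 128 bits.
without_key = surprisal_bits(2.0 ** -128)

# Observer with the key: the next block is predictable with probability 1,
# so it carries 0 bits of new information.
with_key = surprisal_bits(1.0)
```

Same physical bitstream, two different entropies, because entropy here quantifies what the observer does not yet know.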
- Sure, but that dependence on what you know is the case with thermodynamic entropy too. See for example section VI, "the 'anthropomorphic' nature of Entropy" in E.T. Jaynes' 1965 paper Gibbs vs. Boltzmann entropies.
- The -log formula is right if you know the symbol, message or anything else has a particular numerical probability. But if, based on what you know, different symbols have different possible probabilities then the - p log p formula is an appropriate quantification of the quantity of your uncertainty. And if you don't know those probabilities, you should assign them so that the sum - p log p is maximised. (Gibbs' algorithm). See on this Jaynes' original 1957 paper. [7]. -- Jheald (talk) 18:48, 13 August 2008 (UTC)
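The maximisation step Jheald mentions (Gibbs' algorithm) can be illustrated with a toy comparison. This is a sketch, not the general constrained optimisation: among a few candidate distributions over four outcomes, the uniform one maximises -sum p log p, matching the idea that any sharper assignment would claim knowledge we don't have.

```python
import math

def entropy_bits(probs):
    """-sum_i p_i * log2(p_i), in bits; zero-probability terms contribute 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

candidates = [
    [0.25, 0.25, 0.25, 0.25],  # uniform: maximum entropy, 2 bits
    [0.4, 0.3, 0.2, 0.1],
    [0.7, 0.1, 0.1, 0.1],
    [1.0, 0.0, 0.0, 0.0],      # certainty: zero entropy
]
best = max(candidates, key=entropy_bits)
```

With constraints (e.g. a fixed mean energy), the same principle picks out the Boltzmann distribution, which is the bridge back to thermodynamics.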
- The amount of information you are missing, by not yet having received the message, is the information entropy of the message; and also the thermodynamic entropy of the system.
- The reason people find it helpful, as I've tried to write above, is they find it helps them with the question you asked, "what is thermodynamic entropy?" Answer: thermodynamic entropy is missing information; specifically, information about what microstate the universe is actually in. That helps people who worry about Loschmidt's paradox. (How can entropy really increase, if physical dynamics are deterministic and measure-preserving? Answer: the information we had is still there, but in effect unusable, so effectively we might as well forget we ever had it at all). And it meshes well with the Szilard engine scenario. (Learning some information, the entropy really is reduced, and we find we then really can (at least in principle) extract a little more useful work). Jheald (talk) 15:07, 6 August 2008 (UTC)
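The Szilard-engine remark above can be made quantitative. As an aside (this is the standard textbook result, not something stated in the comments themselves): learning one bit of microscopic information lowers the entropy by $k_B \ln 2$, so an isothermal engine at temperature $T$ can in principle extract at most

```latex
\Delta S = k_B \ln 2, \qquad
W_{\max} = T \, \Delta S = k_B T \ln 2
  \approx 2.87 \times 10^{-21}\ \mathrm{J}
  \quad (T = 300\ \mathrm{K})
```

of useful work per bit learned; the same quantity is the Landauer limit on the cost of erasing that bit.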
Now see, this is great. This is exactly what I mean. I'm just beginning to learn about thermo, so I'm not to Loschmidt's paradox or Szilard engines yet. If I were, I'd go read about Loschimdt's paradox or Szilard engines. Right now, as much as I would like to know it all, I have to manage my time because I have constraints on my system, so I have to focus on what really matters right now. I still have what some of us would call a "finite" amount of time to learn what I need to know right now about the basics, which is why there are, I presume, articles about Loschimdt's pair of ducks and lizard engines for those who are to the point that they are ready to learn about those things. Those articles about pairs of ducks and lizard engines could talk about the information entropy in thermodynamic entropy for the sake of those people who want to know about it. Whereas, in this article you could talk more about the curl of F not being zero and what that has to do with entropy, I mean something that I have heard of that at least remotely pertains to this at the appropriate level for this article, unless you want to show off to people who don't have time for it how much more you think you know and try to make students feel stupid for trying to learn, which, I think is what's going on here. You can say, for those readers who want to know about pairs of ducks, lizard engines, and information entropy, "Here, follow this link to there," and not put the entire cyclopedia Jhealdia in this one article. But, thank you for your attention and your contributions to an outstanding project. I really don't have the time to waste here. I don't think I'll be returning to this page, or sending anyone else to it either. Kissnmakeup (talk) 21:02, 6 August 2008 (UTC)
An editor put an Arxiv paper in Further Reading
Here is the paper that was recently added: Umberto Marini Bettolo Marconi, Andrea Puglisi, Lamberto Rondoni, Angelo Vulpiani (5 March 2008). "Fluctuation-Dissipation: Response Theory in Statistical Physics". arXiv.org.
I'm moving this new item here for discussion, after removing it from the article. Though it is potentially of interest, three points:
- This is only an Arxiv paper, not a refereed publication
- The other items in Entropy#Further reading are of an introductory nature
- The editor, User:Arjun r acharya, has now added the same paper to eight different articles, which regrettably causes us to worry about spam. EdJohnston (talk) 21:32, 17 August 2008 (UTC)
In principle, this article could be included, as it is a big review article which is in press in Physics Reports. However, it would not be appropriate to just include this article without explaining the fluctuation dissipation theorem. If we do that, then it is still not clear if this particular review article would be the best reference (I haven't read it yet). Count Iblis (talk) 22:49, 17 August 2008 (UTC)
- Response to "An editor put an Arxiv paper in Further Reading"
First Point : Quote : "This is only an Arxiv paper, not a refereed publication"
Response : It has actually been published in Physics Reports.
Third Point : Quote : "The editor, User:Arjun r acharya, has now added the same paper to eight different articles, which regrettably causes us to worry about spam.
Response : I can assure you that this ("spam") was not intended. The fluctuation-dissipation theorem essentially relates response functions to equilibrium quantities, and it is the notion of fluctuation about equilibrium which ties all of these articles together (or at least that's what I had in mind when making the changes). For entropy & fluctuations, see chapter 14 of Pathria.
Second Point : In regards to the second point, I think that it is completely valid (and also applies to the "Second law of thermodynamics" article, where I also appended this reference). On a second look at the entropy article it seems a bit too in-depth a reference to be included, and I do understand why one would want to omit such a reference, for the sake of preserving clarity. —Preceding unsigned comment added by Arjun r acharya (talk • contribs) 23:18, 17 August 2008 (UTC) —Preceding unsigned comment added by Arjun r acharya (talk • contribs) 23:22, 17 August 2008 (UTC)
Edit war
There is an edit war about this addition:-
- Kostic, M. (2008). Sadi Carnot's Ingenious Reasoning of Ideal Heat-Engine Reversible Cycles. In New Aspects of Energy, Environment, Ecosystems and Sustainable Development: Proceedings of the 4th IASME/WSEAS International Conference on Energy, Environment, Ecosystems and Sustainable Development (EEESD'08), Algarve, Portugal, June 11–13, 2008. WSEAS Press. pp. 159–166. ISBN 978-960-6766-71-8.
It needs discussion here. If the anon editor restores the material again, I will revert it and protect the article. Bduke (talk) 01:22, 21 August 2008 (UTC)
User comment
I happened upon this article while looking for more information on entropy for a class, and it looks like it could use some clean-up. There's a lot of "entropy can be thought of like A" and "entropy is kind of like B" before any concise definition is given. It seems like the statistical mechanical definition should be given earlier, and then an attempt should be made to show the relationship between this and other ways of thinking about entropy.
That's just my two cents. Thanks for reading! —Preceding unsigned comment added by 168.156.89.189 (talk) 15:54, 12 March 2009 (UTC)
Lede issues (redux)
Just stumbled across this article and I have to say, as a lay reader with no real knowledge of physics, I read the lede and it didn't even seem to say what entropy is. It says what it's used for and how it's defined mathematically, but not what it actually is... rʨanaɢ talk/contribs 11:46, 29 April 2009 (UTC)
- Agreed. It was bad enough that I came here (and pretty quickly saw how the different definitions of Entropy seem to have resulted in a compromise lede involving only uncontentious math). The problem is, this is the entropy article people will wind up going to first, and the lede is useless for a layperson. And I'm talking about pretty nerdy laypeople, too. Better a long lede that explains several meanings than an equation intelligible only to people with no need for the article. 99.192.48.185 (talk) 15:23, 26 July 2009 (UTC)
Entropy and the ergodic hypothesis
With only a few undergraduate physics courses in my background, I'm not really qualified to discuss this issue, but it seems to me that the concept of entropy is related to the ergodic hypothesis. If others agree then I'd invite them to add some relevant material to both this article and to the article on the ergodic hypothesis.
W.F.Galway (talk) 15:38, 13 May 2009 (UTC)
Extensive is too much extention
In the section Entropy and Cosmology, I changed the phrase "extensive doubt" to "some doubt" - in reality, lack of interest in formulating a conclusion doesn't equate to extensive doubt, and this is the real reason for the controversy... not to say that we can find the answer easily, but the current controversy does not involve an extreme majority of scientists saying entropy models don't apply to the Universe as a whole. The jury is still out with competing arguments, and any "extensive doubt" phrasing is misleading (at least given current arguments)... —Preceding unsigned comment added by 206.248.106.175 (talk) 20:21, 20 May 2009 (UTC)
Figure is stepping on the text
In Section 4.4 "Chemical Thermodynamics" the figure is covering some of the text. Someone who knows how, please fix it.
- Can you please list your browser and screen resolution? I am not seeing any problems under Firefox 3.0.9 at 1280*1024. If you can try multiple resolutions and list the ones where it is a problem, we may be able to work around it. User A1 (talk) 01:54, 22 May 2009 (UTC)
- Sure. I am using FireFox 2.0.0.14 on MacOS 10.5.6. I tried FireFox 3.xxx but did not like it because of major changes in the way it uses windows. I am using the standard resolution for my brand-new Mac 15" MacBook Pro. Bill Jefferys (talk) 02:45, 22 May 2009 (UTC)
- Someone has fixed this. It now displays correctly (as above). FYI the resolution is 1440x900. Bill Jefferys (talk) 02:59, 22 May 2009 (UTC)
Restrict explanation of entropy to known facts
I recommend that the main article on entropy be limited to its meaning in the context of the Second Law of Thermodynamics, because I found the overall article to be chaotic due to all of the extensions and the resulting lack of any cohesive core description or definition of entropy.
I recommend that information theory be entirely separate because it is itself fundamental, and is mathematics. While the two may coincide in a sense, the entropy of physical processes was historically the first concept / law / process to be developed, and the additional emphasis on math may block some readers from understanding something that does not require math.
I would recommend that all the extensions be described in separate entries, including cosmology, although I grant it is highly relevant. Cosmology remains well beyond any widely accepted scientific formulation; hence, it is little help to my understanding of entropy, and the main article is about entropy, not cosmology.
The main article about entropy should be the minimum required to explain the Second Law of Thermodynamics to an educated reader who may lack the background in mathematics to grasp its detail but can surely understand the basic concepts. Then the additional concepts which build upon an understanding of entropy will be simplified, because, for example, the reader can proceed to cosmology without trying to grasp the meaning of information theory or entropy as it relates to evolution.
Thomasrwilson (talk) 07:30, 5 August 2009 (UTC)
Re: Restrict explanation of entropy to known facts
After attempting to reread the entry I at least don't think my comments above are especially useful. That is surely in part due to not having read the entry on entropy correctly. Also, I have always found thermo to be exhausting - so simple, but it just wasn't given what it needed - an E=mc^2 formulation - instead it has about 10 different statements. My formulation is as follows:
Entropy is the internal push of all contained systems, including the universe, toward chaos (as such, it would be a Force). Or, alternatively, entropy is the chaos that is spontaneously produced by systems; that is, without any outside force or direction or other consideration, systems will produce a certain amount of useless energy (equivalently, they will lose energy, giving it off as friction) or randomness (a synonym for chaos in some situations): they will become smoothed out, as gases do when released in a large room, or they will become randomized, as in the case of encrypted information.
1. I once saw/read a proof of Shannon's theory that information is proportional to N log2 N that used nice (clever) definitions of what information had to be in the context of a bit of data. It wasn't more than a printed page long. Has anyone seen it? I saw it in about 1982 at the dept. of transportation (DOT), transportation systems center (TSC) in Cambridge, MA. Maybe "proof" isn't the right word - it was more a motivation.
But I was also working on digital filtering and was struck by the number of processes that were somehow NlogN -related. Like the FFT and quick sort - they required NlogN steps and were highly efficient. Also, there seemed to be a limiting factor whereby NlogN (log2) represented the degree of compression that was possible - AND it represented, by the same token, the degree of encryption possible. In fact, those things were only correct to a first approximation - given enough memory, sorting can be done almost in a single pass; however, NlogN.
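The N log N observation above can be demonstrated by counting comparisons directly. This is a sketch only, using merge sort as a stand-in for the quicksort and FFT mentioned in the comment (all the names here are mine): for n elements, the number of comparisons stays under n log2 n, which is the information-theoretic flavour of the limit being described.

```python
import math
import random

def merge_sort_comparisons(values):
    """Merge sort that returns (sorted_list, number_of_comparisons_made)."""
    if len(values) <= 1:
        return values, 0
    mid = len(values) // 2
    left, cl = merge_sort_comparisons(values[:mid])
    right, cr = merge_sort_comparisons(values[mid:])
    merged, comparisons = [], cl + cr
    i = j = 0
    while i < len(left) and j < len(right):
        comparisons += 1  # one element-vs-element comparison
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged, comparisons

random.seed(0)
n = 1024
data = random.sample(range(10 * n), n)
sorted_data, comparisons = merge_sort_comparisons(data)
bound = n * math.log2(n)  # 10240 for n = 1024; comparisons stays below this
```

Distinguishing among the n! possible orderings requires about log2(n!) ≈ n log2 n yes/no answers, which is why comparison sorts cannot do asymptotically better.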
By the way, Shannon is one of the most fascinating characters you could imagine, from his development of information theory and computer basics, to wartime code breaking, to beating the gamblers of Las Vegas and the stock market, ... and of course his unfortunate affliction with Alzheimers. He died in 2001.
I'm out of time.