Talk:Heat/Archive 5
This is an archive of past discussions about Heat. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page.
Exchange of heat energy
I don't like this idea that matter exchanges heat with cooler matter. Matter gives off heat to its surrounding environment. If it's radiant it gives it off equally in all directions. If you want some of it, you have to arrange your physical situation so as to intercept whatever amount you think you need: you can get closer, or sit at the focus of a parabolic collector, or get in contact with other matter that has received some of the radiated energy. It's obviously a part of nature's energy dissipation process. And the majority of nature's spatial volume is evidently at a temperature of only about 3 kelvin. And since you're used to existing at a temperature of 98.6 degrees Fahrenheit (about 310 kelvin), you have to arrange to acquire heat from your environment to reach that goal. But each system of matter delivers heat to its surrounding environment.WFPM (talk) 22:20, 17 March 2012 (UTC)
The way that systems of matter get rid of excess heat and energy to the surrounding environment is by the progressive interaction between their constituent particles, which results in the smallest particles acquiring an equal amount of the momentum of the larger particles but getting practically all the v-squared kinetic energy. Thus the emitted particles are a matter-efficient way to get rid of excess kinetic energy from a physical process. If these particles are intercepted, the contained kinetic energy can be either reincorporated into the matter of the receiving substance or else reflected away without absorption.WFPM (talk) 02:08, 19 March 2012 (UTC)
To see how efficient and far-reaching this system of heat energy dissipation is, I refer you to an image of the Whirlpool Galaxy, where the individual matter systems (stars) can be seen giving off heat energy at a distance some 4 million times that from the Earth to the Sun. And the image shows a process that was occurring approximately 30 million years ago.
And to give you the big picture of all this, you can note that our Milky Way galaxy plus the Andromeda galaxy and the Whirlpool galaxy are all located within a cubic volume of space which only accounts for maybe 1 billionth of the estimated size of the universe.WFPM (talk) 11:51, 19 March 2012 (UTC)
- "I don't like this idea that matter exchanges heat with cooler matter," you say. You may not like it, but you'll have to explain why the Earth radiates thermal in all directions EXCEPT toward the sun. Do you think infrared rays from the Earth find out they are headed toward the Sun (yikes!) and turn around in space and come back? Or that the two kinds of light scatter off each other, or interact in some way like the solar wind? If you think anything like this, you are profoundly mistaken. Electromagnetic radiation doesn't behave like that. SBHarris 18:42, 19 March 2012 (UTC)
- Of course not! What I'm saying is that everything in nature works in relation to its surrounding environment. It's incidental that one system of matter acquires energy from another except in the overall balance of things. Nature has decided to pull matter back together within the physical limits of the matter to retain the accumulated kinetic energy, and has to have a way to redistribute locally accumulated excesses of motion. So it gets rid of it (mostly) by a gradual process of mass-motion emission, which we call kinetic energy of matter when we can detect the matter, and otherwise call electromagnetic energy. It has managed to store a small portion of its mass-motion into atoms with which we are familiar, but that's apparently a pretty slow process. The rest of it just kind of floats around, awaiting future developments. In some places, like the center of the Whirlpool Galaxy, the process of reaccumulation becomes capable of swallowing up mass-motion systems while hardly leaving a trace of their previous existence.WFPM (talk) 20:20, 19 March 2012 (UTC)
undid good faith edit of lead; suggestions for editor to make acceptable edits
Dear Andrewedwardjudd, your recent edit (21:30, 18 March 2012) to the lead of the present article on heat is a good faith edit but it does not conform to normal Wikipedia requirements for editing.
Your edit was a change in the lead which was not a summary of the article.
Your edit was not supported by any citation of reliable sources. Historical statements such as those in your edit should be supported by concordant secondary and tertiary sources; primary sources by themselves are not enough for the present purposes.
Your edit sought to change the direction of thinking of the article, but it did not express your thoughts in terms of classical thermodynamics, while the main drift of the present article is framed in those terms. Your edit is therefore in some ways incoherent with the rest of the article. An edit should not make an article incoherent.
You have some valid things to say, but if you want to say them, you should do so in accord with normal Wikipedia editing requirements. You should read WP:RS, WP:OR, WP:SYN, WP:POV and related Wikipedia policy articles, and thoroughly digest their contents. You seem at present not to have read or understood or followed them. Till now you have ridiculed my comments along these lines, apparently because you have not absorbed Wikipedia editing policy. Ridiculing my comments may be amusing but does not exempt you from the need to follow Wikipedia policy.
The article on heat at present does not discuss enthalpy, and at present the Wikipedia article on enthalpy has considerable room for improvement. Your idea of heat content is closely related to the idea of enthalpy, which is a quantity of classical thermodynamics, and is sometimes called 'heat content'.
Enthalpy is derivable from internal energy by a Legendre transform, which changes the independent variables of a fundamental thermodynamic equation. This would be made clear in a suitable edit.
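(Editorial note, not part of the original discussion: for readers who want the formula behind that remark, the standard textbook statement of the transform is sketched below; the notation U, H, S, p, V is the usual one and is my addition, not a quotation from any editor.)
```latex
% Enthalpy as the Legendre transform of internal energy (standard textbook form):
% starting from the fundamental relation dU = T dS - p dV, define H = U + pV, so that
%   dH = dU + p dV + V dp = T dS + V dp,
% i.e. the independent variable V has been exchanged for p.
\[
  H = U + pV, \qquad dH = T\,dS + V\,dp .
\]
```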
It may perhaps be that you can properly edit the article making use of the relation between your idea of heat and the classical thermodynamic idea of enthalpy. This should be done in the body of the article and if it is successfully done there, it will be right to summarize it in the lead.
You may wish to write an edit that sorts out or clarifies the partly dubious term thermal energy. For this purpose you would need to read, summarize, and cite a good range of suitable secondary and tertiary sources.
Your edit uses special historical terms, such as vis viva, in the lead, but they are not adequately defined in the body of the article. The lead should summarize, not introduce new undefined terms.
Perhaps your interest is more directly historical than physical. If so, your edits should be based on secondary and tertiary sources which are explicitly historical. Ancillary citations of primary sources would then be fitting, in order to back up and exemplify the historical conclusions of the secondary and tertiary sources that are cited as the main reliable sources. For example, you may find statements that support your views in the book by Truesdell on the early history of thermodynamics; Truesdell is an example of a secondary or tertiary historical source. Partington is also often useful. It will be best if you find several more concordant explicitly historical secondary and tertiary sources. Your own reading of primary sources does not by itself constitute reliable sourcing as defined by Wikipedia policy.Chjoaygame (talk) 23:56, 18 March 2012 (UTC)
- You are obfuscating. The article is on heat. It is not on thermodynamics.
- There is no requirement in Wiki to provide references to the fact that Paris is in France
- I already provided verbatim quotations from Joule, Clausius and Kelvin to support the text I added
- If a strict thermodynamic heat is so important to you then copy the heat page to Heat (thermodynamics) or Heat (physics) and leave the historical record to be an accurate one.
- You already know you are deliberately setting out to distort the historical record with your endless obfuscations. Now stop being so silly.
Andrewedwardjudd (talk) 05:46, 19 March 2012 (UTC)andrewedwardjudd
More "stuff".
What is this 'explanation' [[2]] doing in the article? Is this the return of 'Caloric'? "A potentially confusing term is thermal energy, loosely defined as the energy of a body that increases with its temperature. Thermal energy, even when not in transit or motion, is sometimes referred to as heat or heat content"? 'Loosely' defined, is it? What is this editor trying to say?
Reif (Fundamentals of Statistical and Thermal Physics) has been cited, so? On page 269 Reif has: (1/2)mv² = (3/2)k_BT.
Read further if you wish, but it is quite clear from this that Reif is a fully signed-up believer in classical kinetic theory, where heat, measured by temperature, is directly related to the energy of moving particles by the Boltzmann constant (k_B). --Damorbel (talk) 16:19, 19 March 2012 (UTC)
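(Editorial note, not part of the discussion: for readers following the formula quoted from Reif, here is a minimal numeric sketch of (1/2)m⟨v²⟩ = (3/2)k_BT in SI units. The function names and example values are mine; the constants are the standard SI values.)
```python
# Minimal sketch (not from Reif): mean translational kinetic energy per
# particle and the implied rms speed, from (1/2) m <v^2> = (3/2) k_B T.
K_B = 1.380649e-23      # Boltzmann constant, J/K
N_A = 6.02214076e23     # Avogadro constant, 1/mol

def mean_translational_ke(temp_k):
    """Mean translational kinetic energy per particle, in joules."""
    return 1.5 * K_B * temp_k

def rms_speed(temp_k, molar_mass_kg_per_mol):
    """Root-mean-square speed from (1/2) m v_rms^2 = (3/2) k_B T."""
    m = molar_mass_kg_per_mol / N_A
    return (3.0 * K_B * temp_k / m) ** 0.5

print(mean_translational_ke(298.0))   # ~6.2e-21 J per particle at room temperature
print(rms_speed(298.0, 28.0e-3))      # ~515 m/s for N2 (molar mass ~28 g/mol)
```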
- In monatomic gases, which is the only thing that equation applies to if you're talking about heat. In solids the equation is the same, but the kinetic energy no longer represents thermal energy but only half of it, which is on the right side. The factor of 3/2 there must be replaced by 3 to get thermal energy in (fully excited) solids, and behold the Dulong-Petit law, but kinetic energy is only half of that total energy. We're interested in liquids, solids, and gases at higher temperatures also. Do you have some mental block which keeps you from reading heat capacity?
- The following is a table of some molar constant-volume heat capacities of various diatomic gases at standard temperature (25 °C = 298 K):
Diatomic gas | C_V,m (J/(mol·K)) | C_V,m / R
---|---|---
H2 | 20.18 | 2.427
CO | 20.2 | 2.43
N2 | 19.9 | 2.39
Cl2 | 24.1 | 3.06
Br2 (vapour) | 28.2 | 3.39
- The differences in heat capacity per mole of gas here (right-hand column) are not due entirely to more kinetic energy per atom. Only half of the difference from low to high is due to kinetic energy. The reasons are well understood. At least by science and myself. As for you Damorbel, I really don't know what to do with you. SBHarris 18:37, 19 March 2012 (UTC)
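(Editorial note, not part of the discussion: a rough equipartition sketch of why the right-hand column of the table sits between about 5/2 and 7/2. The "active fraction" parameter below is my own crude stand-in for the proper quantum treatment of the vibrational mode, not anything claimed in the thread.)
```python
# Crude equipartition estimate of Cv/R for a diatomic gas (my sketch):
# 3 translational and 2 rotational half-degrees of freedom always contribute,
# while the single vibrational mode contributes up to a full R (half kinetic,
# half potential) depending on how strongly it is excited at the temperature.
def cv_over_r_diatomic(vibration_active_fraction):
    translational = 3 * 0.5     # (1/2)R per translational degree of freedom
    rotational = 2 * 0.5        # two rotational axes for a linear molecule
    vibrational = 1.0 * vibration_active_fraction
    return translational + rotational + vibrational

print(cv_over_r_diatomic(0.0))   # 2.5 -- close to H2, CO and N2 in the table
print(cv_over_r_diatomic(0.9))   # 3.4 -- roughly where Br2 vapour sits
```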
- No, Sbharris, it isn't incorrect, Reif didn't get it wrong! Neither is the mv:kT equation restricted to monatomic gases, even though monatomics are the simplest form. The three is due to three dimensions: in general all matter vibrates (thermally) in three dimensions, and the 3 in 3/2 arises from the three degrees of freedom.
- Finally, I do not see how your argument affects the connection between heat and atomic, or even molecular, motions. Care to explain? --Damorbel (talk) 21:04, 19 March 2012 (UTC)
- Would you please read the damn article on heat capacity?? Any connected atoms have more degrees of freedom than just 3 degrees of kinetic energy freedom. They also have 3 degrees of potential energy freedom, which is why your average solid has twice the heat capacity of a monatomic gas (per atom or per mole of atoms). Hence, heat content in solids and many other cases, is not just stored as kinetic energy. The mv:kt equation describes kinetic energy only, but as soon as you move away from monatomic gases, it stops describing TOTAL THERMAL ENERGY (which is only partly kinetic). SBHarris 23:53, 19 March 2012 (UTC)
- It is quite mistaken to classify the binding energy of atoms forming a molecule as 'thermal energy', aka heat. This energy corresponds closely with the fusion energy of solids because it does not involve a temperature change, i.e., for a change of state, Δ((1/2)mv²) = (3/2)k_B·ΔT = 0. It is, of course, well known that the temperature of water does not change when ice melts to water or water boils to steam. It is true that energy must be added (or extracted) to induce a change of state, but the key factor is the constancy of the temperature at the time.
- This thermal/non-thermal energy classification is quite general; it applies equally to chemical and nuclear 'binding' energy. --Damorbel (talk) 07:17, 20 March 2012 (UTC)
- It is not mistaken to classify part of the binding energy of atoms forming a molecule as heat, if the quantum energies and temperature are such as to permit part of the heat in the system to be absorbed in the binding energies of atoms. This is most simply seen in the heat capacities of gases like bromine (table above) at room temp. Some of the heat here is stored in the molecular bond between the bromine atoms, which are no longer vibrating at the ground state; many are instead in the first, second and third excited states, etc. These bonds for these excited molecules are weaker due to the energy they have absorbed. These bonds store part of the heat as their excitation energy. This is basic chemistry. Did you ever take a class in basic chemistry? Why must I explain this to you? SBHarris 22:03, 20 March 2012 (UTC)
- Sbharris. The heat capacity article says "Additionally, some thermal energy may be stored as the potential energy associated with higher-energy modes of vibration, whenever they occur in interatomic bonds in any substance". It seems to me the reason this conversation might be going around in circles is that the potential energy available at higher temperatures is a form of latent heat of a particular kind that is only associated with high amounts of kinetic energy. We can maybe produce a simple model of that by noting that vibrating ping pong balls tend to vibrate without leaving the surface when the vibrations are gentle. If the surface has sloping shelves as well as a flat surface, we can see that at higher vibrations some ping pong balls are in the process of rolling off the sloping shelves before falling to the surface, while other balls are pushing them back onto the shelf. Some stay up for long periods of time. Either way, some are up on the shelf at all times when kinetic energy is highest, and they cannot get up there without this high kinetic energy. So they then have this potential energy. However this latent energy is only a function of the kinetic energy of the ping pong balls, for without that kinetic energy there would be no ping pong balls on the shelves having potential energy. I am not sure if this helps or hinders the discussion. But there it is! :-) Andrewedwardjudd
- So long as you recognize that these balls are not on "shelves" but are more like swinging pendulums, or (even more) like balls bouncing up and down on springs. Whatever kinetic energy a ball on a spring has, it always has (on average) just as much potential energy. The same is true of the energy stored in vibration in chemical bonds (which of course is ALL of the thermal energy in a solid). Half is kinetic, half potential. Because of that, instead of molar heat capacities of 3/2 R like helium gas has, most solids at high temp have twice that, or 3R. The doubling is due to the potential energy part of the vibrational energy.SBHarris 03:22, 23 March 2012 (UTC)
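(Editorial note, not part of the discussion: the arithmetic behind that comparison, as I read it, using the textbook value of the gas constant; the variable names are mine.)
```python
# Molar heat capacities implied by the comment above (my sketch): a monatomic
# ideal gas stores only translational kinetic energy, (3/2)R per mole, while a
# fully excited solid also stores an equal potential-energy share, giving the
# Dulong-Petit value of 3R per mole.
R = 8.314  # gas constant, J/(mol*K)

cv_monatomic_gas = 1.5 * R    # ~12.5 J/(mol*K), e.g. helium
cv_dulong_petit = 3.0 * R     # ~24.9 J/(mol*K), half kinetic, half potential

print(cv_monatomic_gas, cv_dulong_petit)
```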
- Since we know energy is conserved, that we are adding heat to a liquid as it boils, and that the temperature remains constant as this heat is added - would we not conclude that the potential to do work of a system does not vary linearly with the temperature of the system? VQuakr (talk) 08:13, 20 March 2012 (UTC)
- VQuakr, the point is that heat is measured by temperature and energy by Joules. Temperature is not a measure of energy; 1 gm of matter may well have the same temperature as 1 kg of the same matter, but the 1 kg will have 10³ times the energy. Similarly, 1 gm of matter at 1000 K will have 10³ times the energy of 1 gm at 1 K. So 1 gm at 1000 K has the same energy as 1 kg at 1 K, OK? --Damorbel (talk) 08:52, 20 March 2012 (UTC)
- Heat is not measured by temperature. Heat is measured in joules also. It has the same units as energy, because it is energy. However, it is degraded energy, so it's not always available for conversion to work. SBHarris 22:03, 20 March 2012 (UTC)
- Damorbel, is your claim that 100g of iron at 1000K will have the same thermal energy as 1kg of hydrogen at 100K? Would you say then that temperature and heat are synonyms? VQuakr (talk) 03:56, 21 March 2012 (UTC)
- VQuakr, you ask "... claim that 100g of iron at 1000K ... same ... energy as 1kg of hydrogen at 100K". No, the specific heats per gram of iron and hydrogen are wildly different because their atomic weights (Fe = 56; H2 = 2) are similarly different.
- You ask also are 'temperature' and 'heat' synonyms; as far as concerns the 2nd law of thermodynamics, yes. --Damorbel (talk) 08:02, 22 March 2012 (UTC)
- How do you reconcile this with the fact that neither side of (1/2)mv² = (3/2)k_BT contains a term representing molecular weight? VQuakr (talk) 08:15, 22 March 2012 (UTC)
- The 'm' on the LHS is usually considered to be the molecular weight (MW), but in kinetic theory it is just the weight of any particle that freely exchanges energy with the other particles; the inspiration for Einstein's argument for kinetic theory was the Brownian motion reported by the botanist Robert Brown in 1827. Robert Brown observed the motion of pollen grains in water, grains that are many, many times the weight of water molecules but still have the same energy (and temperature). I think this is a really difficult matter and was one of the reasons why kinetic theory took such a long time to become established. The short answer is that, if any collection of particles (they don't have to be similar - Argon and Hydrogen have a mass ratio of 20:1) are exchanging energy through random collisions, then all particles have the same energy and thus the same temperature. A particle's kinetic energy is (1/2)mv², so in a mixture of Argon and H2 the speed of the H2 molecules is √20 times that of the Argon molecules (atoms); this is what keeps the energy of the two species equal. --Damorbel (talk) 08:59, 22 March 2012 (UTC)
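(Editorial note, not part of the discussion: a quick numerical check of the Argon/H2 example above, using the same equipartition relation; the function and values are my illustration.)
```python
# At a common temperature, Ar and H2 have the same mean translational kinetic
# energy, so their rms speeds differ by roughly sqrt(m_Ar / m_H2) ~ sqrt(20).
K_B = 1.380649e-23      # J/K
N_A = 6.02214076e23     # 1/mol

def rms_speed(temp_k, molar_mass_kg_per_mol):
    m = molar_mass_kg_per_mol / N_A
    return (3.0 * K_B * temp_k / m) ** 0.5

v_h2 = rms_speed(300.0, 2.0e-3)     # hydrogen, ~1900 m/s
v_ar = rms_speed(300.0, 40.0e-3)    # argon,    ~430 m/s
print(v_h2 / v_ar)                  # ~4.47, i.e. about sqrt(20)
```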
Damorbel, having failed to read the article on heat capacity, actually believes it is a constant for any given material, and not a function of temperature, so that thermal energy is linearly dependent on temperature. Well, it isn't. And 1 gram of any matter at 1000 K will have a hugely larger thermal capacity than 1000 grams of the same matter at 1 K, where the heat capacity will be reduced to almost nothing. See Einstein solid and Debye model. SBHarris 07:56, 21 March 2012 (UTC)
- Sbharris, you write (above) about my failures and beliefs; even if you were correct in what you write, it does not affect the physics of what I describe. You refer to the heat capacity varying with temperature; of course it does, but with complex molecules that e.g. can dissociate at high temperatures (cf. associate at low temperature); check this table [[3]] where the heat capacity of water at different temperatures is given. Water has an extremely asymmetric molecule with one massive O atom and two tiny H atoms, all three with one unit charge; this gives rise to exceptional intermolecular forces that dominate the form water takes; the energy stored in these intermolecular forces also dominates the heat capacity at different temperatures, as well as accounting for the enormous so-called 'latent heat' of water.
- The idea that the specific heat of real substances should be some kind of constant is a complete non-starter; the energy stored in real materials is a very complex matter which is approximated, sometimes extremely badly, by a single figure such as 'specific heat'.
- None of these facts affect the exact connection between heat/temperature (Kelvins) and kinetic energy (Joules per degree of freedom) as I explained ((1/2)mv² = (3/2)k_BT). --Damorbel (talk) 09:10, 21 March 2012 (UTC)
- It's not just "complex molecules." Heat capacity isn't constant with any substance except for small ranges of temperature, particularly in the cold. The terms heat capacity and specific heat, however, are exact. They are not appoximations. It's just that they aren't constant, but change and are functions of temperature. This may not affect the relationship between temperature and kinetic energy, but this is not an article about kinetic energy! This is an article about heat, which is not kinetic energy (even thermal energy is only party kinetic energy in most circumstances). So your equation is IRRELEVENT here. Part of heat winds up as kinetic energy and rest does not. The ratios of one to the other are not simple except in solids, where in general one is about half of the other. SBHarris 19:16, 21 March 2012 (UTC)
- Damorbel, can you not agree that the heat of 1kg of iron at 100C is different to the heat of 1kg of lead at 100C? Earlier we were agreeing that temperature measures intensity of heat? I.e. you can supply different amounts of heat to iron and lead to get them to have the same heat intensity or temperature. I.e. at 100C iron and lead appear from the outside to have an identical amount of energy contained within them that we call heat. From the outside we have no idea how much heat is inside those different substances at the same temperature **unless** we have already found that out. So they appear from the outside to have the same amount of heat, but the reality is they do not have the same amount of heat. Temperature cannot therefore be a measure of heat. Andrewedwardjudd (talk) 09:28, 21 March 2012 (UTC)andrewedwardjudd
- "[T]he heat of 1kg or iron at 100C is different... " If two items have the same temperature they are equally 'hot', but this is nothing to do withh the amount of energy they contain. Even bodies with the same mass (and temperature) may contain different numbers of 'Joules'.
- The point is that amount of energy (Joules) and heat (hotness) i.e. T(emperature), the driver of the 2nd law of thermodynamics, are quite dfferent matters and should not be confused. --Damorbel (talk) 09:51, 21 March 2012 (UTC)
- I totally agree. However I thought you said that temperature measures heat. And Planck said temperature measures degree of heat or state of heat. We seem to be going around in circles here.
- You said this:
- Read further if you wish but it is quite clear from this that Reif is a fully signed up believer in classical kinetic theory where heat, measured by temperature, is directly related to the energy of moving particles by the Boltzmann constant Andrewedwardjudd (talk) 11:59, 21 March 2012 (UTC)andrewedwardjudd
- "You said this:..." Yes I know I did! There is no contradiction in what I wrote because temperature is 'energy density' which is energy per particle; particle should really read 'degree of freedom' of which every particle has 3 (because it moves in 3 dimensions) hence the 3/2kBT = 3 x 1/2kBT --Damorbel (talk) 16:07, 21 March 2012 (UTC)
- Temperature cannot be energy per particle. It has to be something like energy per unit area?
- I don't like energy density either. Energy intensity sounds better.
- Temperature is a peculiar thing. If you put a thermometer in air, almost no particles are in contact with that thermometer. The thermometer's final equilibrium temperature is mainly measuring the radiation equilibrium at the thermometer? Anyway we seem to be getting closer to the nuts and bolts of the thing now. Andrewedwardjudd (talk) 16:46, 21 March 2012 (UTC)andrewedwardjudd
- "You said this:..." Yes I know I did! There is no contradiction in what I wrote because temperature is 'energy density' which is energy per particle; particle should really read 'degree of freedom' of which every particle has 3 (because it moves in 3 dimensions) hence the 3/2kBT = 3 x 1/2kBT --Damorbel (talk) 16:07, 21 March 2012 (UTC)
- Damorbel, can you not agree that the heat of 1kg or iron at 100C is different to the heat of 1kg of lead at 100C?. Earlier we were agreeing that temperature measures intensity of heat? Ie you can supply different amounts of heat to iron and lead to get them to have the same heat intensity or temperature. Ie at 100C iron and lead appear from the outside to have an identical amount of energy contained within them that we call heat. From the outside we have no idea how much heat is inside those different substances at the same temperature **unless** we have already found that out. So they appear from the outside to have the same amount of heat, but the reality is they do not have the same amount of heat. Temperature cannot therefore be a measure of heat. Andrewedwardjudd (talk) 09:28, 21 March 2012 (UTC)andrewedwardjudd
You know, you could read the article on temperature. Temperature is very simply a measure of the mean kinetic energy per particle in a system in thermal equilibrium. The two would be measured with the same scale if it weren't for the fact that the temperature and energy units were developed historically independently. Because of that, we now need a scaling factor between kinetic energy (in joules) and temperature (in kelvins), which are related linearly. That simple scaling factor, which is not a law of nature but just a scaling factor between historical scales, is the Boltzmann constant, which gives you kinetic energy per particle per kelvin; the gas constant gives kinetic energy per mole per kelvin. That's all they are, and we're done.
Much of the rest of this is obfuscation by Damorbel, who hasn't "got" the idea that neither heat nor thermal energy is (necessarily) kinetic energy. His equation (3/2)kT = kinetic E gives how much kinetic energy there is in an object with a temperature, but it doesn't say that this is where all the thermal energy is. It's not all kinetic. Thermal energy is a different sort of energy, of which kinetic energy of atoms may only be a part, depending on the number of degrees of freedom for thermal partition in the system. In a monatomic ideal gas, heat is all kinetic energy, but that's one of the few systems for which that is true. In systems where atoms are bound to each other with chemical bonds, or there is electronic excitation, or electrons themselves participate as particles, a lot of heat is other types of energy, often potential. Thus, there is no linear translation from temperature to thermal energy, as there is between temperature and mean kinetic energy. The ratio of the thermal energy (or heat input) to temperature is heat capacity, which is a nonlinear complicated business, although the heat capacities of most substances in practice fall into a fairly narrow range per particle (no more than a factor of 2 difference per particle), at least at higher temperatures (i.e., well over the Einstein or Debye temps for that substance, if it is a solid, and another corresponding substance-specific reduced temperature if it is a polyatomic gas).
The other problem with this article is that heat has many colloquial uses (one of which is temperature) and many historical uses in physics (one of which we now call thermal energy content). But that should simply be pointed out in the lede (as it now actually is, though not optimally), and the historical changes in usage (like how Lord Kelvin used the word "heat") can be left for the history section. When scientists say the word "meter" today, they don't mean what they did in 1890 or 1990. But we leave most of that for the historical section on that subject. SBHarris 17:32, 21 March 2012 (UTC)
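(Editorial note, not part of the discussion: a small numeric illustration of the scaling factors described two paragraphs up; the constants are standard SI values and the example temperature is mine.)
```python
# The Boltzmann constant converts kelvins to joules per particle; the gas
# constant does the same per mole. They differ only by Avogadro's number.
K_B = 1.380649e-23       # J/K per particle
N_A = 6.02214076e23      # particles per mole
R = K_B * N_A            # ~8.314 J/(mol*K)

T = 298.0
print(R)                 # ~8.314
print(1.5 * K_B * T)     # mean translational kinetic energy per particle, ~6.2e-21 J
print(1.5 * R * T)       # the same quantity per mole, ~3.7e3 J
```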
- "Temperature cannot be energy per particle". But the equation 1/2mv2 = 3/2KBT is just that, on the left 'm' is the mass of a single particle; on the right kB is the Boltzmann constant, which is the energy of a single particle, independent of its mass, per Kelvin. --Damorbel (talk) 17:25, 21 March 2012 (UTC)
- The equation merely says that temperature is a linear function of kinetic energy per particle. Unfortunately, thermal energy is more than just kinetic, and this TALK page is about heat. The equation above does not apply to thermal energy in general, only to the kinetic energy in an object with a temperature (which is only a part of thermal energy in most situations). Also the equation is incorrect in quantum mechanics, since it predicts that particles have no kinetic energy or velocity at absolute zero, which is wrong. Liquid helium shows you it's wrong. SBHarris 17:37, 21 March 2012 (UTC)
- Sbharris, you write "thermal energy is more than just kinetic [energy?]" and "[my] equation above does not apply to thermal energy in general, only to the kinetic energy". If I understand you correctly, then 'thermal energy' is not a function of temperature; if this is so then why is it called thermal? Further, is it dependent on temperature in any way i.e. what is it? --Damorbel (talk) 18:45, 21 March 2012 (UTC)
- No, I said thermal energy wasn't a LINEAR function of temperature. In a few situations it is nearly a linear function, but this always breaks down as temperatures drop. During phase changes, thermal energy is not a function of temperature at all (thermal energy rises, temperature does not change), and it is called "thermal" because heat is going in, even though temperature is not changing. Thermal here refers to heat, not temperature (yes, they probably should call it "heat energy," but they don't). As to the dependence of thermal energy in situations where there is a connection, the equations are many and varied and sometimes enormously complicated (as things get cold, heat capacity in crystals actually depends on the CUBE of the temperature) and you can read about some of them in heat capacity and the many other subarticles which I have recommended to you multiple times. If you don't read this, you cannot be educated about it, and will remain ignorant. For which I am sorry, as the needed material is right there in front of you. SBHarris 19:05, 21 March 2012 (UTC)
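(Editorial note, not part of the discussion: a hedged sketch of the "cube of the temperature" remark, i.e. the standard low-temperature Debye T³ law. The formula and the Debye temperature for copper are approximate textbook values quoted from memory, not figures from the thread.)
```python
# Low-temperature Debye T^3 law (my sketch): well below the Debye temperature
# T_D, the molar lattice heat capacity of a crystal falls off as T^3 instead of
# staying near the Dulong-Petit value of 3R.
import math

R = 8.314  # J/(mol*K)

def debye_t3_molar_cv(temp_k, debye_temp_k):
    """Valid only for temp_k << debye_temp_k."""
    return (12.0 * math.pi ** 4 / 5.0) * R * (temp_k / debye_temp_k) ** 3

# Copper's Debye temperature is roughly 343 K (approximate textbook value).
for T in (5.0, 20.0, 50.0):
    print(T, debye_t3_molar_cv(T, 343.0))   # tiny at 5 K, still well below 3R at 50 K
```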
- Sbharris, you write "thermal energy is more than just kinetic [energy?]" and "[my] equation above does not apply to thermal energy in general, only to the kinetic energy". If I understand you correctly, then 'thermal energy' is not a function of temperature; if this is so then why is it called thermal? Further, is it dependent on temperature in any way i.e. what is it? --Damorbel (talk) 18:45, 21 March 2012 (UTC)
Degree of heat measured by temperature, versus amount of heat measured by heat transfer to a reference object
As described so clearly by Planck in the reference I provided on the main article page, classically a thermometer measured degree of heat, or the ability of heat to flow from one object to another of a lower temperature. But the thermometer gives no measure of the amount of heat that can flow, nor does it measure degree of heat accurately between the arbitrarily inscribed marks between the two reference temperatures that produce the scale of the thermometer.
To measure the amount of heat transferred by a tested object at say 100C, you need to transfer an amount of heat to a reference object at 0C, for example with water as the reference object, where you have reference temperatures of 100C and 0C for state changes in water and a reference heat content of water, and observe the temperature rise. You can then have a calibration table of known power transfers to the reference water to know how much energy was transferred to the water to create the observed temperature change. You then know how much heat energy was transferred from the object under test, and can then construct tables of relative sensible heat contents for certain temperatures.
So you have two different things.
1. Is a measure of hotness or degree of heat by temperature
2. Is a measure of amount of hotness or amount of heat, which involves using temperature.
So if you say that temperature is a measurement of heat you are not being clear about what you mean.
Similarly if you say that temperature is not a measure of hotness or degree of heat you are really mangling our language, to the point that nobody can understand what you are talking about unless they realise you have decided to use the word heat only for what we call amount of heat.
For example when we look at the picture of the Sun in the main article and it begins 'Heat generated by the sun' we are not looking at the amount of heat. We are looking at the degree of heat and we know that is incredibly hot. If you want to be picky the caption should be 'thermal energy generated by the sun, that is being transferred away from the sun as heat'?
Otherwise, maybe somebody who is very strict can help me with my understanding on why heat is being used in the first word of that description? Andrewedwardjudd (talk) 07:00, 20 March 2012 (UTC)andrewedwardjudd
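(Editorial note, not part of the discussion: a minimal numeric sketch of the water-calorimeter idea described in this section. The function name and example figures are my illustration, and the specific heat of water is an approximate textbook value; this is not a quotation of anyone's procedure.)
```python
# Infer the heat given up by a test object from the temperature rise of a
# known mass of well-stirred water (stirring work and losses ignored).
C_WATER = 4186.0  # specific heat of liquid water, J/(kg*K), approximate

def heat_absorbed_by_water(mass_kg, delta_t_k):
    """Q = m * c * dT for the water bath."""
    return mass_kg * C_WATER * delta_t_k

# Example: 0.5 kg of water warming from 20.0 C to 23.1 C implies the test
# object delivered roughly 6.5 kJ of heat.
print(heat_absorbed_by_water(0.5, 23.1 - 20.0))
```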
- Andrewedwardjudd, look again at your item 2. Parameters such as temperature generally measure only one thing, i.e. they have only one dimension. What temperature measures is energy intensity, Joules/K; it does not measure the system's 'total energy', which would be in Joules. Such 'total energy' in a system could easily comprise energy at different temperatures in different parts of the system; there is no requirement for a thermal system to have a uniform temperature. --Damorbel (talk) 07:38, 20 March 2012 (UTC)
- So we agree that temperature measures degree of heat or 'energy intensity' and you want me to look at 2. the amount of heat.
- We calculate amount of heat by observing what happens when a tested object of known 'energy intensity' heats a cold reference object, of known 'energy intensity', to a higher 'energy intensity'. We then repeat the heating of the cold reference object of known 'energy intensity' to the higher 'energy intensity' to find the amount of heat transferred, when we are heating using a known source of energy transfer.
- When we heat the test object we do so uniformly and, for example, insulate it very well while it cools to 100. We then place it into the very well mixed water which is continually stirred with a known amount of energy - and so forth. I was not wanting to say it would be easy to measure the amount of heat transferred with a high degree of accuracy.
- My point was that 'energy intensity' *at a surface* is different to 'amount of energy' *in the body* Andrewedwardjudd (talk) 08:12, 20 March 2012 (UTC)andrewedwardjudd
- You write "We calculate amount of heat, by observing what happens when a tested object of known 'energy intensity'...." Life will be simpler if we agree that 'energy intensity' = T!
- Further you have "calculate amount of heat"; if heat is measured by temperature then the 'amount' is how far above 0K the 'T' is. To raise 'T' it is necessary to add energy, and not only 'add energy': the energy has to go into the motion ((1/2)mv²) of the particles; if it just goes into melting ice (aka 'change of state') the temperature does not change.
- It is not easy to measure the amount of energy added to a system; if the only effect is to raise the system T it is (fairly) easy, but you still need to know the specific heat of the system receiving the energy, and you need some sort of calorimeter to measure the energy transferred. Please note that the calorimeter article is completely devoted to the caloric theory of heat, a theory exploded well over 150 years ago; the concept of measuring thermal energy still exists but is still difficult to do, whereas temperature measurement is regularly performed at the nanokelvin level!
- I certainly agree that "[T] *at a surface* is [can be] different to [T] *in the body*" --Damorbel (talk) 09:45, 20 March 2012 (UTC)
- The whole point of this section is to emphasise that the word heat has to be used in the correct context. I.e. heat is not measured by temperature. Degree of heat, or state of heat, is measured by temperature.
- Temperature measures energy 'intensity' **for that substance**, where two different substances with the same energy intensity have the same temperature.
- So think about that please. You select any substances you want and arrange them so that they all have the same energy intensity, and then you find they all have the same temperature of, say, 100C.
- But you have no idea how much heat was required to get those different substances to have the same energy intensity measured by 100C, unless you measured the amount of energy required to raise their temperature by, for example, heating with electricity and measuring the power consumed.
- For example a metre cube of polystyrene requires almost no heat to be heated to 100C. But a metre cube of water requires a huge amount of heat to be heated to 100C.
- Therefore:
- 'Temperature' measures relative hotness or degree of heat
- 'Amount of heat' measures the power/energy required to get objects to be the same relative hotness or temperature.
- These are two different things which involve the usage of the word heat.
- If we talk about heat flow, that has nothing to do with temperature until we specify the substance carrying the heat and the rate at which the substance is travelling or conducting. That kind of thing is required to know what the heat flow is; temperature alone is useless without the additional information - which we might intuitively know even if it is not specified. E.g. a massive amount of heat flowing from the 5000C sun.
- Andrewedwardjudd (talk) 11:46, 20 March 2012 (UTC)andrewedwardjudd
- You write:-"These are two different [scientific] things which involve the useage of the word heat" And this is allowed to persist in an encyclopedia? It speaks volumes about the competence of the editors! --Damorbel (talk) 17:15, 20 March 2012 (UTC)
- It seems you want the editors to all agree that temperature measures heat?
- Another set of editors are saying that heat means heat content and never meant anything else thermodynamically and only muddled stupid people think any differently
- Another set of editors are saying that historically temperature measured degree of heat or hotness, and the word heat was used with context to mean heat amount or heat quantity or heat amount was specified
- The way it works on wiki is you find references that support your point of view and attempt to get them inserted into the article
- So in my own tiny mind I am streets ahead of many of the other points of view here because I found a guru of thermodynamics who agrees with me about the historical record and nobody can possibly challenge what I am saying about the historical record. :-)
- As I said before the way forwards on wiki is to find quality indisputable references that people find it very difficult to delete without imposing their unsupported opinion on the article.
- By the way is there a copy of the Rief that you referred to and which is in the second reference supporting strictness online? Andrewedwardjudd (talk) 18:42, 20 March 2012 (UTC)andrewedwardjudd
- The Reif link I gave earlier, Fundamentals of Statistical and Thermal Physics, is to the Amazon 'Look Inside' facility for this book; you have to work your way to p. 269. But you can also find the same formula in Wikipedia [[4]]; it is one of those 'general truths' of thermodynamics. --Damorbel (talk) 20:27, 20 March 2012 (UTC)
- Dear Andrewedwardjudd, you write above: "Another set of editors are saying that heat means heat content and never meant anything else thermodynamically and only muddled stupid people think any differently." Dear fellow, you are the one who brought in the word stupid, and other editors haven't used it. It is reasonable to say that you are muddled because your clause "heat means heat content" entirely misrepresents the viewpoint of the editors you are railing against and shows that you are imperceptive of the questions of physics that are involved. Heat content is a very different concept from amount of heat transferred, and that you mix them as just quoted shows that your proto-thought process is not well trained to make the distinction.
- You write: "So in my own tiny mind I am streets ahead of many of the other points of view here because I found a guru of thermodynamics who agrees with me about the historical record and nobody can possibly challenge what I am saying about the historical record." Yes, indeed, it is in your own mind. Some other editors are familiar enough with Planck. One way in which you go astray is to imagine that the historical record is necessarily fully decisive on every question. Several of your statements involve the formal structure of the theory of heat. A historical source that expresses a particular view on the formal structure is primary, no matter how well respected. For a Wikipedia statement about the formal structure, secondary sources and tertiary sources explicitly about that matter are most important. This is because different historical sources express different ideas on the formal structure. Thus, although on matters of substantial physics, Planck counts as a secondary or even tertiary source, on matters of formal structure his view is only one amongst many, and thus his views on formal structure are only primary sources. There is a big split in views about the formal structure for the theory of heat. One major viewpoint is that of Planck and many other very respectable physicists; they think that amount of heat transferred should primarily be defined in terms of calorimetry. The currrently dominant and triumphalist doctrine follows the Bryan–Carathéodory line that amount of heat transferred should be defined by default from an assumption of the law of conservation of energy and from amount of energy energy transferred as work, without reference to calorimetry or to temperature; as I mentioned above under the head created by editor SBHarris, "Heat defined as energy transfer OTHER than work?! I don't think so!", for myself under present circumstances, I would not have the temerity to try to depart from the latter doctrine, which is supported for example by Reif. Editor SBHarris is of the very reasonable but also very contestable (as just mentioned) view that the definition based on this doctrine (in his words copy-and-pasted from above) is "stinking up the lede."
- There are significant problems for the Bryan–Carathéodory doctrine, as made clear enough in Carathéodory 1909. Difficulties in defining work and in dealing with radiation and with friction may be mentioned. To tackle the dominance of that doctrine, instead of making rather scatterbrained edits, you would do better to study and report on the secondary literature about those problems. It would be an uphill task to overthrow the doctrine from its present dominance in the Wikipedia.
- You write above: "So in my own tiny mind I am streets ahead of many of the other points of view here because I found a guru of thermodynamics who agrees with me about the historical record and nobody can possibly challenge what I am saying about the historical record. ... As I said before the way forwards on wiki is to find quality indisputable references that people find it very difficult to delete without imposing their unsupported opinion on the article." Here you are virtually convicting yourself of editing in bad faith. The difficulty in deleting your edits is that you have insufficient respect for the views of other editors and insufficient understanding of the questions involved, so that it is apparent to other editors that simple deletion your edits will be followed by your undoing of the deletion, without adequate response on your part in dealing with the flaws of your edits. The difficulty in deleting your edits is not as you suppose the authority of their proposed sources. It is apparent that you have not yet comprehended the Wikipedia reliable sourcing policy.Chjoaygame (talk) 00:12, 21 March 2012 (UTC)
- "The currrently dominant and triumphalist doctrine follows the Bryan–Carathéodory line that amount of heat transferred should be defined by default from an assumption of the law of conservation of energy and from amount of energy energy transferred as work, without reference to calorimetry or to temperature; as I mentioned above under the head created by editor SBHarris, "Heat defined as energy transfer OTHER than work?! I don't think so!", for myself under present circumstances, I would not have the temerity to try to depart from the latter doctrine, which is supported for example by Reif."
- So you agree in fact that the historical usage was different and that my observations are in fact totally valid and justified. Andrewedwardjudd (talk) 06:13, 21 March 2012 (UTC)andrewedwardjudd
- You write above: "So in my own tiny mind I am streets ahead of many of the other points of view here because I found a guru of thermodynamics who agrees with me about the historical record and nobody can possibly challenge what I am saying about the historical record. ... As I said before the way forwards on wiki is to find quality indisputable references that people find it very difficult to delete without imposing their unsupported opinion on the article." Here you are virtually convicting yourself of editing in bad faith. The difficulty in deleting your edits is that you have insufficient respect for the views of other editors and insufficient understanding of the questions involved, so that it is apparent to other editors that simple deletion your edits will be followed by your undoing of the deletion, without adequate response on your part in dealing with the flaws of your edits. The difficulty in deleting your edits is not as you suppose the authority of their proposed sources. It is apparent that you have not yet comprehended the Wikipedia reliable sourcing policy.Chjoaygame (talk) 00:12, 21 March 2012 (UTC)
- Without prejudice as to the validity of your observations, what also matters here is how you do your editing.Chjoaygame (talk) 09:34, 21 March 2012 (UTC)
- No. What matters is that editors do their best to present information from reliable sources accurately
- 1. You began by claiming the historical record did not exist
- 2. Earlier today you agreed the usage of heat was changed by at least some authorities
- 3. And now, even though you agree the historical usage has changed, after days of obfuscation and bullying you still cannot allow yourself to formulate an expression that concedes you were totally wrong. Instead you have to come up with the very very silly "Without prejudice as to the validity of your observations"
- 4. Why are you being so silly about this topic? You agree the historical usage was changed some time in the last century by some authors and you seem to know in extreme detail everything about it. That is really weird behaviour on your part. Andrewedwardjudd (talk) 12:06, 21 March 2012 (UTC)andrewedwardjudd
Subsection
The 'Greenhouse effect' article has difficulty with heat and thermal effects in general; frequently GHE editors do not engage in discussion, and some have been banned for abuse of contributions from other editors. --Damorbel (talk) 17:03, 22 March 2012 (UTC)
- If you are wondering what this funny little section is doing here, you might feel relieved to know that editors of the Greenhouse effect feel insulted when contributors point out that their so-called 'effect' attempts to drive a coach and horses through the 2nd Law of Thermodynamics, and have censored their contribution, "Cold upper greenhouse gases are heating the hotter lower levels with backradiation - or else!: removing edits by topic-banned editor"; see here and [[5]]. --Damorbel (talk) 12:53, 23 March 2012 (UTC)
- A mirror can reflect radiant heat without getting as hot as the thing it is heating. Does a mirror count as a thermal region? A lens can be used to focus light to heat an object to a higher temperature than the lens itself. Does a lens count as a thermal region? siNkarma86—Expert Sectioneer of Wikipedia 15:58, 23 March 2012 (UTC)
- You will have to explain what a 'thermal region' is. Sorry, I've never heard of such a concept. --Damorbel (talk) 16:26, 23 March 2012 (UTC)
- The top of this article says: "According to some authorities in physics, chemistry, engineering, and thermodynamics, heat can only be energy produced or transferred from one body, region, set of components, or thermodynamic system to another in any way other than as work.[1][2]" So a "thermal region" constitutes a region through which heat may exist. I hope that explains it. siNkarma86—Expert Sectioneer of Wikipedia 16:36, 23 March 2012 (UTC)
- As you say, 'thermal region' appears at the top of the article. Um, er, I'm still baffled!
- The article also has:- "A related and potentially confusing term is thermal energy, loosely defined as the energy of a body that increases with its temperature" (!) I'm even more baffled!
- I suggest that such 'stuff' is written by editors unfamiliar with any consistent theory of thermodynamics, so they write on the basis of what they learned on a creative writing course. --Damorbel (talk) 18:32, 23 March 2012 (UTC)
The term "thermal energy" needs homework
The term "thermal energy" is not a standard term strictly defined by standard texts that I am familiar with. I think some homework needs to be done on this term so that this term should be well sourced from reliable sources, or that it should be made explicitly clear in the article that it is not to be found in reliable sources, or that the term should be removed from the article.Chjoaygame (talk) 03:48, 24 March 2012 (UTC)
- The term is used in the literature and you can google it: for example [6]. Let's see you find me scholarly discussions of "heat storage" systems. Mostly they talk about "latent heat". Though, to be sure, I did find some papers that talked about heat storage AS thermal energy, which makes this rather hard to dissect. SBHarris 05:10, 24 March 2012 (UTC)
- Still I think the Wikipedia article entry on "thermal energy" needs proper homework or reliable sources. A mere Google search is not a safe way to find a reliable source. The Wikipedia is not a competitor with Google, which shows whatever one wants, without regard for reliability.Chjoaygame (talk) 05:50, 24 March 2012 (UTC)
- I do not have any difficulty with 'thermal energy' as a term in thermodynamics; in my experience it is the kinetic energy of a system of particles due to their motion. What causes confusion is naming non-kinetic (potential energy) processes such as condensation and crystallisation as 'forms of heat', e.g. latent heat. Latent heat is a misnomer; it arose even before the caloric theory was devised. What is called 'latent heat' is energy stored in one of the ways that does not change the temperature; with this argument it is quite possible to describe chemical energy as 'latent heat' for the same reason. Neither has anything to do with the definition of temperature, so the connection with heat is really nonexistent. --Damorbel (talk) 07:41, 24 March 2012 (UTC)
- Your "experience" is wrong except for helium and monatomic gases. The ordinary "heat content" or thermal energy content in ordinary materials like iron metal, is half potential energy. There is nothing "latent" about it. This type of potential energy rises linearly with temperature (at least it does, above the Einstein or Debye temps of the substance, which are close to each other). But it is as much potential energy as the energy of a phase change. Each time you spread your boloney, Damorbel, I'm going to correct it. SBHarris 18:09, 24 March 2012 (UTC)
- What is "There is nothing "latent" about"? Are you not aware that Black's 'latent heat is about melting (fusion) and boiling (evaporation). Substances changing state from solid to liquid; liquid to vapour (and reverse) need energy (or give up energy) without changing temperature. Black called it latent heat because there is no associated change of temperature with the melting/freezing or boiling/condensing; if the energy changes without temperature change then it is not a thermal process! (BTW, when describing something as 'bolony' I suggest your description will be more easily accepted if you support it). --Damorbel (talk) 21:35, 24 March 2012 (UTC)
- "Thermal energy" is stored also (as all agree) in substances that are heated and change temperature as a result. In sustances that do not change phases, but store thermal energy that increases with temperature. I said there is nothing "latent" about this type of "heat content" (more properly, thermal energy content). And yet, this type of thermal energy content is only 50% kinetic energy of atoms in solids. Is there something about this that you don't understand? And if not, why not? SBHarris 19:03, 25 March 2012 (UTC)
- What is "There is nothing "latent" about"? Are you not aware that Black's 'latent heat is about melting (fusion) and boiling (evaporation). Substances changing state from solid to liquid; liquid to vapour (and reverse) need energy (or give up energy) without changing temperature. Black called it latent heat because there is no associated change of temperature with the melting/freezing or boiling/condensing; if the energy changes without temperature change then it is not a thermal process! (BTW, when describing something as 'bolony' I suggest your description will be more easily accepted if you support it). --Damorbel (talk) 21:35, 24 March 2012 (UTC)
- The problem is not what you or other editors do or do not have difficulty with. The problem is to present material that has reliable sources. Homework is needed to find out whether reliable sources exist for the term "thermal energy".Chjoaygame (talk) 10:51, 24 March 2012 (UTC)
- If you adopt this reasoning you will quickly run into policy limitations, because it is perfectly possible to find 'reliable' sources, particularly historical ones, with every sort of definition. The current article on Thermal energy has a link to a book, Thermal analysis of materials, where it says (p. 2) "It would be inappropriate to refer to an object as having "heat"". Which is of course quite incorrect; the expression should be "it has a temperature of ...". The author then goes on to say: "Rather it would be stated that it has a certain temperature [good!] or a certain thermal energy [bad!]". Clearly this author has not thought the matter through: temperature is a local concept, energy is a bulk concept; they can never measure the same thing. The publisher wants £120.65 for this book; it is a rip-off! The author Robert Speyer teaches industrial ceramics at Georgia College of Engineering, so perhaps his current specialty is not thermal physics. --Damorbel (talk) 13:15, 24 March 2012 (UTC)
- Damorbel writes: "it is perfectly possible to find 'reliable' sources, particularly historical ones, with every sort of definition." This comment is self-contradictory or belies the nature of a reliable source. It is of the essence of a reliable source that it not be one of a set of "every sort" of thing. In favour of this comment is that it highlights the care needed in citing historical sources.Chjoaygame (talk) 20:40, 24 March 2012 (UTC)
- Yet he is right and you are wrong. Temperature is the bulk concept, since it requires a system of many particles, and is statistical. Energy may be bulk, or it can be local (you cannot talk about the temperature of one particle, but the energy of one particle is perfectly well-defined). SBHarris 18:09, 24 March 2012 (UTC)
Let's have the articles Heat (work) and Heat (energy)
Work and the technical definition of heat share something in common in that they are process quantities. Most people tend to think of heat as a kind of energy, as opposed to a kind of work. So how about this distinction then: Let's have the articles Heat (work) and Heat (energy). Anyone in favor say Aye! or Support.siNkarma86—Expert Sectioneer of Wikipedia
86 = 19+9+14 + karma = 19+9+14 + talk 04:01, 24 March 2012 (UTC)
- NayChjoaygame (talk) 04:28, 24 March 2012 (UTC)
- Aye: My contention is that it would be a good idea to separate the "work" and "energy" notions of heat accordingly. This way, we don't confuse people by mixing up statements applying to process quantities with those of state quantities. I suppose your contention is that we put these separate notions in the same article, potentially causing editors to insert these contrasting notions even in the same sections. I am in favor of a structure that resembles 11111 00000 and not one that resembles 10110 00101.siNkarma86—Expert Sectioneer of Wikipedia
86 = 19+9+14 + karma = 19+9+14 + talk 04:56, 24 March 2012 (UTC) - Nay Work is measured in Joules because it represents a change of energy. Heat is measured by temperature i.e. Kelvins, not energy i.e. Joules, so what are sections called Heat (work) and Heat (energy) supposed to contain? No support from me for this suggestion, sorry! --Damorbel (talk) 07:58, 24 March 2012 (UTC)
- "Heat is measured by temperature i.e. Kelvins, not energy i.e. Joules,"
- Are you really that oblivious to the fact that Q, the symbol for heat, has units of joules?siNkarma86—Expert Sectioneer of Wikipedia
86 = 19+9+14 + karma = 19+9+14 + talk 14:38, 24 March 2012 (UTC)
- "oblivious"? Of course is the symbol for 'heat energy' it is the energy due to the motion of particles in a thermodynamic system thus it should properly be called thermal energy. It is called energy because it is 'extensive' i.e. the more material (the larger the number of particles) the greater the energy. While it is true that the energy of such a system increases as the temperature of the particles increases there is no requirement for equilibrium so there is no restriction on the energy of the individual particles i.e. not only do the particles each have their own energy they must, according to the formula from Reif:- 1/2mv2 = 3/2KBT, have its own temperature. In addition, if the system is in equilibrium, thus the particles have an energy distribution according to the Maxwell-Boltzmann distribution then a system temperature exists. --Damorbel (talk) 11:01, 25 March 2012 (UTC)
- Q is the symbol for all kinds of heat whatsoever, and they are all measured in units of joules or some other equivalent energy unit. Sorry.SBHarris 18:45, 25 March 2012 (UTC)
- Is it possible for a system to have exactly a Maxwell-Boltzmann distribution, or does a system have no choice but to approach such a distribution, while never matching it exactly?siNkarma86—Expert Sectioneer of Wikipedia
86 = 19+9+14 + karma = 19+9+14 + talk 17:23, 25 March 2012 (UTC)- The M-B distribution is a statistical thing. The more particles in a system, the more closely will it approach this distribution. This has actually been measured by letting molecules of a gas through a tiny hole into vacuum and measuring their individual velocities one by one. By the time you get a few thousand, it's correct to less than a percent. A few million and far better than that. It's much the same statistical problem as taking a Gallup poll: how large does your sample need to be before it is accurate to a given tolerance? SBHarris 18:50, 25 March 2012 (UTC)
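A hypothetical Monte Carlo sketch of that sampling argument (the molecular mass and temperature below are assumed example values, nothing from the thread): draw velocity components from the Maxwell-Boltzmann distribution at a fixed temperature and watch the temperature inferred from the sample's mean kinetic energy tighten as the sample grows.
    # Hypothetical sketch: how closely a sample of N molecules pins down the temperature that generated it.
    import numpy as np

    k_B = 1.380649e-23    # J/K
    m = 4.65e-26          # kg, roughly one N2 molecule (assumed example)
    T_true = 300.0        # K, the temperature the samples are drawn from
    rng = np.random.default_rng(0)

    for n in (10, 1_000, 100_000):
        # Maxwell-Boltzmann: each Cartesian velocity component is Gaussian with variance k_B*T/m.
        v = rng.normal(0.0, np.sqrt(k_B * T_true / m), size=(n, 3))
        mean_ke = 0.5 * m * (v ** 2).sum(axis=1).mean()
        t_est = (2.0 / 3.0) * mean_ke / k_B          # invert <KE> = (3/2) k_B T
        print(f"sample of {n:>7} molecules: inferred T = {t_est:6.1f} K")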
- "oblivious"? Of course is the symbol for 'heat energy' it is the energy due to the motion of particles in a thermodynamic system thus it should properly be called thermal energy. It is called energy because it is 'extensive' i.e. the more material (the larger the number of particles) the greater the energy. While it is true that the energy of such a system increases as the temperature of the particles increases there is no requirement for equilibrium so there is no restriction on the energy of the individual particles i.e. not only do the particles each have their own energy they must, according to the formula from Reif:- 1/2mv2 = 3/2KBT, have its own temperature. In addition, if the system is in equilibrium, thus the particles have an energy distribution according to the Maxwell-Boltzmann distribution then a system temperature exists. --Damorbel (talk) 11:01, 25 March 2012 (UTC)
- "so what are sections called Heat (work) andHeat (energy) supposed to contain?"
- Heat (energy): Heat as "stored energy" or "heat content" or "thermal energy" (i.e. the colloquial usage of the term)
- Heat (work): Heat as a "transfer" of energy from one body to another, referring not to a substance, but the process of energy exchange by means of heat transfer.
- Is that clear enough?siNkarma86—Expert Sectioneer of Wikipedia
86 = 19+9+14 + karma = 19+9+14 + talk 14:45, 24 March 2012 (UTC) - It would be a shame indeed if good ideas go to waste simply because people didn't understand the merits of them at first glance.siNkarma86—Expert Sectioneer of Wikipedia
86 = 19+9+14 + karma = 19+9+14 + talk 16:51, 24 March 2012 (UTC)
- Nay Besides, there is a big problem with heat (work), which doesn't really mean anything. It's a bad synonym for heat transfer, since transferred heat often does no work at all. It has units that are the same as work, but it surely is NOT work. In fact, as noted above, in thermodynamics it is explicitly not work (thermodynamics). Heat cannot be converted to work unless you have a reservoir at 0 K, or you pay the entropy cost of destroying the heat in some other way. So it's a very bad idea as a name for an article. Heat (energy) suggests that there is some other kind of heat that is NOT energy, which is also wrong. SBHarris 17:55, 24 March 2012 (UTC)
Then what about Heat as energy, Heat as a transfer of energy?
- Why not?siNkarma86—Expert Sectioneer of Wikipedia
86 = 19+9+14 + karma = 19+9+14 + talk 18:50, 24 March 2012 (UTC)
- The proposal was not a move anyway. So I went ahead and defied convention. I have now split the articles. Your job now is to delete them. Enjoy.siNkarma86—Expert Sectioneer of Wikipedia
86 = 19+9+14 + karma = 19+9+14 + talk 19:30, 24 March 2012 (UTC) - I have been guided to the talk page. No satisfying explanation was made about my recent redirect. Apparently no detailed explanation is offered by the respondent, as explanations are certainly not forthcoming.siNkarma86—Expert Sectioneer of Wikipedia
86 = 19+9+14 + karma = 19+9+14 + talk 19:45, 24 March 2012 (UTC)
Splitting this page up without consensus to do so is unacceptable. Please don't do it again; it has been undone, wasting people's time. William M. Connolley (talk) 19:55, 24 March 2012 (UTC)
- Well you better find a way to prevent all the stupid arguments I have been seeing debated on this talk page for a long time! This isn't really about me. This is about saving the time of other people, not mine. I spent more time splitting the article than you have in reverting it.siNkarma86—Expert Sectioneer of Wikipedia
86 = 19+9+14 + karma = 19+9+14 + talk 20:02, 24 March 2012 (UTC)
- Even with one article, you can try to do an even stronger job of "differentiating" these two types of "heat" (one of which is historical) in the lede, and then do the "split" within the article, discussing what we now call "thermal energy content" under "historical heat content" or something. SBHarris 17:15, 26 March 2012 (UTC)
- I have found a text that lists "thermal energy" in its index. Lewis, G.N., Randall, M. (1923), Thermodynamics and the Free Energy of Chemical Substances, McGraw-Hill Book Company, New York. It lists occurrences on three pages 53, 73, and 126. On page 53 we read: "It is in fact the limit of any rational thermodynamic scale, a true zero where all thermal energy would cease; but as unattainable as the other limit of our scale, the infinite temperature." (Apparently these 1923 authors do not admit the zero-point energy as being 'thermal', although it was already on the map.) On page 73 we read: "This phenomenon is explained by assuming that in addition to the thermal energy of the atoms, certain electrons in these electropositive metals are held by such weak restraints that they also acquire thermal energy." On page 126 we read: "In this spontaneous irreversible process the energy of translational motion is converted into the energy of chaotic motion, which we call heat." (I thought you would like that.) In small print we read also: "The distinction between the energy of ordered motion and the energy of unordered motion is precisely the distinction which we have already endeavoured to make between energy classified as work and energy classified as heat." The actual phrase "thermal energy" does not appear on that page.
- On page 53, these authors also write: "There are two terms, "heat" and "work", that have played an important part in the development of thermodynamics, but their use has often brought an element of vagueness into a science which is capable of the greatest precision. It is not, however, convenient to avoid altogether the use of these terms, although unfortunately their best definition can be given only after the introduction of certain ideas which will be developed in a later chapter. For our present purpose we may say that when a system loses energy by radiation or thermal conduction it is giving up heat; and that when it loses energy by other methods, usually by operating against external mechanical forces, it is doing work." I did not find a concise definition of heat or of work later in the text, "after the introduction of certain ideas".
- I think these authors are out on their own in seeking internal "energy that can be classified as work", and that makes it problematic as to how they can find internal "energy classified as heat". It is obvious nonsense to suggest that the ideas of heat and work might be "avoided" in thermodynamics. Lewis also proposed that photons are conserved inside atoms. I think this text is not a reliable source for Wikipedia to establish a term "thermal energy".Chjoaygame (talk) 21:01, 26 March 2012 (UTC)
- I would not trust a 1923 science text as reliable, except for what science believed in 1923. Too much changes. However, the idea that an object can contain internal energy that is "work" is in reference to work (thermodynamics), which simply classifies all energy that isn't heat as work. So adding gasoline to your car or charging its battery is doing work on it, and increasing its energy. And doing so in a way that is fully reversible: it can give you the work back with 100% efficiency (at least so far as entropy is concerned) because you did NOT add it thermally, and it was not stored thermally. So I see where they are going. Any energy you add without the entropy of the system increasing is, in thermodynamics, called "work", even if some of it is things like adding gasoline rather than pushing or pulling on a rubber band (which adds potential energy to a system that you can completely get back). And of course Lewis is wrong about photons. He named them, but he's wrong about them being conserved. SBHarris 22:28, 26 March 2012 (UTC)
- I suppose you intend to get the potential energy back by raiding the gasoline tank. But what if there was already some ethanol in the tank when you poured in the gasoline? You would need a suitable semipermeable membrane and a long time to get your potential energy back? Do you intend some other way of getting the potential energy back?Chjoaygame (talk) 23:51, 26 March 2012 (UTC)
- Ah, good caveat. Indeed, if I added the chemical so that there was some entropy of mixing ΔS, then I have now lost at least TΔS from my free energy, and can never get it back as work. So heat is not the only way for energy to be lost to useful work forever. Anything you do that increases entropy will do it. That's one reason to define "work" as any addition of energy that doesn't increase entropy; but if you define it as any addition which isn't heat, you can get nailed by just such an example as we're discussing. If you define thermodynamic work as any process that increases energy other than thermally, then you can in theory add some thermodynamic work that doesn't wind up as heat, but still ends up degraded and is not a change in free energy. Many sorts of expansion and mixing processes can result in loss of free energy just as though you allowed your energy input to be degraded into heat. But I'm pretty sure that thermodynamic work on a system is meant to be fully reversible. That needs checking. SBHarris 00:13, 27 March 2012 (UTC)
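For a concrete number on that mixing caveat, the ideal entropy of mixing ΔS_mix = -nR Σ x_i ln x_i can be evaluated for assumed, purely illustrative amounts (nothing below comes from the discussion):
    # Sketch: ideal entropy of mixing and the resulting loss of free energy T*dS (assumed example amounts).
    import math

    R = 8.314           # J/(mol*K)
    T = 298.0           # K
    n_fuel = 8.0        # mol, assumed amount poured into the tank
    n_ethanol = 2.0     # mol, assumed amount already there
    n_total = n_fuel + n_ethanol

    fractions = [n_fuel / n_total, n_ethanol / n_total]
    dS_mix = -n_total * R * sum(x * math.log(x) for x in fractions)   # J/K, ideal entropy of mixing
    print(f"dS_mix ~ {dS_mix:.1f} J/K, so about T*dS ~ {T * dS_mix / 1000:.1f} kJ "
          "of free energy is no longer recoverable as work")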
- I haven't checked the Wikipedia article on work. I have enough to keep me busy without that.
- The usual definition of work is in terms of mechanical or mechanical-like (electric currents, laser light, chemical potential energy, etc.) variables of an external system. It doesn't concern itself with entropy or heat; there is no mention of them in the definition of work, at least not directly. (If the Wikipedia article on work says the contrary—as I think you said it does?—it is way off beam, and needs correction; but I don't want to know about it; I have other fish to fry.) That is the key to the Bryan-Carathéodory-Born doctrine that is accepted by many keen theoreticians today. They say "Oh, we will consider changes that can be fully described by those mechanical or quasi-mechanical external variables, and we will use "adiabatic" partitions that are perfectly reflective and do not allow any factor other than those mechanical or quasi-mechanical external variables to affect the system. These will tell us all about work." Then they say "There are however changes that are not explained by work, that occur when we remove the magic adiabatic partitions. These changes will admit the idea of heat transfer." Carathéodory admits that there are some fuzzy edges here, to do with radiation, gravity, turbulence. There are problems in defining work for open systems, but they are not prohibitive.
- But the definition of work does not require that the entropy of our system not be increased by it, nor that the work be recoverable as work; far from it. Thermodynamic work on a system is not required to be fully reversible and often is not reversible, even when administered infinitely slowly. The essence of work is just that it be administerable by externally controllable macroscopic mechanical or quasi-mechanical devices. Vital here is that the number of mechanical and quasi-mechanical external devices is small enough to count on your fingers and toes; not millions of them as for the positions and velocities and states of the constituent particles, thermodynamic information about which is carried by temperature and entropy.
- The Bryan-Carathéodory-Born doctrine is perfectly logical and much beloved of theoreticians for that reason. They will hear of no objection to it. They look down their noses at people who like to think that the law of conservation of energy was not secure until calorimetric measurements had been empirically related to mechanical ones without prior postulation of the law of conservation of energy. That means that calorimetric measurements are empirical facts which are very hard to reproduce without calorimetry of some kind and, at the same time, without simple postulation of the law of conservation of energy. The theoreticians say "Oh, we know that all matter is composed of atoms and the like, and that Newton's laws apply to it and establish the law of conservation of energy for all matter, and this has to include heat. Don't waste our time with bleating about calorimetry as being hard to reproduce by methods that are non-calorimetric and do not postulate in advance the law of conservation of energy." They are very proud of their understanding, and will brook no criticism. They just don't care about empirical approaches that actually deliver the goods for their theories.
- For the definition of heat as energy transferred other than by work, this is not a problem. Any problem lies in the definition of work. If transfer by work cannot be defined for a process, then a fortiori no question of definition of heat arises, with this definition of heat. If one is looking empirically for ways of identifying a transfer as heat, one checks the list of all one's known mechanical and quasi-mechanical variables that account for work transfers, and if one's candidate transfer is not on that list one is likely to recognize it as a heat transfer. That does not commit one to denying that one's transfer can be measured by calorimetry. The theoreticians will carry on that calorimetry is really a disguised mechanical measurement, postulating a priori the law of conservation of energy; that is their privilege and pleasure, but one can still do the calorimetric measurement, and it may be the only empirical way open in some situations.Chjoaygame (talk) 12:51, 27 March 2012 (UTC)
Well, we do in fact have two articles on work. One is work (physics), which is synonymous with force x distance = mechanical work. It probably includes (as a subset) electrical work, even though when you charge a battery that's hardly mechanical work. The action of electrical fields on charges, however, gives a "force", and one actually does move such a charge through a distance when charging a battery, so this is very similar. Other types of work with forces other than mechanical/contact forces (like gravitational work when a book falls off a table inside a closed room) can be treated similarly.
The other article is work (thermodynamics), which defines this term as any energy change to a system that isn't a thermal/heat change. So this "work" includes chemical potentials in systems, which are not mechanical work, but are something like changing a dead battery for a fresh one in a system, rather than charging the dead one from a battery charger (which of course does electrical work on it). Whether changing a battery is using a "mechanical and/or quasi-mechanical external device with a number small enough to count on your fingers and toes" depends on your perspective. A fresh battery is ONE thing, to be sure, but it consists of many moles of fresh/different atoms all in a different state, so in that sense, you've done a lot of switching. The fact that in a battery you can do this all at the "same time", instead of by means of a zillion different mechanical "strikes" of hot atoms (heat transfer), is related to the fact that changing a battery, like charging a battery, doesn't change entropy (much), and need not change it at all.
The problem with "radiation" (let us just examine electromagnetic radiation) is that you can add it thermally or not. Clearly, if you irradiate a cold system with radiant heat at some blackbody temperature (like the Sun does for Earth) you are adding "heat." But if you irradiate the body with a monochromatic laser beam, you are not. (To be sure, such energy can be converted to heat immediately, but unlike the other heat, it need not be, and could be stored (in theory) a photon at a time to increase some chemical potential and never appear as thermal energy at all). So, light energy delivered by laser has no entropy and no temperature. What kind of work is it?
Much the same thing happens with molecular/atomic impacts. You can warm a system by bathing it in a gas at some temperature, and that guarantees heat transfer. But if, instead, you project molecules at it, each one of which is at exactly the same velocity, then you're not transferring entropy unless some process makes that happen. You could in theory (even without a Maxwell Demon) catch all those atoms, or charge them, and use their monochromatic kinetic energy to raise the potential of the system in a way that didn't increase its entropy. Put these zillions of atoms into lumps, and it's like firing bullets at the system. By the time you get to one "bullet" (or one piston) it's clear you now are doing mechanical work on it. So where does the "fingers and toes" number come in, inasmuch as you can divide your mechanical impacts into as many subunits as you like, just as in the case of the (one) fresh battery composed of many, many new chemically altered atoms? Again, the fact that you CAN do this has to do with the low phase-space content of your addition. That's what allows you to collect it into a "fingers and toes" number, like with ONE battery or ONE bullet or ONE piston (or ONE set of photons all in the same state, as in a laser beam). But you don't HAVE to do it. With heat, you do have to do it; the fact that you can't collect the hot particles in the same way, or transfer their kinetic energy in the same way, is intrinsically connected with the higher entropy of the (kinetic) energy that they carry.SBHarris 18:13, 27 March 2012 (UTC)
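A rough sketch of the radiation comparison in the paragraph above, using the standard result that blackbody radiation of energy U at temperature T carries entropy S = 4U/(3T), while an ideal monochromatic, coherent beam carries essentially none; the 5800 K figure is an assumed illustrative temperature, roughly that of the solar photosphere.
    # Sketch: entropy carried by 1 J of "thermal" radiation versus 1 J from an idealized laser.
    U = 1.0             # J of radiant energy, example amount
    T_source = 5800.0   # K, assumed illustrative source temperature

    s_blackbody = 4.0 * U / (3.0 * T_source)   # J/K, entropy of blackbody radiation of energy U at T
    s_laser = 0.0                              # idealized coherent beam: essentially zero entropy

    print(f"1 J of ~{T_source:.0f} K blackbody radiation carries S ~ {s_blackbody:.1e} J/K")
    print(f"1 J from an ideal laser carries S ~ {s_laser} J/K, which is why it can count as work")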
- While the pioneers would say that heat transfer was a primary idea, I don't think they would take such an extremely ideological approach as to make it define work. I think the idea would not occur to them to do so. If pressed on the point, I think, they would regard work as also a primary idea in its own right. They derive the idea of internal energy from the combination of the two primary ideas, work and heat transfers. Without them, well defined and distinguished, thermodynamics doesn't move.
- A separate Wikipedia article on work for physics as distinct from thermodynamics should not be necessary according to the Bryan-Carathéodory-Born tradition. In the Bryan-Carathéodory-Born tradition, the thermodynamic definition of work is primary and is specifically constructed to entirely avoid all mention of heat, temperature, and entropy. If the Wikipedia article on thermodynamic work simply defines it, as you seem to say (I am not going to read it because I don't want to be distracted from other things right now), as energy transfer other than by heat transfer, then the Bryan-Carathéodory-Born tradition boys would be livid if they read it. This seems like a very serious inconsistency in the Wikipedia, because the Bryan-Carathéodory-Born tradition boys hold the field in many thermodynamics articles.
- As for how many fingers and toes, the usual idea is of "separation of time scales". This is another way of saying what you said above, that "With heat you have to do it." Thermodynamics is about processes which are so fast at the molecular level that they are more or less in at least local thermodynamic equilibrium while the measurement process is just warming up in preparation for its task. Likewise for the number of fingers and toes: the counting goes 1, 2, 3, ...., too many to count. No one is considered who has enough fingers and toes to count the molecules. You might say that we also require "separation of number of variables", just as we do for time scales. The intermediate numbers of variables, between say 10^2 and say 10^10, just aren't imagined or allowed to be considered. If one wants to consider them, then one is banished from the macroscopic world of thermodynamics to another world that is called mesoscopic, or to a small-ensemble branch of statistical mechanics that does not go near the thermodynamic limit. There the fluctuation formulas are different. We just don't go there. You, like perhaps at times Count Iblis (I hope I don't misrepresent him), seem to want to put things in such general terms that the macroscopic, the mesoscopic, and the microscopic viewpoints are all comprehended in a single scheme of presentation. That is logical but not easily done, and mostly people don't try to do it; they, and we here, usually settle for just one of the three for any particular presentation.Chjoaygame (talk) 23:57, 27 March 2012 (UTC)
undid good faith edit
The English language does not require an article (such as 'the' or 'a') by automatic default. An article is not needed here, and it is idiomatic not to use it in this case.Chjoaygame (talk) 10:56, 24 March 2012 (UTC)
- For comparison, what you want
- Quantity of heat transferred can be estimated ...
- versus what you say is wrong
- The quantity of heat transferred can be estimated ...
- I disagree. Q Science (talk) 13:24, 24 March 2012 (UTC)
- Q Science is right. Compare the sentences:
- "Tree grows."
- "The tree grows."
- Only the second sentence is proper English. This is despite the fact that I find the first sentence more concise and clear. Many Chinese drop the unnecessary "the"'s when speaking English, which I think is more efficient, but it's not good English to speak that way.siNkarma86—Expert Sectioneer of Wikipedia
86 = 19+9+14 + karma = 19+9+14 + talk 14:53, 24 March 2012 (UTC)- Yep. Chjoaygame, if you are Russian or Chinese or something, stick to the math and content, and leave the grammar on en.wiki to native speakers. "Quantity of heat transferred can be estimated..." is not completely incorrect, and could as easily be "Amount of heat transferred..." There are a few words of this type that don't absolutely need an article. But putting the article in, is never wrong, and is smoother. SBHarris 17:47, 24 March 2012 (UTC)
- If we're talking about [7] I can't see it matters a lot either way. I think C may be technically right, but using the "The" reads more naturally William M. Connolley (talk) 19:59, 24 March 2012 (UTC)
Thank you for your comments. I think that nature and art agree here. The sentence in question is "Quantity of heat transferred can be estimated by ..." The word 'the' would point to an instance already defined, which does not exist for this sentence. The word 'a' would be permissible, because this sentence might be regarded as creating an instance, but, I think unhappily, it would seem to demand a further 'a' for the "direct measurement" and perhaps for the "heat" and perhaps a "some" for the "calculations". Better with no article, I feel.Chjoaygame (talk) 20:31, 24 March 2012 (UTC)
- The quality of mercy is not strained.
- It droppeth as the gentle rain from heaven
- Upon the place beneath. It is twice blest:
- W. Shakespeare (The Merchant of Venice)
- COMMENT: The quality of good writing is not strained, either. It droppeth as the gentle rain from Heaven upon that ear that hath discernment to hear it. SBHarris 18:40, 25 March 2012 (UTC)
- SBHarris, this still worries you, I see. I am sorry. You have got me worried about it too. When one starts to worry about such a thing, one's feeling about what is natural can become unsettled, the intellect irradiating into and straining the habitual intuition. I have proposed above that the word 'a' would be permissible, but I have jibbed at 'the'. You seem to have said that 'amount of heat transferred' would not demand a 'the', and you have said that what I wrote is not completely incorrect. I suppose I will continue to be worried by the question of whether I should soothe by putting in an 'a'.Chjoaygame (talk) 20:22, 26 March 2012 (UTC)
reason for undoing redirect
I have undone the redirect by Kmarinas86 of 19:41, 24 March 2012. That redirect was not nearly adequately justified and might be considered as some kind of misbehaviour. A redirect like that would need plenty of notice and consensus.Chjoaygame (talk) 19:58, 24 March 2012 (UTC)
- I agree. I've drawn it to the attention of an admin User_talk:NuclearWarfare#Heat William M. Connolley (talk) 20:06, 24 March 2012 (UTC)
- The claim of inadequate justification does not itself seem justified.siNkarma86—Expert Sectioneer of Wikipedia
86 = 19+9+14 + karma = 19+9+14 + talk 20:52, 24 March 2012 (UTC)
- At least "the claim of inadequate justification" is on the talk page, and is not an edit of the article. But I think the claim is justified.Chjoaygame (talk) 21:00, 24 March 2012 (UTC)
- The claim of inadequate justification does not itself seem justified - it seems rather well justified to me. You proposed the split at 18:50, 24 March 2012; within 30 mins, someone had said no. 18 minutes later you went ahead anyway [8], having made the rather bizarre and hard-to-read-as-good-faith comment Your job now is to delete them. Enjoy. You certainly did not leave time for adequate discussion before acting, and when you acted, the only other comment was opposed William M. Connolley (talk) 21:25, 24 March 2012 (UTC)
- At least "the claim of inadequate justification" is on the talk page, and is not an edit of the article. But I think the claim is justified.Chjoaygame (talk) 21:00, 24 March 2012 (UTC)
Another edit reverted
http://en.wikipedia.org/w/index.php?title=Heat&diff=483739254&oldid=483738800
I was told to see the talk page about this one, but I see nothing here about it yet.siNkarma86—Expert Sectioneer of Wikipedia
86 = 19+9+14 + karma = 19+9+14 + talk 20:50, 24 March 2012 (UTC)
reason for undoing of good faith edit
I don't think the edit that read "Estimates of the transfer of heat can be conducted ..." is better. The added word "conducted" serves only a grammatical function, not a substantial one; likewise for the added word "methods". And the removed word "quantity" gave direct continuity with the preceding sentence. And the subject of the sentence was made to be "transfer of heat" instead of "quantity of heat transferred", which is not an improvement, because the key idea here is "quantity".Chjoaygame (talk) 20:54, 24 March 2012 (UTC)
request for comment on edit for meaning of heat
There is a new edit of the lead of the article on heat. The new edit is concerned with the possible meanings of the word heat and how they should appear in the article. There arise concerns about the present-day and about classical physical principles, and about historical readings.
The newly edited words are
- According to some authorities in physics, chemistry, engineering, and thermodynamics, heat can only be energy produced or transferred from one body, region, set of components, or thermodynamic system to another in any way other than as work.[1][2]
- However, historically, when the modern theory of heat was developed, heat was regarded as internal kinetic energy of molecular movement or what Joule called living force, Clausius called vis Vitae and Kelvin and others called a dynamical force. Heat was therefore a force in a body that could do work, and there was no need for a separate term to describe thermal energy.
The words which the new edit has replaced are
- In physics, chemistry, engineering, and thermodynamics, heat is energy produced or transferred from one body, region, set of components, or thermodynamic system to another in any way other than as work.[1][2]
- ^ a b Bryan, G.H. (1907). Thermodynamics. An Introductory Treatise dealing mainly with First Principles and their Direct Applications, B.G. Teubner, Leipzig, page 47.
- ^ a b F. Reif (2000). Fundamentals of Statistical and Thermal Physics. Singapore: McGraw-Hll, Inc. p. 67. ISBN 0-07-Y85615-X.
The questions at issue are whether this new edit should stand as it is in the lead, and whether its content or similar content should be entered somewhere in the body of the article, from which a summary entry might be put into the lead, and what sourcing is needed.Chjoaygame (talk) 07:18, 19 March 2012 (UTC)
- Since the above request note was posted, the questioned edit has been further edited. The edited words now read
- According to some authorities in physics, chemistry, engineering, and thermodynamics, heat can only be energy produced or transferred from one body, region, set of components, or thermodynamic system to another in any way other than as work.[1][2]
- However, historically, when the modern theory of heat was developed, heat was regarded as internal kinetic energy of molecular movement or what Joule called living force, Clausius called vis Vitae and Kelvin and others called a dynamical force. Planck[3] described temperature as degree of heat, or state of heat, and distinguished temperature, or degree of heat from quantity of heat. Heat was therefore a force in a body that could do work, and there was no need for a separate term to describe thermal energy.
- ^ Bryan, G.H. (1907). Thermodynamics. An Introductory Treatise dealing mainly with First Principles and their Direct Applications, B.G. Teubner, Leipzig, page 47.
- ^ F. Reif (2000). Fundamentals of Statistical and Thermal Physics. Singapore: McGraw-Hll, Inc. p. 67. ISBN 0-07-Y85615-X.
- ^ Max Planck (Nobel Laureate, 1918), Treatise on Thermodynamics, Part I, Fundamental Facts and Definitions, Chapter I, Temperature. Page 1: "1. The conception of "heat" arises from that particular sensation of warmth or coldness which is immediately experienced on touching a body"; "For quantitative result we use the change of volume"; "a purely mechanical observation affording a much greater degree of accuracy. We can also tell accurately when a body assumes a former state of heat"; "2. If two bodies, one of which feels warmer than the other, be brought together....it is invariably found that the hotter body is cooled"; "From this follows the important proposition: If a body, A be in thermal equilibrium with two other bodies, B and C, then B and C are in thermal equilibrium with each other". Page 2: "3. These facts enable us to compare the degree of heat of two bodies". Page 34, Chapter III, Quantity of heat: "44. If we plunge a piece of iron and a piece of lead, both of equal weight and at the same temperature (100C), into two precisely similar vessels containing equal quantities of water at 0C, we find that,....the vessel containing the iron has increased in temperature more than that containing the lead." "This phenomenon leads to a distinction between temperature and quantity of heat." http://books.google.fi/books?id=kOjy3FQqXPQC&printsec=frontcover&dq=treatise+on+thermodynamics&hl=en&sa=X&ei=2-1mT4U_wvrhBLKGtakI&redir_esc=y#v=onepage&q=treatise%20on%20thermodynamics&f=false
- Following immediately on the edited words, the article has the words:
- ^ Oxford English Dictionary, second edition, Oxford University Press, Oxford, UK.
- ^ Planck, M. (1897/1903), p. 1, "This direct sensation, however, furnishes no quantitative scientific measure ..."
- ^ Truesdell, C. (1980). The Tragicomical History of Thermodynamics 1822-1854, Springer, New York, ISBN 0-387-90403-4, page 15: "What they meant is not always clear."
- ^ "A review of selected literature on students' misconceptions of heat" (PDF). Boğaziçi University Journal of Education. 20 (1): 25–41. 2003.
- ^ Brookes, D.; Horton, G.; Van Heuvelen, A.; Etkina, E. (2005). "Concerning Scientific Discourse about Heat" (PDF). 2004 Physics Education Research Conference. 790. AIP Conference Proceedings: 149–152. doi:10.1063/1.2084723.
- The new version of the edit is still subject to the request for comment.Chjoaygame (talk) 09:33, 19 March 2012 (UTC)
Rethink I decline to comment on the preferable form of the definitions etc and their relative equivalence or superiority. I do however object to the discussion of the history or any other details that do not directly assist in the immediate understanding of what the core subject matter of the article is intended to cover. You could say what heat is, as the first paragraph does, and you might add brief statements of what heat is not (with suitable links to terminology). You might for example say briefly in what way heat is not temperature or internal energy etc, but that is that. All the other stuff, if you do not think that it is good to confine the discussion to respective related sections, can go into an introductory overview section. Most of the current lede belongs in such an overview section if it is to be retained at all. Keep the lede short enough and specific enough to satisfy anyone who elects not to read the article after having read the lede, and informative enough to keep anyone reading who has any functional interest or curiosity concerning the matter. All that other stuff should be kept out of the lede. IMBNMHO JonRichfield (talk) 19:24, 10 April 2012 (UTC)
- Thank you for this comment. I will bear it in mind. The article needs reform, but I hope it will be measured.Chjoaygame (talk) 01:50, 11 April 2012 (UTC)
Heat and temperature and pressure of one particle
- Sbharris, how am I wrong? Heat can be the energy of the smallest particle (have you checked what the Boltzmann constant is?). Have you never heard of a 'hot electron' effect or device, such as a Flash memory? In any physical system of particles there is no requirement for the particles to have any defined energy; that is the importance of the equation from Reif: 1/2 mv^2 = 3/2 k_B T. What it means is that each particle has a temperature corresponding to its individual energy; that is the basis of statistical mechanics, as is the random distribution of particle energy at equilibrium! --Damorbel (talk) 21:35, 24 March 2012 (UTC)
- I repeat myself. The v in your equation isn't the velocity of any single particle, but only a mean value ⟨v⟩ over many particles. The book you got it out of will tell you that. If you put "temperature" on one side, which has meaning only as a mean value of kinetic energies, then you get a mean value of velocities on the other side whenever you use T. This equation never applies to single particles, and the Boltzmann constant does not imply that it does. Remember me telling you that just because the U.S. has an average income per person, that does not mean that this NUMBER has ANY meaning for any ONE person? And yes, there are people who speak loosely of hot electrons and hot atoms, but it is bad physics to do so. It is rather like people speaking of mass being converted to energy. Strictly speaking it never is (matter is, mass is not), but even physicists can be loose with language. SBHarris 18:57, 25 March 2012 (UTC)
- Sbharris you write: "The v in your equation isn't the velocity of any single particle". Why ever not? Are you saying that there is some kind of limit below which a single particle does not have an identifiable velocity? I think kinetic theory makes it very clear that particle energy is independent of its size (mass), and the theory behind Johnson noise makes it abundantly clear that this extends to electrons. Johnson noise theory is perfectly good physics; currently it is being used to redefine the Boltzmann constant. Do you not accept the arguments for Johnson noise, or do you maintain that it is 'bad physics'? --Damorbel (talk) 21:13, 25 March 2012 (UTC)
- The v is the statistical mean velocity with respect to the center of mass. Many individual molecules have higher and lower velocities, but they all, as a group, have the same temperature. Trying to assign a temperature to a velocity just does not work. It only makes sense when associated to the mean velocity of a larger number of molecules after the velocity of the center of mass is removed. Q Science (talk) 21:37, 25 March 2012 (UTC)
- Yes. And the reason "why ever not?" is that particles in a single object of a given stable (single) temperature have a distribution of velocies, from low to high (in a gas, this is the Maxwell-Boltzmann distribution). Just like people rich to poor can be found in a country with a single "mean income." SBHarris 23:57, 25 March 2012 (UTC)
- Given how potential energy must be treated as a function of at least two entities (particles, waves, and/or fields), it is strange that the same is not done for kinetic energy. Perhaps if kinetic energy were defined in terms of a relationship between at least two particles, waves, and/or fields, then it might be argued that for a single particle, wave, or field to have kinetic energy unto itself makes as much (or as little) sense as it having its own potential energy, without any other entity relating to it. Maybe "temperature" is something characteristic of the rate of change of fields manifested by particles in relative motion, as opposed to motion with respect to some arbitrary reference.siNkarma86—Expert Sectioneer of Wikipedia
86 = 19+9+14 + karma = 19+9+14 + talk 00:29, 26 March 2012 (UTC)
No, kinetic energy in systems behaves much more like potential energy, since more than two particles not moving in the same direction are involved, and this results in an invariant quality that includes some kinetic energy (unlike the case with single particles). The system gets a certain invariant mass which is related to the system/object's COM frame, exactly as with a potential energy system (where potential energy stored in the system adds to invariant mass also). The kinetic addition in systems happens even in systems with a temperature and no potential energy at all, like a (massless) bottle of hot monatomic ideal gas (for example). Kinetic energy (KE) systems have a certain minimal KE in their COM frame, and you can't go below that, so it's not entirely relative, once you get a system of more than two particles in different directions. And the COM temperature is the maximal object temperature from any reference frame, also. However, temperature, unlike invariant mass is not Lorentz invariant. Temperature is a scalar, and I know of no invariant scalars, except for invariant mass. There is an invariant "heat 4-current," with 3 components that are the usual spacial-direction heat flows/currents, and the time component is the thermal energy in the COM frame. But I don't know what the 4-vector equivalent of this is, for temperature. SBHarris 00:50, 26 March 2012 (UTC)
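A toy numeric sketch of the invariant-mass point just made (all numbers below, the atomic mass, particle count and temperature, are assumed for illustration): in the COM frame the gas has essentially zero net momentum, so its invariant mass is its total energy, rest energy plus thermal kinetic energy, divided by c².
    # Toy sketch: invariant mass of a box of monatomic ideal gas in its COM frame (assumed example numbers).
    k_B = 1.380649e-23     # J/K
    c = 2.99792458e8       # m/s
    N = 6.02214076e23      # number of atoms, one mole (assumed)
    m_atom = 6.64e-27      # kg, roughly a helium atom (assumed)
    T = 300.0              # K (assumed)

    rest_energy = N * m_atom * c ** 2
    thermal_ke = N * 1.5 * k_B * T                    # translational kinetic energy of a monatomic ideal gas
    m_system = (rest_energy + thermal_ke) / c ** 2    # invariant mass in the COM frame (net momentum ~ 0)

    print(f"rest mass of the atoms      : {N * m_atom:.6e} kg")
    print(f"thermal kinetic energy      : {thermal_ke:.1f} J")
    print(f"extra invariant mass from it: {thermal_ke / c ** 2:.2e} kg  (tiny, but part of m_system)")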
- "No, temperature behaves much more like potential energy, since more than two particles not moving in the same direction is involved"You said "No" but right after that, you said something that is basically along the same lines of what I'm suggesting. Why did you do that?siNkarma86—Expert Sectioneer of Wikipedia
86 = 19+9+14 + karma = 19+9+14 + talk 01:02, 26 March 2012 (UTC)- Because I screwed up and said "temperature" when I really meant to write "kinetic energy in systems". Sorry. Kinetic energy only behaves oddly for single particles. Get two or more, and it starts to behave like potential energy. And when we talk about temperature, of course we always have plenty enough particles to make KE look like potential. SBHarris 16:00, 26 March 2012 (UTC)
This discussion is merely repeating the discussion about the energy of a single particle at the Boltzmann constant [[9]]. And if anyone wishes to insist that the expression from Reif, 1/2 mv^2 = 3/2 k_B T, 'doesn't apply to a single particle', I expect them to explain what minimum number of particles is needed to make it valid; at present I am quite unable to see how this minimum number can be different from '1'. --Damorbel (talk) 06:08, 26 March 2012 (UTC)
- If there is only one atom, then there will be no collisions and its temperature is undefined. Q Science (talk) 20:27, 26 March 2012 (UTC)
- Right after you explain what minimum number of people in a Gallup poll is needed to make it "accurate." What % of people will vote for Obama in 2014? When you get that, you'll get the other problem. Actually the Gallup poll is more complicated. You need to find any statistical sampling problem in which you already know the distribution beforehand, and have to ask how many samples you need to determine the distribution curve and its mean, to a certain tolerance. That is it. The temp is the kinetic energy mean of the pool. So, what size need your sample be? SBHarris 16:00, 26 March 2012 (UTC)
- Really Sbharris! How can you make such a comparison between the random selection procedure needed for a Gallup poll and the random outcome of a Maxwell-Boltzmann distribution? There can be no relevant connection, and I am only too happy that you haven't tried to charm us all with one. No, although the Maxwell-Boltzmann distribution is relevant to an instantaneous image of the energy distribution, the long-term average energy (and temperature) of every particle of the sample is the same as the average energy of the Maxwell-Boltzmann distribution. So, long term, the particles all have the same energy and temperature; it is just that the energy (and temperature) of each particle fluctuates rather wildly in the short term because of the collision process, get it? --Damorbel (talk) 18:21, 26 March 2012 (UTC)
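A cartoon of the "fluctuates wildly in the short term, averages out in the long term" picture, assuming a tagged particle that is simply re-drawn from the Maxwell-Boltzmann distribution after every collision; this is only a statistical caricature, not real collision dynamics, and the mass and temperature are assumed example values.
    # Cartoon: a single tagged particle re-thermalized by a bath at T after each "collision".
    import numpy as np

    k_B = 1.380649e-23    # J/K
    m = 6.64e-27          # kg, roughly a helium atom (assumed)
    T = 300.0             # K, bath temperature (assumed)
    rng = np.random.default_rng(1)

    n_collisions = 100_000
    # After each "collision" the tagged particle's velocity is a fresh Maxwell-Boltzmann draw.
    v = rng.normal(0.0, np.sqrt(k_B * T / m), size=(n_collisions, 3))
    ke = 0.5 * m * (v ** 2).sum(axis=1)

    print(f"(3/2) k_B T target            : {1.5 * k_B * T:.3e} J")
    print(f"KE right after one collision  : {ke[0]:.3e} J")        # a single value scatters widely
    print(f"time average over all hits    : {ke.mean():.3e} J")    # settles near (3/2) k_B T
    print(f"relative spread of a single KE: {ke.std() / ke.mean():.2f}")   # roughly 0.8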
- I do. But one way or the other, this is an averaging process, and you must follow one particle for a long time, or a larger number for a shorter time, to get a good number, and even then it's never exact. It just gets closer and closer to the true mean as the particle number gets larger, or the time gets longer (the product of time and number increases). The velocity of any one particle at any one time tells you nothing. Once again, when you see T on one side of the equation, the velocity on the other is some kind of average. You can pick out one particle to follow, but without it impacting thousands of times on other particles in a distribution, it will not tell you the temperature. So you can't "pretend" that the others are not there, and that you're looking at the "temperature" of only the one. You're not. If you look at the "one" after 4000 impacts with others, you're actually looking at 4000 particles. Taking a poll. Doing a statistical sample. SBHarris 22:39, 26 March 2012 (UTC)
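The "how large a sample" question can be given a rough analytic answer. For a classical monatomic gas the kinetic energy of a single molecule has variance (3/2)(k_B T)², so the fractional uncertainty of the sample-mean kinetic energy, and hence of the inferred temperature, is about sqrt(2/3)/sqrt(N). A small sketch of the particle numbers this implies (the tolerances are arbitrary illustration values):
    # Sketch: molecules needed before the sample-mean KE (hence the inferred T) meets a chosen tolerance.
    import math

    rel_spread_single = math.sqrt(2.0 / 3.0)   # ~0.82: std/mean of one molecule's kinetic energy

    for tolerance in (0.10, 0.01, 0.001):
        n_needed = (rel_spread_single / tolerance) ** 2
        print(f"to pin the mean KE (hence T) to ~{tolerance:.1%}: need about N = {n_needed:,.0f} molecules")
A few thousand molecules for roughly percent-level accuracy is consistent with the molecular-beam measurement described earlier in this thread.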
- I suggest that your understanding of the matter has certain shortcomings. If you want to know the average temperature of a particle you can indeed take the average of its energy over a period of time and, if everything else is unchanging, you will get a number that is indeed the 'average' particle energy (temperature), with an accuracy that improves with time. But the average number is, more often than not, unimportant; what matters at the atomic/molecular level is the short-term value of the energy, e.g. the energy a particle has at the moment a collision, a chemical reaction, or perhaps a quantum interaction is happening; these are also important. It is quite plain, or should be, that if the particles have a uniform energy, e.g. a particle beam, some interactions simply will not take place because there are no particles with enough energy to make them happen. For example, these facts are quite essential to an understanding of quantum interactions. --Damorbel (talk) 05:57, 27 March 2012 (UTC)
- So? All this I understand and it's irrelevant. At many temperatures only a small fraction of particles have enough energy to make the transition to a chemical reaction, just as happens in nuclear fusion too, for that matter. This does not change the definition of "temperature", which requires a collection of particles and is a mean (average) property of a group. Look it up, okay? Tell me what your nearest thermo text says about it. [10] SBHarris 09:21, 27 March 2012 (UTC)
- You write:- "At many temperatures only a small fraction of particles have enough energy to make the transition" What do you mean by this? Why should 'only a small fraction.... etc.' only have enough... '. There is no requirement for this restriction at all. Nothing in what you write explains your (apparent) argument that a number (that you do not define) of particles are required before a temperature can be 'assigned'. How is it that this (unstated) number is important? I mean 1 mole of (pure) water freezes at the same temperature as 1 gram, or have I not understood you correctly? --Damorbel (talk) 10:23, 27 March 2012 (UTC)
- Oh dear, Sbharris, I have just read your reference above and it says: "Temperature is a measurement of the average kinetic energy of the molecules in an object or system and can be measured with a thermometer or a calorimeter. It is a means of determining the internal energy contained within the system." Nothing about equilibrium. What about an iron bar 'system' with 100 K at one end and 500 K at the other? Is it safe to pick it up with bare hands 'because its average temperature is 300 K'? And you know it is safe 'because you measured its temperature with a calorimeter'? Is this a 'reliable source', suitable for Wikipedia? --Damorbel (talk) 11:35, 27 March 2012 (UTC)
You began talking about chemical reactions, and the transition I was talking about is the one in the transition state theory of chemical reaction rates. See specifically collision theory, where the Maxwell-Boltzmann distribution is used to deduce the fraction of molecules at a given temperature that have the required kinetic energy to reach the activation energy and thus reach the transition state and go on to undergo the chemical reaction. Usually, it's just a tiny fraction of molecules in the "fast tail" of the M-B distribution.
Your statement above says nothing about equilibrium but obviously assumes it. An object with two temperatures will obviously have an internal heat flow, and this is an important subject in heat transfer and engineering. See heat equation.
Finally, for examples of how temperature in systems of small numbers of particles depends on the number of particles, even with total energy and average total energy fixed, see canonical ensemble. Also there is one example of a formula for small numbers of particles which depends on time-averages here: [11]. Note that the general temperature of the system only approaches what we normally think of as the temperature in the limit that the number of particles becomes "large." Finally, note that equations of the form 1/2 mv^2 = 3/2 k_B T all rely on v_rms, which means "root mean square." All your texts should have the same. SBHarris 16:11, 30 March 2012 (UTC)
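As a numerical aside on the collision-theory point above, the fraction of collisions energetic enough to cross an activation barrier E_a scales roughly as exp(-E_a/RT); the barrier value below is an assumed, fairly typical one, not taken from any source cited in this thread.
    # Sketch (standard collision-theory scaling): Boltzmann factor for crossing an activation barrier.
    import math

    R = 8.314          # J/(mol*K)
    E_a = 75_000.0     # J/mol, an assumed, fairly typical activation energy

    for T in (300.0, 350.0, 400.0):
        boltzmann_fraction = math.exp(-E_a / (R * T))
        print(f"T = {T:.0f} K: fraction of collisions with enough energy ~ {boltzmann_fraction:.1e}")
Even at 400 K the fraction is of order 1e-10, which is the "tiny fraction in the fast tail" referred to above.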
- When n = 1, v_rms = v; try it and see!--Damorbel (talk) 18:56, 30 March 2012 (UTC)
- Sorry, but the equation (1/2)m v_rms^2 = (3/2)k_B T isn't valid for one molecule; it's only valid for a huge number. Again, the velocities of molecules in a gas at any instant vary widely. Do you really think that every molecule in a bottle of gas, from fast to slow, has its own unique temperature? No. You simply have no concept of "temperature." The way you use the word is as ridiculous as imagining that every molecule in a bottle of gas has its own unique pressure. SBHarris 20:52, 30 March 2012 (UTC)
- "isn't valid for one molecule; it's only valid for a huge number". Saying a 'huge number' is not the least bit scientific, you mut have a logical argument why a 'huge number is required. May I suggest that, if you are using a mercury-in-glass thermometer, you will indeed need a 'huge number' of molecules to get a reliable temperature reading but a photon detector might not need so many, by definition of the Boltzmann constant only needs 1 degree of freedom. Oh yes, the 'k' in '3/2 kT' should be 'kB' - the Boltzmann constant. --Damorbel (talk) 06:50, 31 March 2012 (UTC)
- It may be helpful if I go a little deeper into this business of v_rms = v. To determine the average energy of a collection of particles (with different speeds), the procedure is to average their energies; this is exactly the same procedure as for a waveform having frequency components, where the powers in the separate components are evaluated. Normally what is wanted is the total power, so the powers in the different frequencies are just summed. For temperature, the sum of the energies in the different particles, each proportional to v^2, is divided by the number of particles; see RMS, where x is replaced by v. From this you can see why the RMS energy of a single particle exists and is no different from the mean particle energy, whatever number of particles is in the system. --Damorbel (talk) 17:28, 1 April 2012 (UTC)
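A tiny sketch of just the RMS arithmetic described above; it illustrates the averaging procedure only, not whether the resulting number is a temperature. The speed values are arbitrary.

```python
# Root-mean-square: square each value, average, take the square root.
import math

def v_rms(speeds):
    return math.sqrt(sum(v * v for v in speeds) / len(speeds))

print(v_rms([300.0, 400.0, 500.0]))  # ~408.2, pulled above the plain mean of 400
print(v_rms([400.0]))                # 400.0 -- for a single value, v_rms = |v|
```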
No, you miss the point. The reason the (1/2)m v_rms^2 = (3/2)k_B T equation isn't valid for one particle isn't because of problems with calculating the v_rms for one particle, but because of the problems in calculating temperature for one particle. Temperature must have a statistical distribution of energies, and is not valid for one particle, which never changes energies. Furthermore, following one particle as it changes energies over time in a system is essentially sampling a large number of particles of different energies in the thermal system that it impacts with, and thus (again) you are (indirectly) measuring a thermal distribution, by sampling it. The variance of such a set of samples tells you something you need to know to even DEFINE temperature. Why? Because for any temperature above absolute zero you must have an entropy change associated with the thermal energy change for the temperature, and you cannot do that with one particle, which (in isolation) has an entropy of zero, no matter how fast the particle is moving. S = k_B ln(1) = 0. Thus, you can increase the energy of one particle and its entropy does not change, and is still zero, so dQ/dS = dE/dS = T is undefined, since dS = 0. Which makes your proposed definition of T as the KE of one particle untenable. You disagree with thermodynamic definitions of T.
IOW, the "logical argument" why a huge number of particles is required" (at a thermal distribution of different energies) to define a temperature to some narrow value, is that the laws of thermodynamics don't work unless temperature is defined in terms of heat input (or thermal energy content change) per unit of entropy-change, and entropy requires disorder, and disorder requires a system of particles, not one particle.
For an example of the converse, consider my example of a perfect crystal cooled to absolute zero, then made somehow to travel at the v_rms speed of an air molecule (you could accelerate the entire thing, or merely change reference frames). Does it make sense to now speak of the "temperature" of all these fast atoms, since each is now moving at the (exact same) high velocity? You say yes. But the correct answer is no, for this makes hash of all the thermodynamic equations that involve temperature, entropy and energy. If such an ensemble has a "temperature" then none of the laws of thermodynamics regarding temperature and work-cycle efficiency are valid. Since such a crystal has zero entropy, the kinetic energy of such a crystal, like the kinetic energy of a single atom in flight, can be converted to useful work with 100% efficiency. In a Carnot cycle the temperature of such a mass therefore doesn't look like room temperature; it looks like the absolute zero that it (in fact) is. Thus, your proposed definition of temperature is WRONG and nobody agrees with you. Sorry, but you're just being ignorant of the basics of Thermo 101 here, and I would ask you to stop editing this article until you educate yourself. SBHarris 17:22, 2 April 2012 (UTC)
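A rough numerical version of the boosted-crystal example, assuming (arbitrarily) a copper crystal boosted to the rms speed of air at 300 K: the ordered bulk kinetic energy per atom is comparable to (3/2)k_B*T at room temperature, yet contributes nothing to temperature because it carries no entropy.

```python
# Per-atom kinetic energy of a zero-entropy crystal boosted to the rms speed of
# an air molecule, compared with the mean thermal energy (3/2)*kB*T at 300 K.
import math

kB = 1.380649e-23     # J/K
u = 1.6605e-27        # kg per atomic mass unit

m_air = 28.97 * u                              # mean mass of an air molecule
v_rms_air = math.sqrt(3 * kB * 300 / m_air)    # ~500 m/s at 300 K

m_cu = 63.5 * u                                # copper atom (illustrative choice)
ke_bulk = 0.5 * m_cu * v_rms_air ** 2          # ordered, zero-entropy motion
ke_thermal = 1.5 * kB * 300                    # disordered, thermal motion

print(f"rms speed of air at 300 K : {v_rms_air:.0f} m/s")
print(f"bulk KE per Cu atom       : {ke_bulk:.2e} J")
print(f"(3/2) kB T at 300 K       : {ke_thermal:.2e} J")
```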
- Sbharris, you write about "the problems in calculating temperature for one particle". Then do you not accept the reasoning behind the Boltzmann constant k_B? The connection it makes is between energy and temperature; if you read about the Boltzmann constant you will find that it is about just one particle, just like Reif's formula (1/2)mv^2 = (3/2)k_B T. It would seem that you take no account of the Boltzmann constant, since you insist, without reason or scientific argument, that "Temperature must have a statistical distribution of energies." Can you give any support to this statement? For example, why does a particle "which never changes energies" (energy?) not have a temperature? No one is denying that particles in a gas are constantly changing energy through collisions but, according to Reif's formula, it is the energy, not the collisions, that makes the temperature; Reif's formula has nothing about collisions in it.
- I suggest you check out these links: the Relationship of Entropy to Temperature, Temperature and Second Law: Entropy. The Boltzmann constant and entropy both have the same dimensions, Q/T, the difference being that entropy can be used as a measure of disequilibrium, e.g. a system with more than one temperature, whereas temperature is unique: a system can only have one temperature, and that is when it is in equilibrium (a single-particle 'system' is always in equilibrium).
- You make a point about particles moving in different frames of reference. My initial response is that such a system is, by definition, not in equilibrium so it is not possible to assign a temperature to it. --Damorbel (talk) 17:36, 3 April 2012 (UTC)
- Nope. You can take a system in equilibrium by all definitions and to the satisfaction of everybody, then look at it from another reference frame (as a moving observer would). Per Einstein, the same laws of physics must apply to it. Unfortunately, kinetic energy and temperature are observer-dependent. That is okay, since the operations we really care about in thermodynamics involve ratios of temperatures and energies, which are observer-independent. If entropy were observer-dependent, then the laws of thermodynamics would collapse, as they would not be Lorentz invariant, as all physical law must be. Fortunately, this is not the case. The kinetic energy of atoms in a moving perfect crystal is a good example of why temperature cannot be simply "kinetic energy in one direction," and you should think about it (you haven't). This also explains why temperature of single particles makes no sense, as single particles can only travel in one direction. I can extract 100% of the kinetic energy of a single particle moving in one direction, and turn it into work (I can simply let it strike a piston). If that energy has a "temperature" then the laws of thermodynamics are wrong, since these limit the work I can get by cooling/slowing anything with a "temperature." You pick me a single particle and I can find a reference frame for which it has no kinetic energy and thus a temperature of zero in your system. I can also find a reference frame or observer for which a collection of atoms in a crystal with zero entropy in its own rest frame has a temperature as large as you like, which makes no thermodynamic sense, as its entropy is the same in all frames. Thus, this notion is wrong, and you are wrong. Your references from hyperphysics above actually don't say what you think they do, and you should look at them again. Your Temperature definition says: A convenient operational definition of temperature is that it is a measure of the average translational kinetic energy associated with the disordered microscopic motion of atoms and molecules. Read that three times. Or you could look in the WP article on thermodynamic temperature, which gives the meaning of your equation with the Boltzmann constant: The thermodynamic temperature of any bulk quantity of a substance (a statistically significant quantity of particles) is directly proportional to the mean average kinetic energy of a specific kind of particle motion known as translational motion. Read that again, too. SBHarris 18:05, 3 April 2012 (UTC)
- I still wish to know how many particles make a 'huge' number.
- In chemical reactions only the particles that are hot enough, i.e. have enough energy, can initiate a given reaction. A good example is the ozone layer in the stratosphere. The ozone layer arises because O2 is split by photons with an energy >4.4 eV (λ < 280 nm); these figures are checked numerically in the sketch after this post. Now only one photon is needed to split an O2 molecule, and it comes from a source with a temperature of 5780 K (the Sun). But the Sun produces a vast number of photons with energy >4.4 eV. Is it your argument that these photons (the single ones that split molecules) don't have a temperature?
- Um er, please don't forget to say what a huge number is.
- Oh, and one more thing: these photons are all coming from one direction, so they are not random; they have net momentum (away from the Sun), but somehow they warm the Earth. --Damorbel (talk) 20:39, 3 April 2012 (UTC)
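A quick numerical check of the photolysis threshold quoted above, using E = hc/λ; the set of wavelengths is arbitrary.

```python
# Photon energy versus wavelength: E = h*c/lambda, converted to electronvolts.
h = 6.626e-34    # Planck constant, J s
c = 2.998e8      # speed of light, m/s
eV = 1.602e-19   # J per electronvolt

for lam_nm in (240, 280, 320):
    E = h * c / (lam_nm * 1e-9) / eV
    print(f"lambda = {lam_nm} nm -> E = {E:.2f} eV")
```

A 280 nm photon indeed comes out at about 4.4 eV.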
That is correct. Individual photons do not have a temperature. An energy, yes; a temperature, no. The photons from the Sun have an equivalent temperature only because they come with a spread-out blackbody spectrum at many frequencies, so we can assign them a temperature of 5780 K and, after thermalization, they act thermodynamically like heat at that temperature. If they all came with the same frequency, they also would have no temperature. A laser beam has no temperature and no entropy.
Unlike sunlight, a laser beam could be converted into work with 100% efficiency, even without an entropy dump. Sunlight cannot. Because of entropy, the efficiency of conversion of sunlight to usable energy on Earth is never better than (5780-300)/5780 ≈ 95%. That's far better than solar cells or plants actually do, so we don't notice the thermodynamic limitation, but it's there all the same. Since the entropy of the sunlight energy has to go somewhere when you convert it to work, at least 5% of it must be wasted, making heat at 300 K (Earth temperature). All this doesn't happen with laser beams or monochromatic light. If we were able to dump heat into space at 2.7 K our efficiency could go up to (5780-2.7)/5780 ≈ 99.95%. But still not 100%. Once sunlight has been thermalized on the Sun, you can't ever get all its energy back. Some of it always has to stay as thermal energy (randomized energy) at some lower temperature. That's the second law of thermodynamics. The second law is the whole idea behind temperature. It's not only kinetic energy that defines temperature, but (much more importantly) the fact that the energy has been randomized in direction and amount. Once spread out in phase space, it cannot be put back. SBHarris 21:27, 3 April 2012 (UTC)
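The two efficiency figures above follow from the Carnot limit, eta = 1 - Tc/Th; a minimal check:

```python
# Carnot limit for converting thermalized sunlight to work, eta = 1 - Tc/Th.
Th = 5780.0                  # effective solar temperature, K
for Tc in (300.0, 2.7):      # Earth-temperature sink, then the 2.7 K background
    eta = 1.0 - Tc / Th
    print(f"Tc = {Tc:6.1f} K: maximum efficiency = {eta * 100:.2f}%")
```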
- I have an idea. Why not absorb each photon separately? That way, your theoretical maximum efficiency would be greater than 95% or even 99.95%. "Entropy" as they teach in thermodynamics doesn't really apply to quantum systems. "Temperature" is but a statistical perversion of peak frequency (though they are somewhat proportional), and "entropy" is but a statistical perversion of angular momentum (and dare I say that they are somewhat proportional as well)! The photon has its intrinsic angular momentum preserved. There is no "entropy" in a temperature sense occurring outside the carriers of force. Any entropy between two photons is "non-thermodynamic". This means that the entropy of photons cannot be subject to increases the way that mass-bearing elementary particles are. Thermodynamics is a science of a lesser kind than particle physics - a mere approximation. siNkarma86—Expert Sectioneer of Wikipedia
86 = 19+9+14 + karma = 19+9+14 + talk 00:32, 4 April 2012 (UTC) - Don't tell any particle physicists that thermodynamics is a science of a lesser kind-- they'll just know by that, that you haven't looked into the marriage of the two. It turns out that thermodynamics and its laws survive both relativity and quantization (not surprisingly, as light quanta were originally invented to explain heat and black body radiation). In QM, the standard Gibbs-Boltzmann entropy becomes the von Neumann entropy, and loss of information about a system becomes associated with wavefunction collapse (though the information remains, it becomes more spread out in phase space, which itself is quantized in unit "voxels" of h^3). In fact, if you look at Weyl quantization you can get a hint of the idea that any quantum mechanical system can be represented as well by a distribution in phase space as by a vector in Hilbert space. The math here is well beyond me, but better men than I have tried to find a quantum mechanical way to have some kind of quantum version of Maxwell's demon, and utterly failed. SBHarris 02:29, 4 April 2012 (UTC)
- Information entropy#Relationship to thermodynamic entropy states: "Maxwell's demon can (hypothetically) reduce the thermodynamic entropy of a system by using information about the states of individual molecules; but, as Landauer (from 1961) and co-workers have shown, to function the demon himself must increase thermodynamic entropy in the process, by at least the amount of Shannon information he proposes to first acquire and store; and so the total thermodynamic entropy does not decrease (which resolves the paradox)." I see at least one problem with this conclusion. Here, they assume that "new entropy" must exist in order to make the "Maxwell demon" viable. But all it really seems to prove is that "further entropy" of the required amount must exist. That entropy may already exist in a level of matter unavailable to our current technology, hidden inside the endless depths of matter, beyond the nanoscale, the picoscale, the femtoscale, etc. In fact, something tells me that what we see with information entropy is simply the dispersal of information to where it exists at the quantum level and above, where it becomes observable. There may be an unbounded set of entropies within matter. It appears that as long as entropy from the boundless small remains, the system may be reversible.
- It might not only be the boundless small that could provide such reversibility. What about the boundless large? A photon by itself, without matter or other electromagnetic fields to interact with, cannot diffuse easily. It should be possible to concentrate photons or diverge them at will using reflective or refractive materials, either reclaiming or losing energy, so long as the effective temperature of the background radiation is low enough to permit non-scattering of photons at certain frequencies. For example, radio waves will not scatter so easily in the vacuum of space because the effective temperature is 3 K. Is it really impossible that this energy could somehow be reclaimed by celestial objects? For sure, large structures would have to exist to reclaim it, much larger in area (with a breadth of hundreds of millions of light years each), and probably darker than what may be presently observed. I suppose that a universal structure having a fractal dimension of 2 would suffice to reclaim all those losses, albeit after such losses travel immense cosmological distances, as it would go hand in hand with the need for ever-larger structures that may reclaim whatever leaks from a system. I cannot imagine that such a thing is possible unless the cosmological principle is a flawed assumption. Perhaps the Hubble Deep Field is actually a glimpse into such areas, presently interpreted as the beginnings of the universe, which may actually instead be the outer periphery of dense clusters of matter and light which focus light toward themselves, causing the light to converge rather than diverge, as if it were "time-reversed", creating dense energy areas where light converts back into massive form once again. siNkarma86—Expert Sectioneer of Wikipedia
86 = 19+9+14 + karma = 19+9+14 + talk 21:53, 4 April 2012 (UTC) - The Bekenstein bound has some relevance to my statements: "Bekenstein derived the bound from heuristic arguments involving black holes. If a system exists that violates the bound, i.e. by having too much entropy, Bekenstein argued that it would be possible to violate the second law of thermodynamics by lowering it into a black hole." Also, at Black hole information paradox#Main approaches to the solution of the paradox:
"Information is stored in a Planck-sized remnant:"
- siNkarma86—Expert Sectioneer of Wikipedia
86 = 19+9+14 + karma = 19+9+14 + talk 22:52, 4 April 2012 (UTC) - You say, Sbharris, that "Individual photons do not have a temperature." I repeat my question: how many photons (or energetic massive particles) do you require to establish a temperature? "After thermalization" does not answer it. How many is the question: is it 2? Is it 10^2? Is it 10^22? Is it 10^222? You must be able to say how many.
- Your problem seems to me to be a confusion between entropy and temperature. Try this: a thermal system may comprise a number of subsystems, each with its own temperature; the entropy of such a system is less than the maximum if the temperatures of the individual subsystems are not the same; the entropy of each of the subsystems is at a maximum, otherwise it would not be possible to assign a temperature to it.
- In a similar system where the subsystems are able to exchange energy by random collision - I am thinking of a gas where the subsystems are composite molecules such as hydrocarbons - the molecules will not have the same (individual) energy (temperature) but, if they are in equilibrium, their energies (temperatures) will have a distribution according to the Maxwell-Boltzmann distribution; the system entropy is different from (higher than) that of the equal-temperature condition; in such an (equilibrium) system a temperature can be assigned to the whole system, even though the individual subsystems have different temperatures.
- In such a system the smallest subsystems possible are the individual particles; there is not, according to kinetic theory, any limitation on the maximum size of the particles; they may well be aggregates of other particles, such as pollen grains. The only requirement to be a particle is that all its parts move together during a collision with another particle.
- For the system to have maximum entropy the subsystems need to have equal temperatures, yes or no? This would be the case when the subsystems are not exchanging momentum through collisions.
- If the particles are exchanging momentum through collisions, a time element is introduced: because of the collisions the energy of the particles takes on a variable character with regard not only to the individual particle but also to time; this is called an ergodic process. Because the particles are exchanging energy by this ergodic process it is possible to extend the concept of single-particle temperature (energy) to any number of particles, provided the ergodic process has reached an end state, a state that is also called thermal equilibrium.
- From this you should be able to see that a collection of particles (or photons) with the same energy also have the same temperature, but their entropy is different from (less than) that of a similar number of particles exchanging energy by (random) collision. This is the situation with your laser; a laser produces photons with a very small (effectively zero) energy spread (standard deviation) - i.e. they all have the same energy! (Yes, I know photons don't collide, but in this case they are produced by stimulated emission, which is why the energies are all the same.) If the laser photons had an energy of 4.4 eV all of them would be able to split O2 into 2 O and thus make O3. This might appear to be a 100% efficient process, but don't forget that it is non-thermal and no laser can be 100% efficient. --Damorbel (talk) 09:52, 4 April 2012 (UTC)
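A minimal sketch contrasting the two cases discussed above: a "monochromatic" set of particle energies and a thermal (Maxwell-Boltzmann) set with the same mean energy. The 300 K value and the sample size are arbitrary assumptions.

```python
# Same mean energy, very different spread: monochromatic versus thermal.
import numpy as np

kB = 1.380649e-23
T = 300.0
N = 100_000
rng = np.random.default_rng(1)

# Thermal kinetic energies: (1/2) kB T * (sum of three squared standard normals)
E_thermal = 0.5 * kB * T * (rng.normal(size=(N, 3)) ** 2).sum(axis=1)
E_mono = np.full(N, 1.5 * kB * T)    # every particle exactly at the mean

for name, E in (("thermal", E_thermal), ("monochromatic", E_mono)):
    print(f"{name:13s}: mean = {E.mean():.2e} J, std = {E.std():.2e} J")
```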
I'm sorry, but you simply cannot claim that two collections of particles with the same average kinetic energy but different energy distributions (and thus different entropies) have the same temperature. I already gave you an example of a perfect crystal at absolute zero which you can accelerate in any direction, giving it a lot of kinetic energy and all of its atoms a large velocity (or you can do this by looking at it from a different reference frame), but its temperature will remain zero because its entropy remains zero. All that kinetic energy of all those atoms does not count toward temperature because it hasn't been thermalized. It's a little harder to see why one cannot speak of the temperature of a box of gas molecules flying in random directions, all with exactly the same energy, but the basic reason is that this is not a system in equilibrium, yet. As it equilibrates to a Maxwell-Boltzmann energy distribution (thermalizes), its entropy will increase also (although this time it doesn't start from zero). But the molecules enter more states in phase space. This is a bit like allowing an already-thermalized gas to expand into a larger volume without doing work. Here, energy (and this time also temperature) stay the same but entropy also increases, as the system reaches equilibrium by expanding in phase space in another way.
The answer to your question of how many particles it takes has already been answered: more than one. You can calculate a "temperature" for a system of two molecules so long as they are not traveling in the exact same direction and they have had a chance to thermally equilibrate by hitting or interacting with each other enough times (just as you can calculate an invariant mass (or system mass) for two or more photons, as long as they are not moving in the same direction). Now, the temperature you get from this will not be a very good one. It will be a number like any sample average, with a very wide confidence limit. Temperatures are sample averages (kinetic energy averages). How good the average is (what its confidence limits are) depends on your sample size. If you sample two people on their yearly income, you get a crappy answer for the country, but at least it's a real answer for "average income." One person will not give you an average income at a given time. If that one person moves around the country and works at 4000 widely different jobs like Mike Roe, however, you can take his mean salary and start to get a better number for the average income of people who have jobs. And if you sample 4000 people randomly around the country you can get the average income to within a percent of the number you would get from every person. Temperature usually involves systems with such a huge number of particles that we forget the confidence limits on the number, because they are so small. But they do exist, and they show up with small numbers of particles. Okay? SBHarris 18:41, 4 April 2012 (UTC)
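A sketch of the "confidence limit" point: the kinetic-energy-average estimate of T is repeated many times for each sample size, and its scatter shrinks roughly as 1/sqrt(N). Units are chosen so that m = k_B = 1 and the true T = 1; the trial counts are arbitrary.

```python
# Scatter of the temperature estimate versus sample size N.
import numpy as np

rng = np.random.default_rng(2)
trials = 1000

for N in (2, 10, 100, 1000):
    v = rng.normal(size=(trials, N, 3))               # velocity components
    T_est = (v ** 2).sum(axis=2).mean(axis=1) / 3.0   # per-trial estimate of T
    print(f"N = {N:4d}: mean T_est = {T_est.mean():.3f}, "
          f"scatter (std) = {T_est.std():.3f}")
```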
- "...cannot claim that two collections of particles with the same average kinetic energy but different energy distributions..." There are relatively few equilibrium (maximum entropy distributions) but a truly vast number of distributions with the same 'average energy' as a maximum entropy distribution; think of half the particles at 0K and half with 2x average, or a quarter with 4 x the average energy; the list, even in the grossly oversimplified example, is endless.
- No. The number of states of a system of finite volume is finite. This is because the states are quantized, so there is a countable number of them, because a system of finite volume can only have a finite number of modes (energies) for each particle. For example, a hydrogen atom has only one microstate at its ground-state energy. Entropy is S = k ln W, where W is the number of microstates, so you can see that since ln(1) = 0, a ground-state hydrogen atom has no entropy and no temperature. This, despite the fact that the electron has some kinetic energy, which proves your thesis wrong immediately. Presto. One particle in a box of any finite size, with any given total energy in the box, has only one quantum state (I presume no external field to couple to spin, etc., just as with hydrogen). Any other state requires a different energy. If an electron is stuck in a box at other than ground-state energy it can always go to ground and emit the energy as a photon, but now you have TWO particles in the box -- electron and photon. Now they can trade off energies in a two-state system, and now you can have an entropy S = k ln 2 and even a temperature. But it doesn't work for just one particle (at one volume and one system energy). And I'm sorry I misled you about two particles -- this is not true unless they are confined (which is usually a given). With an infinite volume you have an infinite number of states, since there is no quantization. How much volume is in the universe is a sticky question, but fortunately not one we need to consider here. Our particles on Earth are confined effectively to definite volumes, like bottles or planetary atmospheres. And before you tell me that the numbers at those volumes are so huge that they may as well be infinite, look at that log term. That cuts large numbers of microstates way down to size, especially considering that you then multiply by Boltzmann's k, which is a very small number (R/N_A, where N_A is Avogadro's number). It all actually does come out to the entropy S numbers that we measure. Also, T does come out to be dQ/dS. However, T is also not defined except in systems of more than one microstate, and for that you need two particles at any given total energy and any given total volume. As noted. Again, one particle cannot switch between two energy states (quantum states) in one system without putting its excess energy into another particle. Two-state systems of one particle where a temperature is calculated (these exist in one of the cites above) are open systems where energy is allowed to escape or enter.
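The size of that log term can be seen directly: W = 1 gives exactly zero entropy, and even astronomically many microstates give only a modest S once the logarithm and the tiny k_B are applied. The exponents chosen below are arbitrary illustrations.

```python
# Boltzmann entropy S = kB * ln(W) for one microstate and for huge W.
import math

kB = 1.380649e-23  # J/K

print(f"W = 1     : S = {kB * math.log(1):.3e} J/K")   # exactly zero
for exp in (1, 23, 100):
    S = kB * exp * math.log(10)                         # S = kB * ln(10**exp)
    print(f"W = 10^{exp:<3}: S = {S:.3e} J/K")
```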
- "...does not count toward temperature because it hasn't been thermalized..." What limit do you place on system size by this requirement for thermalisation? This is exactly the same question I am still expecting to be answered; how many particles must your system have before you accept it can have a temperature? is it 2? Is it 102? Is it 1022? Is it 10222? You must be able to say how many. Thermalisation means that all the particles have an energy distribution with Maxwell-Boltzmann statistics, only then can you give the system of particles a temperature, that is the meaning of thermalisation.
- The Maxwell-Boltzmann distribution works with any number of states larger than 1. You need an energy difference E between states, and then you have an A exp(-E/kT) distribution of probabilities, and you can calculate. No energy difference, no distribution.
- "The answer to your question of how many particles does it take, has already been answered: more than one." Do you have a link for 2 particles? Don't forget these particles are exchanging energy (or not) through collisions with each other or container walls. I am having difficulty with understanding what the difference is between 1 particle and 1 + 1 particle.
- Here is your link. Educate yourself. [12]. Remember you cannot have more than one microstate with only one particle in one volume at one energy.
- "One person will not give you an average income at a given time" Why not? What you will not be able to calculate is a standard deviation which requires a difference between two particles but an average has a rather useless validity, even with only one particle; the energy value it gives is exactly the same as for multiple particles.
- That is why temperature cannot be calculated only from a total energy. You also need the number of microstates and the ways that energy can be partitioned and distributed. With only one particle you're stuck with only one way, and thus no temperature. With two particles you can move the energy around among as many quantum modes (many of which count as their own microstate if the particles are distinguishable) as your volume will accommodate for the total kinetic energy that these two particles can trade off to each other, at the same total energy. The link gives a thorough and detailed summary of how the counting of microstates is done.
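A toy version of that counting argument, using the standard formula for distributing q energy quanta among N distinguishable particles, W = C(q + N - 1, q) (Einstein-solid-style counting, brought in here only as an illustration; the choice of q = 10 quanta is arbitrary). One particle holding a fixed total energy has exactly one way, so S = 0.

```python
# Ways to share q quanta of energy among N distinguishable particles.
from math import comb, log

kB = 1.380649e-23
q = 10  # total energy in quanta (illustrative)

for N in (1, 2, 3, 10):
    W = comb(q + N - 1, q)
    S = kB * log(W)
    print(f"N = {N:2d}: W = {W:6d}, S = {S:.2e} J/K")
```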
- "Temperature usually involves systems with such a huge number of particles that we forget the confidence limits on the number...". Not in molecular interactions it doesn't. Molecular interactions are exactly dependent on the individual particle temperatures at the time of the collision or any other process.--Damorbel (talk) 21:15, 4 April 2012 (UTC)
- No, they are dependent on the individual particle energies. Individual particles don't have temperatures. Individual pairs of electrons in an atom don't each have their own temperature because each pair in an orbital has a different mean velocity from other pairs. That is madness. SBHarris 00:19, 5 April 2012 (UTC)
- I've checked your link and, yes, it discusses the population of different energy levels in a system of multiple particles. But I am at a loss to understand your argument that the position of a particle in a crystal renders it distinguishable whereas the position of a particle moving freely in space does not. The reason a particle in a crystal is fixed is that it is held there by forces from other particles; does this mean it has a temperature and the freely moving particle doesn't? This seems to deny the Reif formula (1/2)mv^2 = (3/2)k_B T. Let us assume that the fixed particle and the freely moving one have the same energy; are you saying they don't have the same temperature, i.e. that the temperature of a particle depends not on its energy but on where it is located? --Damorbel (talk) 06:46, 5 April 2012 (UTC)