Wikipedia:Reference desk/Archives/Science/2012 December 9
Welcome to the Wikipedia Science Reference Desk Archives. The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.
December 9
Was refrigeration technology advanced enough to preserve ice in July in Hittin during the 12th century?
In the article Battle of Hattin there is a sentence: "The exhausted captives were brought to Saladin's tent, where Guy was given a goblet of iced water as a sign of Saladin's generosity." I checked Tiberias#Geography and climate, which was near the location of the battle, and found that only on extreme occasions did the temperature there reach zero. Articles about refrigeration such as Refrigeration and Timeline of low-temperature technology are almost devoid of information about that period (there are some remnants of earlier ice houses, per Ice house, in China and Rome, but I am not sure whether that helps with this question). So was ice even possible at that time and in that location?--Inspector (talk) 08:01, 9 December 2012 (UTC)
- Our article on Evaporative cooler doesn't mention the production of ice, but I believe it's possible at night in a dry atmosphere. In my student days when I couldn't afford a refrigerator, I kept milk cool by this method, but the humidity was too high to make ice. In some areas of desert, it is possible to produce ice at night purely by radiative cooling, using an ice pit insulated from the surrounding warm sand with straw, and the technique was developed in ancient times, with the resulting ice being stored in ice houses. Dbfirs 08:36, 9 December 2012 (UTC)
- I don't know if radiative cooling is really a feasible way of producing ice. The linked article has some doubts about it. OsmanRF34 (talk) 13:51, 9 December 2012 (UTC)
- I think it's possible, but only in certain conditions, especially including a clear dry atmosphere above the ice pit. The discussion seems to conclude that it doesn't violate any laws. Dbfirs 20:10, 9 December 2012 (UTC)
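A rough back-of-envelope sketch of the radiative-cooling figures (illustrative only; the emissivity and effective clear-sky temperature below are assumed values, not from the discussion above):

```python
# Net radiative cooling of a water surface under a clear, dry night sky.
# Assumed figures: emissivity 0.95, water at 273 K, effective sky
# temperature 230 K. Convective heating from the surrounding air is
# ignored, which is why this only works on calm, very dry nights.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def net_radiative_loss(t_surface_k, t_sky_k, emissivity=0.95):
    """Net radiative flux from surface to sky, in W/m^2."""
    return emissivity * SIGMA * (t_surface_k**4 - t_sky_k**4)

q = net_radiative_loss(273.0, 230.0)               # roughly 150 W/m^2
LATENT_HEAT_FUSION = 334e3                         # J/kg to freeze water
kg_per_night = q * 8 * 3600 / LATENT_HEAT_FUSION   # over an 8-hour night
print(f"{q:.0f} W/m^2 -> at most {kg_per_night:.0f} kg of ice per m^2 per night")
```

This is only an upper bound (convection eats most of it in practice), but it shows why an insulated pit under a clear desert sky can plausibly freeze a layer of water overnight.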
- It's also possible that they used salt to lower the temperature. 24.23.196.85 (talk) 00:07, 10 December 2012 (UTC)
- A couple of sources[1][2] state he had ice brought from the mountains. Clarityfiend (talk) 08:40, 9 December 2012 (UTC)
- Ice can either be brought frozen over the mountains from pits dug for the purpose, or simply be brought down from the mountains. μηδείς (talk) 04:26, 10 December 2012 (UTC)
Subconscious phrases in mind
Sometimes, but not often, I experience short phrases that pop up while I rest or sleep. It's not like I hear them; they just flare up in my mind spontaneously, without my intent. The phrases are like "Want to take a photo?", "No, thanks", etc. They appear just before I fall asleep, when I'm not dreaming. I think it's related to brain rest, but I'm not sure. Is there a name for such a phenomenon?--93.174.25.12 (talk) 10:42, 9 December 2012 (UTC)
- In the Sleep article, it mentions a "non-REM stage 1", which might be what you're experiencing. When you're not quite awake and not quite asleep, things can seem "real" which aren't real. I've had this happen from time to time... as have many others, as this is presented as an explanation for some folks who think their bedroom is "haunted", as they feel as if they're being stifled by some spook... but that only happens when they're in that in-between stage, and it feels "real". Seems to me there's a more specific term than "non-REM stage 1", but it's not popping right now. P.S. If this is bothersome to you, as opposed to merely interesting, you should consider seeing a professional. (If nothing else, they could probably give you better terminology.) ←Baseball Bugs What's up, Doc? carrots→ 12:43, 9 December 2012 (UTC)
- Bugs, I think it's part of sleep paralysis. --TammyMoet (talk) 13:58, 9 December 2012 (UTC)
- No, this is known as "hypnagogic imagery" -- see Hypnagogia#Sounds. "Hypnagogia" is the formal term for the process of falling asleep. Looie496 (talk) 15:54, 9 December 2012 (UTC)
- Very good. I don't know if either term is what I saw on TV, but sleep paralysis explains the spooky stuff, and hypnagogic imagery explains my experience perfectly... and it sounds like it would explain the OP's as well. ←Baseball Bugs What's up, Doc? carrots→ 17:13, 9 December 2012 (UTC)
- I believe I saw that as well, many years ago, on one of those "unsolved mysteries"-type shows. I don't recall the details of the show, but it led me to a book called The Terror That Comes in the Night, which was quite interesting, though it looked at the phenomenon through the paradigm of folklore study. By coincidence, a few years later my young daughter began experiencing night terrors, which is a closely related phenomenon; she never saw any hags though. Matt Deres (talk) 20:55, 9 December 2012 (UTC)
- As someone whose workplace was destroyed on 9/11, and who has been treated for night terrors, with good reason, I can assure you that mere vocalizations as one falls asleep do not count as night terrors. 04:24, 10 December 2012 (UTC)
- It's not a contest; night terrors often have no trigger in the way that nightmares (for example) often do; the experience itself is the terror as the half-asleep brain attempts to interpret the paralysis it suddenly "finds" itself in. Night terrors are not necessarily worse or easier to live with than nightmares; they're something else entirely. Matt Deres (talk) 01:42, 11 December 2012 (UTC)
Humidifiers and temperature
The manual of my (ultrasonic) humidifier states that increasing the humidity with it will increase the felt air temperature and effectively reduce heating costs. Two questions: 1) Does higher humidity always mean higher felt air temperature or does it depend on temperature as well? At least to me, for sub-zero (Celsius) temperatures, a dry climate is much more pleasant than a wet one. 2) All costs included, can a humidifier really lower total expenses? bamse (talk) 19:53, 9 December 2012 (UTC)
- Higher humidity will slow evaporation from your skin, and it increases the heat capacity of the air; this can make you feel warmer in hot conditions and cooler in cold ones, depending also on the air flow and other factors. See heat index for the combined effect of heat and humidity. μηδείς (talk) 20:05, 9 December 2012 (UTC)
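For the hot-and-humid side of that combination, here is a sketch using the NWS Rothfusz regression behind the usual heat index tables (the coefficients are the published ones; the regression is only meaningful for roughly T ≥ 80 °F and RH ≥ 40%, so it does not cover the winter indoor case asked about, where slower skin evaporation is the main effect):

```python
def heat_index_f(t_f, rh):
    """Apparent temperature (deg F) from the NWS Rothfusz regression.
    t_f: air temperature in deg F; rh: relative humidity in percent.
    Valid only for roughly t_f >= 80 and rh >= 40."""
    return (-42.379 + 2.04901523 * t_f + 10.14333127 * rh
            - 0.22475541 * t_f * rh - 6.83783e-3 * t_f**2
            - 5.481717e-2 * rh**2 + 1.22874e-3 * t_f**2 * rh
            + 8.5282e-4 * t_f * rh**2 - 1.99e-6 * t_f**2 * rh**2)

print(round(heat_index_f(90, 40)))  # ~91 F
print(round(heat_index_f(90, 80)))  # ~113 F: same air, feels far hotter when humid
```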
- Thanks. Any ideas on the second question? bamse (talk) 19:14, 10 December 2012 (UTC)
- There are many factors that go into the second calculation:
- 1) What type of humidifier is it ?
- a) You said yours is ultrasonic. The energy usage for that should be low; however, you need to use distilled water, since it will otherwise aerosolize all the minerals in the water and coat the area with dust. So the question comes up as to how much you pay for your distilled water. If you buy it, that will get expensive fast. If you distill it yourself, then the energy to do that must be considered. If you just use tap water and ignore the dust, the cost will be lowest.
- b) An electric evaporation humidifier will cost a lot, too, because electricity is expensive. Tap water can be used, but will create mineral deposits in the humidifier (which is much better than on your walls).
- c) A natural gas humidifier will be less expensive. I essentially do this by having a huge stock pot on the stove with a tiny flame always under it. This has the advantage that moisture is also an exhaust product of combustion, so you get moisture from two sources. As above, tap water mineral deposits go into the pot, so don't plan on using it for food again. Use a huge pot so you can leave it on overnight or while at work without danger of it running dry (mine lasts about 2 days). Another approach is always having a big pot of stew on the stove in winter.
- d) An integrated humidifier which works with the furnace, using air already heated there, is probably best of all. This also has the advantage of distributing the moisture throughout the house. (With any single source of humidity, you will see higher humidity in that room, possibly with condensation at the windows leading to mold, and lower humidity elsewhere.)
- 2) What type of heat does your home have ?
- a) Natural gas heating is the least expensive (unless you count heat pumps, in the temperature range where they work). However, the water vapor generated by combustion goes up the chimney, along with some of the heat.
- b) Electrical is the most expensive, except that some forms can be set up for zone heating, so you only heat the areas you want.
- 3) How well insulated is your home for heat and humidity ? A poorly insulated house will lose heat and humidity through the walls, and water will also condense on the windows, and maybe walls, causing mold.
- So it's probably impossible for us to judge which is cheaper in your home; a rough energy estimate is sketched below. However, humidity should be kept up for other reasons, like avoiding dry skin, calluses, chapped lips, cracked wooden furniture, etc. StuRat (talk) 19:40, 10 December 2012 (UTC)
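To put rough numbers on the energy side of that estimate (a sketch with assumed, illustrative energy prices; actual rates vary widely): evaporating water takes about 2.26 MJ/kg no matter which device does it. Note that an ultrasonic unit does not pay this cost electrically; the latent heat is drawn from the room air, so the heating system ends up supplying it anyway.

```python
# Energy cost to put 1 kg of water vapor into the air, under assumed prices.
LATENT_HEAT = 2.26e6   # J/kg, latent heat of vaporization of water
J_PER_KWH = 3.6e6      # joules per kilowatt-hour

def cost_per_kg(price_per_kwh, efficiency=1.0):
    """Cost of evaporating 1 kg of water at a given energy price."""
    return LATENT_HEAT / (J_PER_KWH * efficiency) * price_per_kwh

print(f"electric resistance: ${cost_per_kg(0.15):.3f}/kg")       # assumed $0.15/kWh
print(f"natural gas (80%):   ${cost_per_kg(0.04, 0.8):.3f}/kg")  # assumed $0.04/kWh
```

Whether the humidifier saves money then comes down to whether the added comfort lets you set the thermostat low enough to offset roughly this cost per kilogram of water evaporated, plus the cost of the water itself.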
Entropy in information science and thermodynamics
What, if anything, is the relationship between the word "entropy" as used in information science (e.g., 2 bits of entropy per English letter) and entropy in thermodynamics? If nothing, why is it the same word? Historical coincidence? Or is there nevertheless something in common? Thanks. 178.48.114.143 (talk) 20:45, 9 December 2012 (UTC)
- Compare the formula for entropy from Entropy (statistical thermodynamics), $S = -k_B \sum_i p_i \ln p_i$, with the formula from Entropy (information theory), $H = -\sum_i p_i \log_2 p_i$. Dauto (talk) 21:20, 9 December 2012 (UTC)
- Can you be a little more vague or general, please? I understand that your answer is very precise, but I am neither a mathematician nor a physicist. I understand the latter formula because it's "obvious" to me, or intuitive, but not the thermodynamic version. Why is it not the same formula? Could you 'dumb it down a tad' for me? More importantly, can you explain the relationship between the concepts? Thanks. 178.48.114.143 (talk) 21:24, 9 December 2012 (UTC)
- The difference between the formulae is just cosmetic. Both define entropy as the negative of a sum, each term of which is a probability times its logarithm. Dauto (talk) 22:59, 9 December 2012 (UTC)
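A quick numerical check of that point (a toy sketch with an arbitrary example distribution):

```python
import math

p = [0.5, 0.25, 0.25]        # arbitrary probability distribution
K_B = 1.380649e-23           # Boltzmann constant, J/K

h_bits = -sum(pi * math.log2(pi) for pi in p)          # information entropy, bits
s_thermo = -K_B * sum(pi * math.log(pi) for pi in p)   # thermodynamic form, J/K

# The two differ only by the constant factor k_B * ln(2):
print(h_bits * K_B * math.log(2))   # ~1.4355e-23 J/K
print(s_thermo)                     # the same value
```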
- This doesn't help me at all. Are the units the same? Is the concept the same? What are we talking about, and what does heat have to do with WinZip, for example (i.e., higher-entropy files will be larger when zipped)? I get the latter concept and would like to know what, if anything, it has to do with thermodynamics. Thanks. 178.48.114.143 (talk) 23:15, 9 December 2012 (UTC)
- It is not a coincidence. Thermodynamic entropy came first, but the same concept was later applied to information. See Entropy in thermodynamics and information theory, which handles them both and describes the ways they do or don't overlap. One early and interesting example of the interrelation of thermodynamic and information principles is Leo Szilard's informational interpretation of Maxwell's demon and its implications for the second law of thermodynamics. --Mr.98 (talk) 23:43, 9 December 2012 (UTC)
- Guys, you're asking me to read these articles (I'd found the one you just linked already) and I'm saying it's too hard. Can you help? Thanks. 178.48.114.143 (talk) 23:52, 9 December 2012 (UTC)
- (I didn't mean to annoy you by posting a relevant link — we lack the ability to know what you have and have not read, if you do not tell us, and figured that if you had looked at the article, even uncomprehendingly, you'd have understood that the shared language was not one of historical coincidence — the very fact that there is an article seems to indicate that, I thought.) --Mr.98 (talk) 01:46, 10 December 2012 (UTC)
- In both cases, entropy is a precise way to compute something that corresponds to the "randomness" of the system. Randomness is sort of vague and poorly defined. In information-theory contexts, we might hand-wavingly say that random information content is less structured, or has less redundant/repeated information. In statistical physics or thermodynamics, we might say that random arrangements of molecules have more atoms flying around in more different directions. All of these hand-wavey imprecise statements leave a lot to be desired, because we can't compute values for such vague concepts. So, we define entropy, which is a convenient and precise way to express this sort of concept. And, as it happens, this equation-based definition is useful because we can relate it to other computed quantities. We have equations to relate entropy and temperature, in the case of thermodynamics. In information theory, we have equations to relate entropy and algorithm efficiency, or entropy to expected data loss rate, and so forth. Nimur (talk) 00:01, 10 December 2012 (UTC)
- This is very helpful. Are we saying that the entropy is LITERALLY the same concept, or an analogous concept? By literally the same concept, I mean: "air molecules on one side, vacuum on the other" takes fewer bits to describe than a highly mixed state, just like a file that is five million ones and then five million zeros takes very few bits to describe - it has little entropy. To me, entropy isn't "counter-intuitive" at all, because I always equate it with the size you get if you try to compress or describe it (e.g. WinZip). So is it literally the same concept - if you had a "replicator" and sent across a description of the system to be replicated perfectly, then you don't need very much information (it "zips" well) if it's all air molecules on one side and vacuum on the other (e.g. low temperature), but if you wanted to replicate it perfectly at a higher temperature or better mixed then you need more information? Is this LITERALLY the same concept (not just an analogy)? Or am I now going too far? 178.48.114.143 (talk) 00:45, 10 December 2012 (UTC)
- Ok, you asked for it -- here's a very hand-wavy discussion that is not rigorous or precise, but I think captures the general similarities. The Shannon notion of information (Information_theory#Entropy) equates information with randomness. The idea is that there is more "information" in a specific random string than in a specific string with a lot of structure. For instance, let X be the string of 50 ones followed by 50 zeros. This has low information, because it can be described much more succinctly, and we could transmit or store that shorter expression rather than the whole string. On the other hand, let Y be a "random" string of 100 ones and zeros, say Y=10101110000101000011110100101010100101001...1
- Y carries more information than X because we'd basically have to record every digit to faithfully transmit that string. Though I'm speaking qualitatively, we could work out specific quantities of entropy for X and Y by computing the formulae that are (tersely) quoted above. This discussion also matches up with the "entropy is a measurement of disorder" analogy in physics. In that case, X is similar to a box with air on one side and vacuum on the other, and Y is similar to a box of air at uniform pressure. So, the mathematical form of "order vs. disorder" in physics is also useful for describing "low vs. high information". Counter-intuitively, the "structured" case, X, corresponds to the low-information example. Another application of this same idea is to measuring biodiversity; see Shannon index.
- In short, we use the same word because they are mathematically the same thing. The differences are in the way that we interpret the math, in hopes of that interpretation being useful for solving a certain problem or discussing a certain field of science. SemanticMantis (talk) 00:02, 10 December 2012 (UTC)
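The "zips well" framing above can be made concrete with an actual compressor (a toy sketch; note that a plain symbol-frequency entropy estimate would rate X and Y identically, since both are half ones and half zeros, so the sketch uses zlib, which also exploits structure):

```python
import random
import zlib

x = b"1" * 50 + b"0" * 50                            # highly structured string
y = bytes(random.choice(b"01") for _ in range(100))  # "random" ones and zeros

# The structured string compresses to far fewer bytes than the random one.
print(len(zlib.compress(x, 9)), len(zlib.compress(y, 9)))
```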
- This is very helpful. See my question immediately above, at the same indent level as this. 178.48.114.143 (talk) 00:45, 10 December 2012 (UTC)
- Let me give you a very explicit answer this time. Yes, the two concepts are LITERALLY the same concept, but applied to different situations, and that's why the mathematical formulae are identical. Dauto (talk) 14:29, 10 December 2012 (UTC)
- Is my replicator analogy also 100.00% correct? Imagine you have a supernatural replicator that can make anything as described (by what comes over its communications link). Is it the case that to perfectly replicate a hot drink, you need more bits (more coming over the line) than to perfectly replicate a cold drink? I mean, imagine that you describe an x by x by x cold vacuum, and it replicates that for you. That takes almost no bits; it's like zipping an empty file. Then, as you replicate more and more complicated things, it takes larger and larger space to get all that to the replicator, even compressed. Now here is the question: IS THE THERMODYNAMIC ENTROPY IN AN OBJECT LITERALLY EQUIVALENT TO HOW MUCH ENTROPY ITS DESCRIPTION TO THE REPLICATOR CONTAINS? Please bear in mind that the question this time is whether this description is NOT hand-wavey but 100% perfectly rigorous. Is what I've just said technically, perfectly, rigorously true in a mathematical and physical sense? Thank you. 178.48.114.143 (talk) 19:09, 10 December 2012 (UTC)
- Note also that by introducing the supernatural replicator I have stopped having to talk about "concepts" and reduced the question to a yes-or-no truth value: how many bits must get to the replicator for a given object under optimal compression, and whether this is literally the same "number" as its thermodynamic entropy. There is no longer any question of "concepts" but instead the truth-value in a thought experiment. Thank you. 178.48.114.143 (talk) 19:12, 10 December 2012 (UTC)
- Yes, yes. The two concepts of entropy really are identical. The only difference is the choice of units. Information entropy is usually measured in bits or nats, while thermodynamic entropy is measured in joules per kelvin. That's why the formula for the thermodynamic entropy is multiplied by the Boltzmann constant, which is nothing more than a conversion factor between those choices of units. Dauto (talk) 20:45, 10 December 2012 (UTC)
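For concreteness, the conversion factor works out to (a worked figure using $k_B = 1.380649\times10^{-23}$ J/K, not a quote from the thread):

$$1\ \text{bit} = k_B \ln 2 \approx 9.57\times10^{-24}\ \mathrm{J/K}, \qquad 1\ \mathrm{J/K} \approx 1.04\times10^{23}\ \text{bits}.$$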
- Achievement unlocked! Thank you, Dauto, and others, for your answers. In fact I now feel (if it really is this simple :) ) that I have understood it very well. Forgive me if I seemed incredulous - the relationship seemed a lot more complicated before I asked this question :). Thanks for all your help! 178.48.114.143 (talk) 21:30, 10 December 2012 (UTC)
- You might find Entropy in thermodynamics and information theory interesting. Dauto (talk) 18:40, 12 December 2012 (UTC)
Chloromethane on Mars
The SAM instrument on Curiosity is a pyrolysis GC-MS with a relatively slow heating ramp of less than 50 °C/minute. They found chlorinated methane [3] in a sand dune on Mars (they chose it because it was expected to be the spot with only minimal organics). (Viking found the same stuff in a sand dune in the 1970s.) It is a very small amount. The other results make it relatively obvious that perchlorate is present. Perchlorate decomposes to oxygen and chlorine. The source of carbon might be Earth organics traveling with the rover to Mars, Mars organics, or inorganic carbon from carbonates.
Is it possible to get chloromethane from carbonates and perchlorate? Carbonates decompose and give off carbon dioxide, which at temperatures above 800 °C is linked to carbon and carbon monoxide through the Boudouard reaction (2 CO ⇌ CO₂ + C). I can come up with no good reaction forming that compound from carbonates. --Stone (talk) 23:34, 9 December 2012 (UTC)
- Neither can I, and I'm very familiar with these types of reactions. 24.23.196.85 (talk) 00:05, 10 December 2012 (UTC)
- I can't answer the question directly, but chloromethane is known to be produced by several species of phytoplankton here on Earth. It's also seemingly becoming clear that liquid water once existed on Mars, and, as per [4], it's looking more certain than ever that life once existed on Mars, so it's possible the chloromethane they found could be organic in origin. douts (talk) 22:14, 11 December 2012 (UTC)
- Fixed link. Evanh2008 (talk|contribs) 22:39, 11 December 2012 (UTC)