Wikipedia:Reference desk/Archives/Science/2014 December 20
Welcome to the Wikipedia Science Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.
December 20
Tiny islands
Google Maps (I have no real data to back this up) makes it look as if there are many mountains in the sea of which just the top is above sea level. So if the sea level went up a little, say 300 ft, many of them would disappear, whereas if it went down a little, again 300 ft, far fewer new islands would appear. Is that true, and if so how come? Erosion perhaps? Joepnl (talk) 00:04, 20 December 2014 (UTC)
- Yes, there are lots of different types of islands. Some are newly created by undersea volcanic activity, others are the remains of eroded rock, and yet others are mountaintops from before the last post-glacial rise in sea level. See New islands and Sea level#Changes through geologic time for a few details. I'm sure some experts here can add further to this reply. Dbfirs 00:14, 20 December 2014 (UTC)
- Other relevant articles include seamount and guyot.--Jayron32 00:56, 20 December 2014 (UTC)
- (edit conflict) When a new volcanic island forms, a thin ring of coral forms around the edges. The volcano becomes extinct (often because the crust carries it away from the mantle hotspot that created it), and the mountain erodes and/or subsides, leaving behind a coral reef on top of a submerged mountain. The corals cannot live where it's too dark (tens of feet or meters, it's one of those, I don't remember) and will die if they're not wet. But no problem: if they die from depth, new ones build their exoskeletons on top of the dead ones until they reach tide level and can't grow up any more, so they are always right below sea level if they've been there enough millennia. 100,000 years ago the sea level was 20 feet higher than now, during one of the hottest periods in millions of years and the highest sea level in at least 400,000 years, maybe millions (I don't remember). (Though I think this would happen (if not double) if we burned the carbon till 2100 or so and then waited centuries for the ice to stop melting, so it was not especially hot by 2014 standards.) 100 kyr is short enough that the coral reefs killed by the low sea level of the last ice age have not had time to erode away yet and are still there as islands. Sagittarian Milky Way (talk) 01:27, 20 December 2014 (UTC)
- Also, it is often in the nature of land to be barely above sea level (islands created by waves and freak 5,000-year hurricanes, like Coney Island and the better-known Atlantic City island; river deltas; river islands; coastal wetlands). Those don't look like mountains on Google Earth, though. Sagittarian Milky Way (talk) 01:38, 20 December 2014 (UTC)
- If the surface of the Earth were randomly bumpy, the greatest number of islands would exist when the sea covered half its surface (for a proof of this, ask at the math desk). If the sea level sank, so that more than half of the surface was land, more of the random islands would become part of the mainland. If the sea level rose so that more than half the random surface was covered with water, more islands would sink below the surface.
- The Earth has two great differences: the ocean covers 71 percent of it, and its surface is far from random. So the question is an empirical one which I can't answer. But see Zealandia and Kerguelen Plateau for interesting reading. μηδείς (talk) 04:18, 20 December 2014 (UTC)
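The "randomly bumpy surface" thought experiment above can be explored numerically. This is a rough sketch (grid size, seed, and the choice of uncorrelated noise are all arbitrary assumptions): generate a random heightfield, then flood-fill to count connected patches of land at various sea levels. Note that where the maximum island count actually falls depends heavily on how smooth the surface is, so a toy model like this doesn't by itself settle the half-coverage claim.

```python
import random

def count_islands(height, sea_level):
    """Count connected components of cells above sea_level (4-connectivity)."""
    n = len(height)
    seen = [[False] * n for _ in range(n)]
    islands = 0
    for i in range(n):
        for j in range(n):
            if height[i][j] > sea_level and not seen[i][j]:
                islands += 1
                stack = [(i, j)]       # iterative flood fill of this island
                seen[i][j] = True
                while stack:
                    x, y = stack.pop()
                    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        nx, ny = x + dx, y + dy
                        if (0 <= nx < n and 0 <= ny < n
                                and height[nx][ny] > sea_level
                                and not seen[nx][ny]):
                            seen[nx][ny] = True
                            stack.append((nx, ny))
    return islands

random.seed(42)
n = 80
terrain = [[random.random() for _ in range(n)] for _ in range(n)]
for level in (0.2, 0.5, 0.8):
    print(f"sea level {level}: {count_islands(terrain, level)} islands")
```

With heights drawn uniformly from (0, 1), a sea level of 0.0 leaves one connected landmass and a sea level of 1.0 leaves none; intermediate levels fragment the land into many islands.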
- If Venus were terraformed and given an ocean (which would probably have to be made of comets), it would likely have almost 50% more land by percentage than Earth, and many more islands (Venus has only two continents). If Mars were terraformed it would likely be only 33% water, in one ocean, and have few islands. Go figure. Sagittarian Milky Way (talk) 16:29, 20 December 2014 (UTC)
- You seem to be assuming arbitrary water levels? Wnt (talk) 17:38, 20 December 2014 (UTC)
- @Wnt, if anyone still cares, I was assuming that staying indoors when the weeks-long days cause jet lag would be annoying enough; they wouldn't want a depressing near-constant heat-shielding cloud on top of that (even if all the comets or moons hit on the side that increases spin, the day will be long). So you would want to avoid deserts caused by too-large landmasses. Deserts next to water cause the highest heat indexes on Earth. Thin air would be good for reducing the average temperature of high latitudes, as is the low axial tilt, but a lot of water would be needed to reduce the day-night temperature differences. So you might as well just keep Ishtar Terra and Aphrodite Terra above water. I'm not sure if the tropical one would be habitable; maybe to genetically engineered rainforests? I would want low albedo = high land on Mars, and the thick greenhouse effect of extra GHGs should trap heat and keep the continentality from getting extreme. It also avoids wasting land and fills in the ocean to its original beachline. See: File:Mars_topography_(MOLA_dataset)_with_poles_HiRes.jpg and Oceanus Borealis. Sagittarian Milky Way (talk) 01:47, 23 December 2014 (UTC)
Were blacksmiths losing their hearing?
I'm not sure whether I'm in the right section of the RD; my question falls somewhere between history, trade and medicine. Traditional blacksmiths of the past were exposed to constant metallic noise throughout their lives, but did that have any impact on their hearing? Were they losing their hearing during their lifetimes? Do blacksmiths use any protection for their ears today?--Lüboslóv Yęzýkin (talk) 08:47, 20 December 2014 (UTC)
- In the introduction to An Inevitable Consequence: The story of industrial deafness (if you google that title you will find an online PDF copy) Dick Bowdler writes "There are references going back over several hundred years to the fact that some noisy occupations - in particular those involving the hammering of metal - will cause permanent deafness or tinnitus. Tinsmiths in the middle ages had “ringing in the ears”. But the first authoritative reference was in 1831 when Dr Fosbroke, writing in The Lancet, states that "Blacksmith's deafness is a consequence of employment.” And yes, blacksmiths today use hearing protection [1][2] Richerman (talk) 11:13, 20 December 2014 (UTC)
- See Google Books for Fosbroke's original article, and here for Bowdler's paper. Fosbroke cites Daniel Sennert (in Latin) for his own historical reference on the subject. Tevildo (talk) 11:17, 20 December 2014 (UTC)
- Thank you both! My grand-dad (unfortunately he died long before my birth) was both a skilful blacksmith and a talented musician who played the button accordion. I wonder how these two activities could be combined. Probably he played music deliberately as compensation for his noisy work. It's interesting how many blacksmith-musicians there were (and are).--Lüboslóv Yęzýkin (talk) 00:10, 21 December 2014 (UTC)
- So, after a while a blacksmith's hammers and anvils would wear out ? :-) StuRat (talk) 00:31, 21 December 2014 (UTC)
Is not keeping bread in a fridge really a good piece of advice?
It is commonly stated (including in the WP article: Staling) that keeping bread in a fridge makes it go stale (or more precisely: speeds up the staling process).
This contradicts my own experience: I keep my bread in a fridge at between 0 and 5 Celsius. I keep it in the fridge to stop it going mouldy (or more precisely: to slow down the going mouldy process) and I have not experienced my bread going stale, (even if I keep it long enough to start going mouldy even in the fridge).
Some possibilities:
- it depends on the type of bread (though the advice I constantly come across does not refer to particular types of bread),
- the staling referred to is not something that bothers me.
I have come across one person checking for themselves [3] , but they used white baguette loaves from a local bakery, which is probably not the most commonly consumed type of bread (in the UK, and many other countries, at least).
The type of bread I am referring to as not seeming to go stale in the fridge is cheap supermarket own-brand, medium sliced, wholemeal, in a plastic bag with no holes, bought in the UK. Ingredients: Wholemeal Wheat Flour, Water, Yeast, Salt, Spirit Vinegar, Emulsifier (Mono- and Di- Acetyltartaric Esters of Mono- and Di-Glycerides of Fatty Acids, Sodium Stearoyl-2-Lactylate), Soya Flour, Rapeseed Oil, Preservative (Calcium Propionate), Palm Fat, Flour Treatment Agent (Ascorbic Acid).
I notice that some of these ingredients are mentioned as anti-staling agents in the Staling article.
Why does this issue matter? People might be wasting bread because they believe the advice about not keeping bread in a fridge, and thereby having it go mouldy. (Advice is often given that bread can be frozen instead, and not go stale or mouldy, but this is less convenient and so less likely to be done.)
Note that this is not a question about why refrigerating bread makes it go stale (I have found copious information about that), but whether it is really true and a significant effect for all types of bread, and therefore whether saying not to put bread in the fridge, without further qualification, is a good piece of general advice. FrankSier (talk) 12:58, 20 December 2014 (UTC)
- I'm from the UK and my family doesn't keep bread in the fridge - not because of anything we've been told, but simply because we have a bread bin, and the fridge has other things in it which need to be kept in there. We do, however, keep bread in the freezer, then bring it out when we need it (thawing it out first, of course), and that has never affected the bread in any way. I think this idea of bread going stale when put in the fridge comes from people putting sandwiches in the fridge (sometimes half eaten), so, being exposed to the air, they will go stale (whether they are in the fridge or not). Putting cling-film over them helps to preserve the bread, in my experience. KägeTorä - (影虎) (Chin Wag) 17:52, 20 December 2014 (UTC)
- @FrankSier: The Staling page is not well referenced, having only one footnote to a ten-year-old book, though the first rather technical external link is downloadable for free. It does say bread "... stales most rapidly at temperatures just above freezing". Perhaps the chemical composition of bread has changed in the last ten years, and this is no longer accurate? Then again, a quick Google turned up heaps of sites saying that bread does indeed go stale faster if stored in a fridge! By up to six times! [4]
- It certainly appears that the prevailing advice is to store bread at 'room' temperature. (or freeze it)
- Comment on mold. Totally personal OR, avoid touching the bread with your hands and you will get far less mold. You might want to try it, get a piece and put your thumb on it, then leave it to 'moulder'. There is a very high likelihood that you will get a thumb shaped patch of green.
- More OR: there are many types of bread, and I have found that some types (IIRC unsliced wholegrain) are more resistant to going stale (and to mould), though often the mould gets the bread before the 'stale' (especially in humid summer weather, as here in Australia). Ǝ 220 of Borg 18:03, 20 December 2014 (UTC)
- Wikipedia has an article on this! Retrogradation (starch) explains the temp effect. The solution is to buy that white plastic foam stuff from the mall that comes in plastic bags, often mislabeled as bread. It has additives to ensure that it remains as tasteless and bland as the day you purchased it. Enjoy. --Aspro (talk) 20:10, 20 December 2014 (UTC)
- Note that the type of refrigerator also makes a difference. Cheap frost-free freezer/fridge combo units periodically heat the freezer to drive off the frost, and that might affect the fridge compartment temperature, too. You want a stable temperature to retain moisture. Constantly changing the temp will tend to cause water migration, causing bread to either get stale or soggy (and probably moldy, too). StuRat (talk) 20:28, 20 December 2014 (UTC)
- Linguistic tangent here — to the best of my recollection, this is the first time I have ever encountered the word "staling". Is it technical jargon among food scientists? Or maybe a UK thing? --Trovatore (talk) 20:15, 20 December 2014 (UTC)
- Same here (Detroit). I would say "going stale". StuRat (talk) 20:24, 20 December 2014 (UTC)
- Don't know if any of you folks have heard of a little amateur project called Wikipedia but they have an article on Staling :-¬ ) P.S. Stu, you forgot a “/” before 'small' in your last post. --Aspro (talk) 20:37, 20 December 2014 (UTC)
- Thanks, but our article doesn't discuss the linguistic aspects of the term. I would guess "staling" is UK-English, while "going stale" is US-English. StuRat (talk) 21:05, 20 December 2014 (UTC)
- How about verbification then? Or the American habit of turning a noun into a verb in order to avoid having to learn correct English grammar.--Aspro (talk) 21:24, 20 December 2014 (UTC)
- No, StuRat, the verb "to stale" does not exist in British English with that meaning. I thought it must be American when I saw the article. Staling in British English means putting rungs on a ladder (or possibly urinating if you are talking about horses). Dbfirs 22:20, 20 December 2014 (UTC)
- Then what is it, Australian English ? Wiktionary lists it, but not where on Earth it's used: staling. StuRat (talk) 22:27, 20 December 2014 (UTC)
- I was wrong to say that it doesn't exist, but it's certainly not used in modern British English. The OED has a sense "To grow stale; get out of fashion, become uninteresting" with a couple of cites from the nineteenth century. I'm puzzled to understand why Wikipedia has an article that doesn't use modern English. Merriam-Webster has "to become stale" but, from the reactions above, I deduce that the verb is as rare in America as it is in the UK. Google Books seems to indicate that the verb is used of bread by food scientists, though many of the authors have non-English surnames. A couple of authors with English surnames have published books in America with this usage. Google ngrams seems to indicate that the word is used in both American and British English, but is similarly rare in both. Perhaps it's just restricted to food science? Dbfirs 22:50, 20 December 2014 (UTC)
- I suspect that the reason for the article name is that the alternative "Going stale" is an unusual form for an article name. StuRat (talk) 23:21, 20 December 2014 (UTC)
- Staleness would be a much better article name, IMO, and the redirect already exists. I was only aware of the equine usage until now - thanks to Dbfirs for informing us of the ladder usage. Should this go to WP:RM? Tevildo (talk) 23:44, 20 December 2014 (UTC)
- Ironically, the ladder usage is not the latter usage. StuRat (talk) 23:49, 20 December 2014 (UTC)
- "Staleness"? don't be silly. Hale is to health as stale is to stealth. μηδείς (talk) 00:34, 21 December 2014 (UTC)
- ...and wale is to wealth ? :-) StuRat (talk) 00:40, 21 December 2014 (UTC)
- Don't be silly, that is as weal is to wealth. μηδείς (talk) 05:40, 21 December 2014 (UTC)
- @Aspro: Our OP actually mentions the Staling page in their post, about 'para' 6. Concur about the mislabelled 'foam'. A major supermarket here (Oz) got taken to court for advertising their 'foam' as "fresh bread" when it had actually been partly baked up to 6 months before! [8] [9]. 220 of Borg 05:22, 21 December 2014 (UTC)
- FrankSier On the "Bread and the technology of bread production" webpage, 'Section 3.3. Staling of bread' says
- "Storage temperature is an important factor to be considered in any discussion of bread staling. Staling becomes more rapid as the temperature of storage is reduced from room temperature to 35°F. Below 35°F., staling becomes slower as temperature is lowered, until at 0°F. [frozen] it is very slow, and bread products will keep for months without apparent staling." [10] www.classofoods.com
- N.B. 35 °F = 1.67 °C. So I hope that helps, though it apparently contradicts a lot of websites and the earlier quote "... stales most rapidly at temperatures just above freezing". Keeping your refrigerator at a constant, say, 1.0 °C is likely to be a problem, which might be why people conventionally say not to put bread in the fridge to keep it 'fresh'. Likely it will either be slightly too warm or freeze. The page also gives technical descriptions of the staling process that may be easier to follow than some of the scientific papers I linked to earlier. 220 of Borg 05:22, 21 December 2014 (UTC)
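For reference, the Fahrenheit thresholds quoted above convert to Celsius as follows (taking 68 °F as a nominal room temperature is my own assumption):

```python
def f_to_c(f):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (f - 32) * 5 / 9

for label, f in [("very slow staling (frozen)", 0),
                 ("fastest staling", 35),
                 ("nominal room temperature", 68)]:
    print(f"{label}: {f} °F = {f_to_c(f):.1f} °C")
```

So the quoted fastest-staling point of 35 °F is about 1.7 °C, squarely inside the 0-5 °C range of a typical fridge.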
A to Ds
What sort of A/Ds are used in GHz-sampling-rate digital scopes?--86.169.152.43 (talk) 13:31, 20 December 2014 (UTC)
- Fast ones? The usual suspects (Analog Devices, Maxim, TI) all sell ADCs in that sort of range. The exact model used in a particular scope will generally be commercially sensitive. Tevildo (talk) 19:34, 20 December 2014 (UTC)
- David L. Jones does a teardown of two Agilent scopes in these YouTube videos - the 3000 series and the incredibly expensive 90000 series. You can search his site for teardowns of different scopes and other pieces of test and analysis equipment. -- Finlay McWalterᚠTalk 19:46, 20 December 2014 (UTC)
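As background to the question: scope makers typically spec the sample rate at several times the analogue bandwidth, and GHz-class sampling is often achieved by time-interleaving several slower ADC cores. A back-of-envelope sketch, where the 2.5x oversampling factor and the per-core rate are illustrative assumptions rather than any particular vendor's numbers:

```python
import math

def min_sample_rate(bandwidth_hz, oversample=2.5):
    """Rule-of-thumb sample rate for a digital scope: Nyquist needs more than
    2x the analogue bandwidth, and real scopes leave headroom for the
    reconstruction filter (the 2.5x default here is an assumption)."""
    return bandwidth_hz * oversample

def interleave_count(target_rate_hz, per_core_rate_hz):
    """Number of time-interleaved ADC cores needed to reach target_rate_hz."""
    return math.ceil(target_rate_hz / per_core_rate_hz)

bandwidth = 1e9                          # a hypothetical 1 GHz scope
rate = min_sample_rate(bandwidth)        # 2.5 GS/s total
cores = interleave_count(rate, 1.25e9)   # with hypothetical 1.25 GS/s cores
print(f"{rate / 1e9} GS/s total, {cores} interleaved cores")
```

Interleaving trades one fast converter for several slower ones, at the cost of gain, offset, and timing mismatch between the cores that the scope must calibrate out.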
Could fat tissue be a beneficial cancer?
Maybe the ancestral version of the fat cell was a more harmful cell, and it evolved to be more beneficial? Thanks. 2601:7:6580:5E3:7CBD:D2E0:7058:C21D (talk) 18:00, 20 December 2014 (UTC)
- Neigh be my answer to that possibility. Why would you think that ? StuRat (talk) 20:30, 20 December 2014 (UTC)
- Fat cells aren't an infection; they are cells which develop normally in various differentiations from the body's stem cells, originating with a single fertilized egg. See also cell type, lipoma, and liposarcoma. Under the article cell type, expand the mesoderm section; you will see that fat cells (Lipoblast → Adipocyte) and muscle cells differentiate from the same progenitor as bone and cartilage cells. μηδείς (talk) 20:51, 20 December 2014 (UTC)
- As explained in the article, adipocytes famously don't replicate under most circumstances, including weight gain. This actually puts them at the far end of the spectrum from cancer cells that undergo uncontrolled replication. (of course, liposarcomas do manage to replicate, or they wouldn't be cancer) Wnt (talk) 23:53, 20 December 2014 (UTC)
- Adipose tissue secretes estrogen, which appears to be a risk factor for a number of cancers. I believe overweight people are more susceptible to these. --AboutFace 22 (talk) 23:30, 21 December 2014 (UTC)
Could slow-growing cases of prostate cancer be beneficial in some cases?
Note: I'm not asking whether a gene that causes or predisposes to prostate cancer also has beneficial effects. I mean, does the cancer itself help in some way, for example by kicking in an increased ability of sperm to impregnate? I suspect a counterargument will be that it occurs at an advanced age that, until the last few thousand years, we didn't survive to, so we haven't had time to evolve to make it serve a purpose. But it could have been triggered at a younger age back when we lived less long. Thanks. Rich (talk) 18:23, 20 December 2014 (UTC)
- I doubt that, but I've noticed that organs which change their function during our lives are quite prone to cancer and/or benign tumors. This would include our reproductive organs and female breasts, which change during puberty and again during pregnancy. They all have a built-in design to "wait until you get the signal, then grow rapidly", so it's not that surprising that the "wait until you get the signal" part gets ignored at times. StuRat (talk) 20:36, 20 December 2014 (UTC)
- The genes in prostate tissue that are blamed for cancer are the result of natural selection, thus likely (though not absolutely certainly) beneficial. (An example of why they wouldn't be is if environment has changed or if there has been rapid selection for change at some other site in the genome that makes their present form maladaptive) Now as prostate cancer runs through the gamut of the Gleason Grading System, it gradually goes from being pretty much harmless (except by implication of what will happen) to outright dangerous; this requires mutations that benefit the individual cells but are definitely not part of the starting genome. You can argue that the genetic code that makes such mutations possible is also selected for; but whether that makes these mutations 'part of the plan' is sort of a philosophical question. Like a lot of theoretical questions in biology, it seems to become something of a mirage the more closely you look at it. Wnt (talk) 00:00, 21 December 2014 (UTC)
- Re: "The genes in prostate tissue that are blamed for cancer are the result of natural selection, thus likely (though not absolutely certainly) beneficial." By that logic most genetic diseases would be beneficial. The sickle-cell gene is one of the few where this seems to be the case.
- Mutations naturally occur, some of which cause genetic diseases. If they prevent reproduction, then they won't be passed on, but the same mutation can always recur. Since prostate cancer often occurs late in life and is fatal, if ever, even later still, there wouldn't be much evolutionary pressure to prevent it. StuRat (talk) 00:06, 21 December 2014 (UTC)
- What's the top of the Gleason scale labeled ? "To the Moon, Alice". :-) StuRat (talk) 00:08, 21 December 2014 (UTC)
- Well, for example, metformin appears to reduce the risk by turning down the production of c-MYC protein. [11] The prostate could have evolved to have lower levels of c-MYC on its own, without prompting. Now, does that mean that there is a compensating advantage, maybe the man produces more nutritious semen that puts more of a spring in the step of his spermatozoa? Or does some aspect of the modern diet of processed food or endocrine disruptors have toxicity that metformin reverses by the same means as it tends to oppose high blood sugar? Or is it some unintentional negative aspect of the continual rapid evolution of sperm in competition with other males? Well, if I looked it up harder on PubMed I could likely find out, but for purposes of discussion here, the point is, it could be a number of things. The organ doesn't come ready-made to be cancer-proof as its only priority, or else prostate cancer would be a lot less common - and by extension, it is possible to look for treatments that will reduce the risk, though there may be unintended side effects. Wnt (talk) 00:33, 21 December 2014 (UTC)
Electrical wiring – wrapping electrical tape around switches and receptacles
Is wrapping electrical tape around switches and receptacles (inside electrical boxes) required, recommended, or prohibited? I've seen wiring done either way, i.e. with and without taping. I've done some searching; it seems that even among electricians there's no clear-cut answer. Many seem to be of the opinion that it's unnecessary, but some say it may be beneficial sometimes. Does the National Electrical Code say anything about the practice? What is the common/recommended/prescribed practice in other parts of the world? Thanks. --173.49.11.192 (talk) 19:44, 20 December 2014 (UTC)
- Visitors to the US often suffer a culture shock. On the one hand, everything's all hi-tech; on the other hand, the electrical practices and mains power quality are third-world. This is a UK site on DIY: Electrical connection--Aspro (talk) 20:26, 20 December 2014 (UTC)
- It seems like it could be counter-productive, to me. Specifically, it could act as a thermal insulator, allowing the wires to overheat, especially if there's an intermittent connection inside the box. Also, I don't expect electrical tape would last for decades, and once it gets old, then what do you do, open up all your walls and replace it ? StuRat (talk) 20:41, 20 December 2014 (UTC)
- ??? There is no thermal insulation issue at all. If you have an intermittent connection you have bigger problems, the tape certainly won't hurt. And yes, the tape will last for decades, but it doesn't matter - the tape is really only for the 5 minutes when you put the outlet back in the wall, and you are working live, otherwise you don't need it. Ariel. (talk) 20:19, 21 December 2014 (UTC)
- I disagree with you there. Especially in old homes, they specifically tell you not to add blown in insulation, as this can cause the wires to overheat. Wrapping electrical tape around everything is sure to add some thermal insulation, even if just by blocking air flow. And yes, if you are aware of an intermittent connection, you should have it repaired, but not everybody is aware in time to prevent a fire, or can afford to have it fixed. (I have an intermittent connection in the bathroom overhead lights of my 1920's house, and, not wanting to pay an electrician hundreds of dollars to rip open my stucco ceiling to make repairs, I just disabled them at the wall switch and put a floor lamp in that room.) StuRat (talk) 15:38, 22 December 2014 (UTC)
- The US home improvement show This Old House has a video about this. The wrapping with electrical tape is shown beginning at minute 2. I think the thermal insulation issue would be minimal for the method shown in the video; it isn't "wrapping electrical tape around everything". Personally, I think I'd rather have my circuit breaker trip if some stray piece of metal touched one of the hot screws, rather than having a single layer of old electrical tape provide partial protection. Jc3s5h (talk) 16:30, 22 December 2014 (UTC)
- "Working live", as in working with wiring while the power is on? That seems incredibly reckless. Sjö (talk) 09:43, 25 December 2014 (UTC)
- I'll second Aspro's answer. The question amazes me. Required? No. Recommended? Not in this neck of the woods. Prohibited? Possibly, though there are things that are so ridiculous that no-one will contemplate making laws explicitly stating that they are prohibited. Dangerous? Hmm, it depends. I'd guess that it usually isn't, except when it is. Clearly difficult to regulate. Third world? Definitely. --NorwegianBlue talk 21:07, 20 December 2014 (UTC)
- Thanks for the replies so far. About the comment that US electrical practices look third-world to visitors, I'm not sure about that. People here are not dying in droves in electrical fires or of electrocution. I take that as empirical evidence that the electrical practices here are at least reasonable. On the comment that there are things so ridiculous that no-one would contemplate making laws explicitly against them, my belief is that if something is ridiculously dangerous, it will be prohibited under some general rules, if not detailed, specific ones. --173.49.11.192 (talk) 22:10, 20 December 2014 (UTC)
- Yes, beyond a certain point more safety regulations may actually make things less safe. For example, if you require smoke detectors that are so sensitive they go off every time you cook, then people will disable them and be less safe. StuRat (talk) 22:22, 20 December 2014 (UTC)
- "people are not dying in droves in electrical fires"...oh, come *on*. It took me about 60 seconds to fact-check your comment (something which you should have done before making it). The ESFI report for 2014 says: "In the United States, 50,900 fires each year are attributed to electrical failure or malfunction, resulting in 490 deaths and 1,440 injuries. Arcing faults are a major cause of these fires."...is 490 per year a 'drove'? I think so. In the UK, there were 25 deaths due to electrical fault fires in 2012. The US population is five times larger than the UK's, but still, that suggests that deaths in house fires due to electrical faults are nearly four times more common in the USA than the UK...and that's despite the fact that the US supply is only 110 volts rather than 240 volts. So, the US standards clearly aren't "reasonable" - I think the word "terrible" would be a better choice! Sorry, User:173.49.11.192, you could not be more incorrect. SteveBaker (talk) 00:10, 21 December 2014 (UTC)
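A quick check of the per-capita arithmetic, using only the figures quoted above (490 US deaths, 25 UK deaths, and the US population taken as five times the UK's):

```python
us_deaths, uk_deaths = 490, 25
population_ratio = 5  # US population ≈ 5x UK, as stated above

# Scale the US figure down to a UK-sized population, then compare rates.
per_capita_ratio = (us_deaths / population_ratio) / uk_deaths
print(f"US electrical-fire deaths ≈ {per_capita_ratio:.1f}x the UK per-capita rate")
```

The figures quoted give a per-capita ratio of about 3.9, i.e. just under four times the UK rate.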
- There could be other reasons for the difference, such as a larger portion of people living in poverty in the US, who can't afford to keep the gas on, so end up using iffy electrical space heaters, instead. I would guess deaths from kerosene heaters are also higher in the US, for the same reason. StuRat (talk) 00:36, 21 December 2014 (UTC)
- There's also a much higher occurrence of wooden or wood-shingle construction in the US (at least from my familiarity with CA and MA homes) than the UK (where essentially no-one lives in a wooden building). Postulating the same incidence of electrical fires between the two, you would expect a higher death rate among the people in the wood buildings than the people in the brick buildings. So a higher death rate from electrically-started fires does not necessarily indicate more dangerous electrics, but may indicate more inflammable houses. -- Finlay McWalterᚠTalk 01:11, 21 December 2014 (UTC)
- I didn't look up the statistics when I made my comment. If your figures are correct, the UK is doing significantly better than the US in terms of preventing electrical fire deaths. However, and not to make light of the personal tragedies behind the statistic, I won't call 560 deaths per year among a population of 300+ millions "dying in droves". By comparison, the number of deaths from road accidents dwarfs that figure, and the risk of road accidents is something that people routinely tolerate. --173.49.11.192 (talk) 00:58, 21 December 2014 (UTC)
- Well, it's a matter of cost-effectiveness. Making cars significantly safer is usually difficult & costly compared to the cost of a car. When we do find a measurable improvement (such as rear-view cameras, and tire pressure sensors), the law does change to force those improvements to become universal. Mandating improved house wiring standards would make an utterly negligible percentage difference to the cost of a house. That said, have you heard the kerfuffle about faulty car airbags - which have only caused maybe 6 deaths and 130 injuries? Government is holding inquiries and all manner of remedies have been proposed. But everyone is silent about the 4x larger death rate from electrical faults. It really doesn't make a whole lot of sense. SteveBaker (talk) 01:56, 21 December 2014 (UTC)
- I think the airbag deaths are considerably higher, they just are going slow on "processing" the claims so it doesn't seem as bad as it is. And a safety device blasting shrapnel into your face seems a lot worse than one which simply fails to stop a fire from spreading. The first is causing death or injury, while the second is only failing to prevent it. StuRat (talk) 07:24, 21 December 2014 (UTC)
- I thought this discussion was about poor electrical practices leading to faults which may cause fires? When did it become about stuff which fail to stop a fire from spreading? Nil Einne (talk) 12:23, 21 December 2014 (UTC)
- I was referring to the wood construction of houses in the US, with no corresponding safety feature required, like sprinklers, which allows electrical fires to spread and become more dangerous. This was part of my explanation for why exploding airbags are more of a concern to the public than house fires. StuRat (talk) 15:27, 22 December 2014 (UTC)
- It was suggested above that 110 V is safer than 240 V. This may be true for the electrocution risk, but the reverse is true for the fire risk. Fewer volts means more amps to deliver the required watts. More amps means more heating of the wiring. That's why in the UK ELV (12 V) halogen downlighters are more strictly regulated than 240 V ones. --catslash (talk) 16:32, 21 December 2014 (UTC)
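The point above can be made concrete with Ohm's-law arithmetic: for the same delivered power through identical wiring, resistive heating scales with the square of the current. The 2 kW load below is a hypothetical example; the voltage figures are the ones quoted in the thread.

```python
def current_amps(power_w, volts):
    """Current drawn by a resistive load of power_w watts at the given voltage."""
    return power_w / volts

def i2r_heating_ratio(power_w, v_low, v_high):
    """How much more the supply wiring heats (I^2 * R, same R) when the same
    load power is delivered at v_low instead of v_high."""
    return (current_amps(power_w, v_low) / current_amps(power_w, v_high)) ** 2

load = 2000  # a hypothetical 2 kW space heater
print(f"{current_amps(load, 110):.1f} A at 110 V, "
      f"{current_amps(load, 240):.1f} A at 240 V, "
      f"{i2r_heating_ratio(load, 110, 240):.1f}x more wiring heat at 110 V")
```

The ratio is (240/110)² ≈ 4.8, independent of the load power, which is why lower-voltage supplies need heavier conductors for the same load.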
- Steve, you are mixing old installations with new. The new boxes and outlets are very safe, but the US has a lot of old houses, and it's mostly those that have issues. If everyone had the money to upgrade there would be far fewer problems. So it's not that people ignore it - it's that it takes a long, long time for upgrades and improvements. Ariel. (talk) 20:34, 21 December 2014 (UTC)
- Going back to the original question, does anyone know if the NEC has anything to say about the practice of wrapping electrical tape around switches and receptacles? --173.49.11.192 (talk) 17:08, 21 December 2014 (UTC)
- It's completely unnecessary, but many people do it anyway. The worry is that if the outlet moves in the wall it might contact the box and short (or, if you want to work on it live, it's easier if it's covered). With old boxes, the threads for the screws are sometimes worn, so the outlet might move and touch the box; it's definitely not necessary with new boxes. If you plan to work on it live, then you need the tape when you put it back in the wall. In short: if you are working live, do it. If not, then with new stuff don't do it. If you have 100-year-old electrical boxes, consider it. Ariel. (talk) 20:19, 21 December 2014 (UTC)
- My understanding regarding fires in American homes due to electricity is that they are in no small part caused by the abortive switch from copper to aluminum wiring (and back to copper) in the second half of the last century, leaving homes with hybrid wiring; corrosion at the copper-aluminum junctions raises the resistance there, causing heating and fires. This article mentions the issue.
- (An addition after the thread was archived): I learned from electricians in the US to wrap switches and outlets with electrical tape. One reason is that when painting the wall, the cover plate of the outlet or switch is likely to be removed, and without the tape, live metal parts are exposed, creating a shock hazard. Paint bridging from the live part to the metal surround may bubble. Switches and outlets are not always rigidly screwed down against the box, creating the possibility they could swing sideways a bit and create a short to the (hopefully) grounded box. As pointed out above, it is safest to kill the power when replacing a defective switch or outlet, but at times people work hot with 120 volt circuits, and if there is no tape around the hot terminals, it is likely there will be a short when the device is pulled out of the box to replace the switch/outlet. If air circulation is needed for cooling around the spot where the screw secures the wire, then there is some severe problem with a loose connection or oxidized conductor, such as dirty oxidized copper or even very dangerous aluminum branch conductors. Edison (talk) 19:48, 30 December 2014 (UTC)
Why does running water cause the need to urinate?
Why does the sound of running water cause a person to need to urinate? Or is that just an urban legend? Thanks. Joseph A. Spadaro (talk) 23:50, 20 December 2014 (UTC)
- I think it's real. See mirror neuron. (A zoo moved its water fountain by the bathrooms and added a plaque saying it was relocated there "due to its inspirational effect on small children".) As for why this might have evolved in humans, a group of people traveling on foot would stay together and make better time if they all stopped to urinate at the same time. They would also be harder to track by predators, such as wolves, than if they left small puddles of urine every mile. StuRat (talk) 23:52, 20 December 2014 (UTC)
- You might find your answers in our article Classical conditioning. I have the same problem when someone posts a question about diets and it reminds me that it's time to make sure that the food in the fridge is not getting out of date (well, that's my excuse), and if someone asks about sex, the wife suddenly remembers she promised to pop over and see her mother.--Aspro (talk) 00:01, 21 December 2014 (UTC)
- OR alert! - In later life I have found that when I run the tap to do the washing up I usually need to go for a pee - maybe it's to put off doing that boring job! I used to think it was just the power of suggestion until a veterinary friend of mine told me that the best way to get a cow to urinate, so you can get a urine sample, is to run a tap. Richerman (talk) 01:10, 21 December 2014 (UTC)
- There's also scent marking in other mammals, such as dogs, which try to urinate to stake their claim whenever they detect urine from another animal. Perhaps we inherited a remnant of that behavior. StuRat (talk) 02:47, 21 December 2014 (UTC)