
Wikipedia:Reference desk/Archives/Science/2009 October 9

Science desk
Welcome to the Wikipedia Science Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


October 9


Convert atmospheric greenhouse gases to flood insurance rates


I'd like to see a series of graphs, or maybe I want an applet that can produce them, describing what effect changes like "20% less greenhouse gases by 2020" would have on actual flood insurance rates. Where does one start for that sort of thing? 99.62.187.28 (talk) 04:59, 9 October 2009 (UTC)[reply]

I don't think that there is any such document. A 20% reduction in greenhouse gases by 2020 is practically impossible. A 20% reduction in new emissions by 2020 is possible, but will still leave us heading towards increased climate change, and the effect in 2020 will be marginal. There are a couple of studies on the effect of climate change on insurers, e.g. by the Association of British Insurers[1] and by the US GAO[2]. Your question is very underconstrained (What market? What time? How is the 20% reduction achieved, and what is the state of the economy?). But even with a more specific question, I doubt we have reliable prognoses. You can expect rates to develop to match damages (or a bit more, because the insurers need to price in the increased uncertainty), but our estimate of damages has a large uncertainty (that does not even primarily depend on climate). --Stephan Schulz (talk) 07:56, 9 October 2009 (UTC)[reply]
I doubt that even the insurance companies can make that call yet. The problem is that flood insurance rates are not only determined by the number of people who get flooded - they also depend on the number of people who are NOT flooded who decide to buy insurance and pay premiums. So, for example, suppose the news coverage in 2020 were full of "OMFG!!!! Look at teh stooopid peoplez gettin flooded out (lol!) without insurans!" (because that's how the news will read in 2020). If that kind of publicity caused people who live on higher ground to take out insurance in disproportionately large numbers - then the ratio of people who are NOT flooded out to those who are could actually increase - and thereby drive insurance costs down...not up! It's very likely that the law could change in order to help out the insurance companies by requiring flood insurance on all homes (just as we require 3rd party insurance for cars) - or that banks might require it as a condition of getting a mortgage. We don't know (and cannot reasonably guess) what might happen in that regard. However, we might be able to find the number of houses in zones that would be flooded if CO2 levels don't start to level out soon. We could then (presumably) estimate the value of those homes and make some kind of a guess about the total cost of replacing them. SteveBaker (talk) 13:25, 9 October 2009 (UTC)[reply]
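To make the ratio point above concrete with invented numbers (a toy sketch in Python, not actuarial practice): a break-even premium is roughly the expected claims divided by the number of policies sold, so a surge of low-risk policyholders dilutes the cost per policy.

# Toy break-even premium calculation; every number is invented for illustration.
def break_even_premium(flooded_policies, payout_per_flood, total_policies):
    """Premium per policy needed to exactly cover expected flood payouts."""
    return flooded_policies * payout_per_flood / total_policies

payout = 100_000.0  # assumed average payout per flooded home

# Baseline case: 1,000 flooded homes among 50,000 policyholders
print(break_even_premium(1_000, payout, 50_000))    # 2000.0 per policy

# 2020 case: twice the flooding, but publicity quadruples uptake
print(break_even_premium(2_000, payout, 200_000))   # 1000.0 per policy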
The other thing is that insurance companies don't insure against losses that are certain to happen. If you live next to a river that floods every few years - you won't be able to buy flood insurance at any price. As the insurers determine that sea levels are rising - they'll simply cease to offer flood insurance in areas that are 100% certain to be flooded every few years, and take every opportunity to cancel existing policies in those regions. They'll make their money on the boundaries of those "now certain-to-flood" areas, where houses that were not even at risk of floods before suddenly find themselves in the new 100-year flood plain. This might also enable the insurers to keep rates stable, no matter the amount of sea level rise. SteveBaker (talk) 13:57, 9 October 2009 (UTC)[reply]
It would probably be pretty straightforward to do a back-of-the-envelope calculation relating temperature to sea level (though there is a significant lag in the warming of the oceans); some related info and references are found on the Current sea level rise page here. Going from carbon dioxide to temperature to sea level would be less straightforward due to the overprinting of natural variability. For inland areas (rivers and streams), I'd say forget it: there are so many factors that go into weather (as opposed to climate), and drainage-basin-scale weather patterns are what really control flooding in streams, that the natural variability there will outweigh any predictability of change in the hydrologic cycle over a decadal time-scale. Awickert (talk) 14:12, 9 October 2009 (UTC)[reply]
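As an illustration of the kind of temperature-to-sea-level back-of-the-envelope calculation mentioned above, a semi-empirical relation of the form dH/dt = a*(T - T0) (in the spirit of Rahmstorf's 2007 approach) is about as simple as it gets; in the sketch below the sensitivity, baseline, and warming path are assumed round numbers chosen purely for illustration, not a forecast.

# Illustrative semi-empirical sea-level sketch; every figure is an assumed
# round value for demonstration, not a projection.
a = 3.4   # mm of sea-level rise per year per deg C above baseline (assumed)
T0 = 0.0  # baseline temperature anomaly in deg C (assumed)

def warming(year):
    """Assumed warming path: +0.8 deg C in 2010, rising 0.02 deg C per year."""
    return 0.8 + 0.02 * (year - 2010)

H = 0.0   # cumulative rise in mm since 2010
for year in range(2010, 2021):
    H += a * (warming(year) - T0)
    print(year, round(H, 1), "mm above the 2010 level")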
Last week, I attended a lecture by a geophysicist from the US Geological Survey, Ross Stein, whose project was to develop a comprehensive risk-assessment profile based on geophysical data about earthquakes. The challenges of accumulating some very heavy scientific data and putting it into a form that insurance and reinsurance conglomerates can understand are huge. The result is GEM, "a public/private partnership initiated and approved by the Global Science Forum of the Organisation for Economic Co-operation and Development (OECD-GSF). GEM aims to be the uniform, independent standard to calculate and communicate earthquake risk worldwide." Their goal is to produce a free and open-source applet/web tool that will allow you to do exactly what you want - estimate the likelihood of catastrophic damage (from earthquakes), as measured in a variety of different paradigms (dollars, casualty rates, seismic magnitude, etc).
Global warming and global sea-level-change risk falls under the same category - it has huge socioeconomic impact; governments, insurance companies, businesses, and private citizens all have a need to assess the risk; and it is also very hard to quantify the global risk as it relates to a specific region. At the same time, no local region has the resources to coordinate the statistical analysis for this sort of global-scale data. You might be interested in checking out the GEM project website - to see how one gets started accumulating this scale of information. The seed idea is to address a common need, and the execution of the idea is to interact with the large international organizations (such as the UN, several large insurance conglomerates, and government agencies like the USGS) to coordinate a strategy for real, science-based risk assessment. To my knowledge, no such initiative yet exists for global-climate-change/sea-level-change risk assessment - maybe the OP would like to suggest this angle to the GEM project. Nimur (talk) 14:57, 9 October 2009 (UTC)[reply]

"TAIPEI (AFP) – Global warming will cause the amount of heavy rain dumped on Taiwan to triple over the next 20 years, facing the government with the urgent need to beef up flood defences, a scientist warned Tuesday. The projection is based on data showing the incidence of heavy rain has doubled in the past 45 years, coinciding with a global rise in temperatures, said Liu Shaw-chen of Taiwan's leading research institute Academia Sinica. The estimate comes two months after Taiwan was lashed by Typhoon Morakot, the worst to hit the island in half a century, leaving more than 600 deaths in its wake...." -- http://news.yahoo.com/s/afp/taiwanclimatewarmingtyphoon 98.210.193.221 (talk) 18:11, 16 October 2009 (UTC)[reply]

Why preheat the oven?


I don't often use an oven for cooking, but whenever I do, the recipe invariably tells me to preheat it. Why? Surely the sooner the food goes in, the sooner it will be ready to come out. My guess is that we would save a lot of energy if people weren't preheating empty ovens. If preheating is important, maybe the oven article should make some mention of it...--Shantavira|feed me 08:03, 9 October 2009 (UTC)[reply]

Just as a guess: the reason is to keep the energy transfer predictable. Different ovens heat at different rates, and so an oven that heats instantly from 70 degrees to 400 degrees would cook food differently than one that went from 70 to 400 over a period of an hour if you put it in before you started heating the oven. Veinor (talk to me) 08:08, 9 October 2009 (UTC)[reply]
Yes, I also think that's why it's done. Cookbook authors don't want to have to mess around with telling you how to adapt the cook time based on how quickly various ovens heat up. It's easier when creating a recipe to just do a preheat, so the cook time will be the same in all ovens. Red Act (talk) 08:21, 9 October 2009 (UTC)[reply]
While this cookbook author explanation is plausible, also note that cooking is a chemical process. Heating a mix of ingredients for some time at some (more or less) fixed temperature produces a result that might otherwise be impossible to produce by gradually heating, even if the same overall amount of energy is used. You probably also don't want your steak first soaked in cold oil and then gradually heated... - DVdm (talk) 10:36, 9 October 2009 (UTC)[reply]
I'd agree with that: if you start cooking meat at too low a temperature, it will boil in its own juices rather than fry. Also, your food isn't going to be very cooked by the time your oven comes up to temperature, so preheating isn't going to use that much more energy. Smartse (talk) 11:18, 9 October 2009 (UTC)[reply]
The times wouldn't even be constant for a given oven, because you might first cook one thing - then cook something else with the oven already hot. You'd never get consistent results. It's also possible that the cooking process might involve multiple chemical reactions that each happen at different temperatures - so by not pre-heating to the temperature where all of the reactions happen at once, you might have undesirable results as one reaction starts before the others and lasts for a longer time. SteveBaker (talk) 12:38, 9 October 2009 (UTC)[reply]
And come to think about it - the energy savings by not pre-heating might not be that great. Assuming your oven is reasonably well insulated (which they seem to be) - most of the cost is in heating the thing up from cold. Once it's already hot, the heating elements only have to provide enough energy to overcome the losses due to inadequate insulation. If you're really bothered about this - your best strategy is probably to pre-heat per instructions - but shut off the heat a little before the cooking time is over, relying on the insulation to keep the oven hot enough for cooking to continue to completion. SteveBaker (talk) 12:42, 9 October 2009 (UTC)[reply]
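To put very rough numbers on that (all figures below are assumed round values, not measurements), here is a quick sketch comparing the energy needed to bring an oven's thermal mass up to temperature with the energy needed to hold it there for an hour:

# Rough, illustrative oven-energy comparison; all figures are assumed.
c_steel = 500.0    # J/(kg*K), specific heat of steel
mass = 25.0        # kg of metal and lining that gets heated (assumed)
delta_T = 180.0    # K rise from room temperature to ~200 C (assumed)
loss_rate = 500.0  # W leaking through the insulation once hot (assumed)

heatup_energy = c_steel * mass * delta_T   # energy to reach temperature
holding_energy = loss_rate * 3600          # energy to hold it for an hour

print("Heat-up:  %.1f MJ" % (heatup_energy / 1e6))   # ~2.3 MJ
print("Hold 1 h: %.1f MJ" % (holding_energy / 1e6))  # ~1.8 MJ

On numbers like these, the initial heat-up and an hour of standing losses come out in the same ballpark, which fits the point that skipping the preheat saves less than one might hope.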
Some recipes call for pre-heating the oven, some do not. Often these recipes were written decades ago. It's not about energy savings, it's about what trial-and-error has determined to be the best way to cook something. →Baseball Bugs What's up, Doc? carrots 12:51, 9 October 2009 (UTC)[reply]
Indeed, and sometimes very odd- or pointless-sounding things have a huge effect on the overall food chemistry. Most bread recipes call for you to spray water in the oven first, something my wife skipped for a long time because she thought the effect would be minimal. It turns out that having that extra humidity in there really affects the quality of the crust. --Mr.98 (talk) 12:58, 9 October 2009 (UTC)[reply]
The predictability thing is certainly the main reason (same reason recipes call for unsalted butter and a certain amount of salt -- they don't know how salty your butter is). Another possible reason is safety: the "danger zone" for food spoilage is 40-140 degrees F, and putting the food in a hot oven will minimize the time it spends in that zone. --Sean 13:35, 9 October 2009 (UTC)[reply]
Analogous to this is that when cooking something on the stove, sometimes you heat it to a boil first before adding other stuff, other times you bring the whole thing slowly to a boil. →Baseball Bugs What's up, Doc? carrots 13:50, 9 October 2009 (UTC)[reply]
If you are concerned about saving energy: I always put things in the oven before it preheats, and my baked goods turn out fine (so long as I keep an eye on them). The stovetop, not so much - noodles, for example, will get waterlogged if you put them into the water too early, so I generally boil first (unless I'm in a hurry and don't care about the consistency). Awickert (talk) 14:02, 9 October 2009 (UTC)[reply]
Yuck. Dauto (talk) 19:03, 9 October 2009 (UTC)[reply]
Well, you have to understand, I have no feeling in half of my mouth, so it is less of a problem. But yes, I don't do that when cooking for other people. Awickert (talk) 08:37, 11 October 2009 (UTC)[reply]

When I worked as the oven man in a bakery, the oven cycle was so fast that pre-heating became difficult, and I actually found that the ambient air temperature had to be taken into account in setting the cooking time to get reliable results. Pre-heating avoids the need for such calculations. Trevor Loughlin (talk) 14:21, 9 October 2009 (UTC)[reply]

An oven thermometer, or judicious use of the temperature control, or listening for a click when the setpoint is reached, can tell you how fast a given oven heats up. Some dishes are more demanding of a certain high temperature than others. I would expect that cakes and breads are more demanding, and that an initially lower temperature will have more of an effect on them. My oven takes about 6 minutes to get to 400 F. Putting a frozen pizza or some such in before the setpoint is reached saves a couple of minutes compared to waiting for the setpoint to be reached, and if you need to eat and run it makes some sense. Certainly the minutes at 300 degrees will not contribute as much to the browning as the minutes at 400, but a minute at 300 F does contribute somewhat. The total time in the oven needs to be increased because of the lower-than-desired initial temperature, but the food gets to the table faster than if I wait for the oven to completely preheat. If the box says "12 to 15 minutes" I basically use the longer extreme and put the food in at 300 F. It should save energy, because the (gas) oven is not a sealed insulated container: there is a vent through which hot air escapes, and makeup air enters at the bottom. The oven cycles on and off during the baking cycle. For bread, I would wait for the desired temperature. For a roast or a chicken, I will wind up judging doneness by a meat thermometer, so I put it in when I turn the oven on. It's going to be a very long time anyway, and that gets dinner on the table sooner. Edison (talk) 15:39, 9 October 2009 (UTC)[reply]
Yoicks! The reason for preheating the oven is that the results often suck if you don't. Lots of baked things require the outer surface to be heated a lot more than the inside in order to come out right, and that won't happen properly if the oven isn't preheated. For some things, such as a baked potato wrapped in foil, it doesn't matter, but if you try to bake bread in a non-preheated oven, the results will be pathetic. Looie496 (talk) 17:29, 9 October 2009 (UTC)[reply]

I never ever pre-heat the oven. It's a waste of natural gas. Vranak (talk) 20:10, 9 October 2009 (UTC)[reply]

In addition to the culinary points made above, the oven itself may behave differently during preheating: for example, both the top and bottom elements may come on until the temperature difference from the set-point is small. If what you're cooking is only supposed to be heated from below, you don't want that. --Anonymous, 04:47 UTC, 2009-10-10.

This is why I preheat my toaster oven (doubtless wasting untold electrical energy): if I don't, it completely fails to cook anything evenly (whether it's bagels or instant pizzas). 15:04, 10 October 2009 (UTC)

Wow, no one has actually given the main point yet. When active, the burners in an oven are locally much hotter than the set point of the oven. As a result, they generate a lot of excess infrared radiation. This radiation will tend to char the outside surface of your food. For some foods this is fine; for others it will significantly impact the taste and quality of the food. Cooking with infrared radiation is called broiling, and is often preferred for meats, but it is distinct from baking, where the goal is to cook with hot air. By preheating the oven you create the pocket of hot air desired for baking, and then the burners turn themselves down/off. By not placing food in the oven until after the oven's burners are reduced, you minimize the food's exposure to the excess infrared radiation that could char your food. (In some situations you can also do this by covering the food with aluminum foil on all sides.) Preheating really is about controlling the way in which your food is cooked. Dragons flight (talk) 23:44, 10 October 2009 (UTC)[reply]

That makes sense about the evenness-unevenness, but (pardon my ignorance, as an excuse I work entirely in the solid or liquid state) I would assume that the air also heats food by giving off IR radiation, albeit evenly. Is this correct? Or is it more of a molecular collision type heat transfer deal? Awickert (talk) 08:37, 11 October 2009 (UTC)[reply]
At typical baking temperatures, e.g. 350F, the primary mode of heat transfer is still conduction by air molecules colliding with the food. Dragons flight (talk) 19:41, 11 October 2009 (UTC)[reply]
OK, thanks, and I should have seen that - of course there is so little blackbody radiation at ~450 K (350 F) that conduction would dominate. Thanks again, Awickert (talk) 01:38, 12 October 2009 (UTC)[reply]
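For anyone curious about the numbers behind Dragons flight's point about excess infrared from an active burner, a Stefan-Boltzmann comparison gives a feel for it; the element temperature below is an assumed round figure for a glowing element, and emissivities are ignored.

# Stefan-Boltzmann comparison, purely illustrative (emissivity taken as 1).
SIGMA = 5.67e-8    # W/(m^2 K^4)

def radiant_flux(T_kelvin):
    """Blackbody emissive power per unit area."""
    return SIGMA * T_kelvin ** 4

T_walls = 450.0       # oven walls at roughly 350 F, in kelvin
T_element = 1000.0    # glowing element while actively heating (assumed)

print("Walls:   %.1f kW/m^2" % (radiant_flux(T_walls) / 1e3))    # ~2.3
print("Element: %.1f kW/m^2" % (radiant_flux(T_element) / 1e3))  # ~56.7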

Design with dimensions

I completed this animation but the valves are not well shown. I hope this helps. Cuddlyable3 (talk) 13:22, 9 October 2009 (UTC)[reply]

I want to know where I can find the exact design of any engine (especially IC engines) with dimensions and specifications. I want any engine to be part of my mini project, which I will be doing with AutoCAD. To do my project, i.e. the drawing, I need to know the exact dimensions of the valves and other components. Please help! —Preceding unsigned comment added by 122.169.170.8 (talk) 11:37, 9 October 2009 (UTC)[reply]

I was going to try and help you by measuring the pieces (to the right), but I can't get the engine to stop moving. User Grbrumder (talk) 00:19, 10 October 2009 (UTC)[reply]
Not sure this is what you meant, but at least in Firefox, pressing the Esc key will stop the animation. –RHolton 22:26, 13 October 2009 (UTC)[reply]
Esc also stops the animation in Vista Explorer. You have to reload the page to get it moving again. You can look at the individual frames of the animation in a suitable image editor such as GIMP which is free software. Cuddlyable3 (talk) 12:35, 15 October 2009 (UTC)[reply]
Unfortunately, even a single cylinder motorcycle engine, like the one shown here: [3], is of such complexity that you couldn't possibly show all the dimensions on a single diagram (of reasonable size), and still have them be readable. StuRat (talk) 18:35, 13 October 2009 (UTC)[reply]
I want at least the basic dimensions, like the diameter of the piston head and the radius of the crankshaft... the stiffness of the spring in the valve, and the materials used in the construction of the engine.

Shouldn't the list of artificial objects on the moon be edited?


LCROSS's Centaur upper stage is up there now. —Preceding unsigned comment added by Dakiwiboid (talkcontribs) 12:43, 9 October 2009 (UTC)[reply]

So what's stopping you? →Baseball Bugs What's up, Doc? carrots 12:50, 9 October 2009 (UTC)[reply]
Indeed "Wikipedia - the free encyclopedia that anyone can edit.". Given the speed it whacked into that crater, there may not actually be much of it left on the moon! However, I agree that it belongs on that list. SteveBaker (talk) 13:11, 9 October 2009 (UTC)[reply]
It wouldn't be the first lunar object that was intentionally augered in. In fact, all of the ejected LM's would have crashed. "Rest In Pieces". →Baseball Bugs What's up, Doc? carrots 13:19, 9 October 2009 (UTC)[reply]
WP:BOLD Ks0stm (TCG) 15:04, 9 October 2009 (UTC)[reply]

LCROSS moon impact


It has been over 6 hours since the impacts. Earlier, it was predicted that 10-inch telescopes or larger in the Western US should see a plume. Apparently the scientific "shepherding" LCROSS craft following the initial impactor saw no plume. Did Hubble or any observatory see anything? When is NASA expected to release initial analysis of the instrumentation findings from LCROSS? It seems like a replay of the Ranger 7 lunar impact mission from 1964, except that the older mission beamed back clearer pictures. Edison (talk) 17:50, 9 October 2009 (UTC)[reply]

I know this is the Internet age and all, but I'd relax and wait a few days. From the CNN article: "NASA said Friday's rocket and satellite strike on the moon was a success, kicking up enough dust for scientists to determine whether or not there is water on the moon." Comet Tuttle (talk) 18:04, 9 October 2009 (UTC)[reply]
While I trust NASA not to completely lie, it doesn't seem unreasonable to expect them to put a PR spin on the event by calling a disappointing result "a success". I don't know if this is really the case, but it is at least possible. StuRat (talk) 17:48, 13 October 2009 (UTC)[reply]
"I Aim at the Stars... but sometimes I hit the moon."--Mr.98 (talk) 20:36, 9 October 2009 (UTC)[reply]
The plume (which, incidentally, was expected to be made up of about 350 tons of moon-rock!) was supposed to be clearly visible in amateur telescopes down to maybe 10" - so it ought to have been pretty big. If there was anything to be seen, it should have been seen from earth. But the idea of the LCROSS widget was that it would fly through the plume - getting a direct sampling of it...shortly before crashing itself and kicking up another 150 tons of moon. I heard speculation this morning that the crater might actually be a LOT deeper than was originally thought. These tests are really amongst the most important ever done beyond the earth's orbit - the answer to the question of whether there is water on the moon in readily 'gettable' quantities should be molding mankind's entire manned spaceflight future. It's doubly important that it be found in the moon's polar regions, where 24-hour-per-day sunlight would be available for solar panels to turn it into breathable oxygen and hydrogen+oxygen for rocket fuel. It is literally the case that if we find water there - we go back to the moon next - and if not, we abandon it as a largely uninteresting desert and head off to Mars. SteveBaker (talk) 21:32, 9 October 2009 (UTC)[reply]
Thanks Steve, where can I find a reference for that? User Grbrumder (talk) 00:23, 10 October 2009 (UTC)[reply]
Our own article about LCROSS#Mission covers the territory pretty well and has lots of good references. SteveBaker (talk) 01:44, 10 October 2009 (UTC)[reply]
15.5 hours post-impact. Thud. Dead. No data. Edison (talk) 03:22, 10 October 2009 (UTC)[reply]
There's a difference between 'no data because it didn't work' and 'no data released to the public yet because they haven't been fully analysed and only twenty people on the planet would have any use for the raw numbers.' It took 15 years for the meat of the Ardipithecus studies to be published; I doubt if NASA will take quite that long to fulfil its legal obligation of publishing the LCROSS results. In the meantime, try to cultivate some zen-like patience :-) . 87.81.230.195 (talk) 07:40, 10 October 2009 (UTC)[reply]
I realised no one properly answered your question. Various sources have said about 2 weeks or more, e.g. [4] [5]. I agree with you that the visible results were fairly disappointing. And it does seem to me this was not just overenthusiastic media, or people used to movie visual effects, but NASA themselves. They seemed to be hyping the mission and the likely visuals. They seemed to be supporting LCROSS parties, and didn't do anything to discredit the claim that you'd likely see something with a good enough telescope. They released the video showing the plume and didn't say anything about likely not actually seeing any plume. And it seems I'm not the only one who thinks so [6] [7] [8] [9] [10]. I personally only watched NASA TV. There are some images now [11] which do show something; if they'd shown these around the time (perhaps a few minutes afterwards) I think people would have been more satisfied, but that didn't happen. It seems the most 'exciting' thing at the time was the high-five incident. A lot of science is decidedly unimpressive, and the media are guilty of overhyping a lot of stuff, but in these cases it was the scientists - or perhaps more accurately the PR people for the scientists - who were a big part of the problem. (To use a different example: with the LHC, I did watch part of the launch briefly. There was little to see, of course, but that was what I was expecting.) Nil Einne (talk) 06:40, 11 October 2009 (UTC)[reply]

(outdent) From what I hear from the science folks, the lack of a highly visible plume may actually be a good thing: it may mean that the impactor hit in a deep patch of lunar regolith deep in the crater, giving probably the best set of spectra to answer the water question. I'm sure it will be a while before preliminary results are released: I can imagine that deconvolving complex spectra can be difficult especially if you want to be absolutely correct before releasing it to the news media. Awickert (talk) 08:41, 11 October 2009 (UTC)[reply]

I'm also curious as to why the plume is less visible than anticipated. I suspect that this means that the plume did not leave the shadow in the crater and rise into the sunlight. This, in turn, could be caused by the impact being slower than anticipated or at a shallower angle. I assume that NASA is able to control the velocity fairly precisely. Perhaps there was a protrusion which it struck at a shallow angle (maybe the central cone which many meteor craters have ?). Something like this:
          |
          | 
          !/\
__________/  \___________ 
So, does that crater have a central cone, and could the impact have happened there ? StuRat (talk) 17:48, 13 October 2009 (UTC)[reply]

are quantum effects the sole reason why heat capacity goes to zero at 0K?


I get the idea that the number of possible microstates decreases as you go to 0K, which explains why heat capacity decreases, and that quantum effects simply enhance this. But my prof tells me quantum effects are the sole reason. If so, would the increase in microstates with respect to temperature be responsible for the inflection point in the heat capacity versus temperature graph? John Riemann Soong (talk) 20:37, 9 October 2009 (UTC)[reply]

I don't think the concept of counting microstates makes sense outside quantum mechanics -- a classical system does not have a finite number of states for a given energy. And what do you mean by inflection point here? Looie496 (talk) 21:42, 9 October 2009 (UTC)[reply]
There's an inflection point (point at which the derivative is 0) at 0K in the heat capacity as a function of temperature. See the graph at Specific heat capacity#Solid phase. Red Act (talk) 22:26, 9 October 2009 (UTC)[reply]
As a former calculus teacher I have to object that an inflection point is defined as a point where the second derivative changes sign, not a point where the derivative is zero. Basically "inflecting" means changing from upward curvature to downward curvature, or vice versa. You can't have an inflection point at the edge of a function's domain. Looie496 (talk) 03:38, 10 October 2009 (UTC)[reply]
Oops! My bad! Red Act (talk) 04:45, 10 October 2009 (UTC)[reply]
For the theory of the temperature dependence of heat capacity at low temperatures, see Debye model. Red Act (talk) 22:26, 9 October 2009 (UTC)[reply]
I don't understand all the finer details of that article. I know that there is a quantum contribution to the reduction in heat capacity, but what about the microstate-macrostate-entropy contribution? And you can count microstates classically -- via probability and statistical mechanics. John Riemann Soong (talk) 22:45, 9 October 2009 (UTC)[reply]
The quantum contribution is the microstate-macrostate entropy contribution. Assume that heat capacity is positive. Then as temperature decreases, so does internal energy. As internal energy decreases, the energy level spacings in a given quantum system will "look bigger". In the limit that internal energy goes to 0, the energy level spacings will look infinitely large, so there is only one "possible" microstate: every particle in its lowest possible energy level. Thus in the limit that internal energy (and likely temperature) goes to 0 in a quantum system, entropy should also go to 0. In a classical situation, this argument doesn't work because the possible energy levels are distributed in a continuum. Someone42 (talk) 11:28, 10 October 2009 (UTC)[reply]
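A minimal numerical way to see this is the Einstein model of a solid, where the quantised level spacing makes the heat capacity fall to zero as T goes to 0, in contrast to the temperature-independent classical (Dulong-Petit) value of 3R; the Einstein temperature in the sketch below is just a representative round number.

# Einstein-model molar heat capacity versus the classical Dulong-Petit limit.
# Illustrative only; theta_E = 300 K is a representative round number.
import math

R = 8.314  # J/(mol*K), gas constant

def einstein_cv(T, theta_E=300.0):
    """C_V = 3R x^2 e^x / (e^x - 1)^2 with x = theta_E / T."""
    x = theta_E / T
    return 3 * R * x ** 2 * math.exp(x) / (math.exp(x) - 1) ** 2

for T in (1.0, 10.0, 50.0, 100.0, 300.0, 1000.0):
    print("T = %6.1f K   C_V = %7.3f J/(mol K)" % (T, einstein_cv(T)))
# The classical (Dulong-Petit) value would be 3R ~ 24.9 J/(mol K) at every T.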

biosynthesis of creatine


The article creatine shows an R-NH3+ cation attacking an imine C=N center (that's the only way to form the new C-N bond).... but ammonium cation can't possibly have any lone pairs to donate. Am I missing something? John Riemann Soong (talk) 22:59, 9 October 2009 (UTC)[reply]

It's biochemistry, so the answer is almost always "some enzyme":) The article says (and links to) Arginine:glycine amidinotransferase. There, you can see individual steps of the catalytic cycle, including the exact fate of one of the H on the R-NH3+. DMacks (talk) 07:15, 10 October 2009 (UTC)[reply]