Wikipedia:Reference desk/Archives/Science/2014 July 2
Welcome to the Wikipedia Science Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.
July 2
Enthalpy of formation
Is it possible to calculate a vapour-phase heat of formation when IR spectroscopic data is the only experimental data available? Plasmic Physics (talk) 00:23, 2 July 2014 (UTC)
- Maybe, but I don't see how that would be possible. For measuring heat of formation, calorimetry is the normal technique. 24.5.122.13 (talk) 01:11, 2 July 2014 (UTC)
- You mean if the ONLY thing you know is a single IR spectrum? No. But IR does capture TEMPERATURE information - I hope that is clear to you. So, theoretically, there are probably some gas phase reactions which have sufficiently strong and sufficiently sensitive peaks in the IR so that you could determine the initial and final composition and temperatures. Generally, IR is only semi-quantitative, meaning that it's good for composition to 5% (optimistically) and maybe ½% in very good circumstances. This means that IR would be a poor method if you wanted accuracy to 1% or better. (And this ignores the issue, which I am not competent to deal with, of the accuracy of the temperature measurement using IR. I know that IR guns are commonly used to measure temp., and I know they can be calibrated to within a couple of degrees, but I don't know how much more accurate a calibrated lab IR spectrometer can be (since a couple of degrees is probably not good enough).) Could you get "an answer"? Sure. How accurate would it be? Depends, but without a lot of calibrating, I doubt it would be more than plus or minus a factor of 10 of the right Hf... maybe a factor of +/- 5X... Your question basically doesn't give sufficient context for an answer. A lab IR uses IR to excite various modes, while temperature is all about the existing modes (or black-body radiation emission). Meaning if you want to see if something in a room is giving off light, you don't turn on more lights, you turn all the lights off... — Preceding unsigned comment added by 173.189.75.163 (talk) 15:54, 2 July 2014 (UTC)
- What a shame, thanks though. Plasmic Physics (talk) 04:23, 3 July 2014 (UTC)
Will Ebola become a global pandemic?
There are about 600 Ebola cases in West Africa and it seems that the number of cases is doubling every month. At this rate, in about a year there could be a million cases and in two years time there could be more than a billion cases. With an incubation period of up to 21 days, what is going to stop this virus from getting to Asia, the US or Europe? Count Iblis (talk) 18:11, 2 July 2014 (UTC)
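The arithmetic behind the question can be sketched in a few lines, taking the "doubling every month" premise at face value (the replies dispute that real outbreaks sustain constant doubling):

```python
# Naive projection: cases double every month, starting from ~600.
# This takes the question's premise at face value; the replies note
# that real epidemics do not sustain constant doubling.
initial_cases = 600

for months in (12, 24):
    projected = initial_cases * 2 ** months
    print(f"after {months} months: {projected:,} cases")
```

After 12 months this gives about 2.5 million cases and after 24 months about 10 billion, so the "million"/"more than a billion" figures in the question are the right order of magnitude under its own assumption.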
- Our Ebola article states that "The potential for widespread EVD epidemics is considered low due to the high case-fatality rate, the rapidity of demise of patients, and the often remote areas where infections occur". It also says that "Due to lack of proper equipment and hygienic practices, large-scale epidemics occur mostly in poor, isolated areas without modern hospitals or well-educated medical staff". Your 'doubling every month' prediction (which isn't actually borne out by the latest data - see 2014 West Africa Ebola outbreak#Temporal evolution) is predicated on the same lack of appropriate medical care occurring elsewhere. AndyTheGrump (talk) 18:25, 2 July 2014 (UTC)
- As Andy says, Ebola epidemics are usually more or less self-contained because of the rapidity of the death of patients, and because the epidemics tend to occur in isolated areas, where the patients are unlikely to go to the US, Europe, or Asia. If the patients come in contact with visitors from an industrial country who then return home and develop Ebola, they will be treated in isolation units to control the spread. Influenza is far more of a pandemic threat than Ebola or similar hemorrhagic fevers. Get your flu shot this coming fall. (If you are in the Southern Hemisphere and haven't gotten your flu shot yet, it isn't too late.) Robert McClenon (talk) 18:54, 2 July 2014 (UTC)
- Yes. Ebola kills hundreds per year and may expand to thousands. Flu kills hundreds of thousands and has sometimes expanded to millions. Our OP is relying on news to guide action. A disease makes news when it is interesting, thus odd, thus small. Newspeople aren't in the business of guiding action. Jim.henderson (talk) 19:06, 2 July 2014 (UTC)
- Also note that increasing at a rate which, were it to continue, would rapidly infect the entire human population is a feature of the early stages of many diseases or disease strains. That rate of increase inevitably slows down, as it moves out of the most vulnerable populations, survivors develop immunity, and people start taking precautions, like breathing masks, in the case of SARS. Of course, we should avoid getting lazy and just assuming it will slow down, but rather study it and actively find ways to slow it down. StuRat (talk) 19:11, 2 July 2014 (UTC)
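The point that early exponential growth self-limits as the susceptible pool shrinks can be illustrated with a toy logistic model (all numbers are illustrative; this is not a model of the actual outbreak):

```python
# Toy logistic growth: each step's new infections shrink with the
# remaining susceptible fraction. Illustrative numbers, not an Ebola model.
pop = 1_000_000      # assumed closed population
infected = 600.0     # starting cases
r = 1.0              # cases roughly double per step while nearly all are susceptible

for step in range(1, 25):
    infected += r * infected * (1 - infected / pop)
    if step in (4, 8, 12, 24):
        print(f"step {step}: {infected:,.0f} infected")
```

Early steps roughly double; later ones stall as the susceptible pool empties. Real outbreaks also slow for the behavioural reasons listed above (immunity, precautions), usually well before saturation.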
- See Richard II Act II Scene 1:
- John of Gaunt:
- ...His rash fierce blaze of riot cannot last,
- For violent fires soon burn out themselves;
- Small showers last long, but sudden storms are short;
- He tires betimes that spurs too fast betimes;
- With eager feeding food doth choke the feeder:
- Light vanity, insatiate cormorant,
- Consuming means, soon preys upon itself.
- (Reminding all of WP:NOTCRYSTAL) -- the question in the header cannot be answered with references. I will not answer this question, but instead provide references relevant to the topic.
- If OP would like reliable information on the current Ebola situation, he should look at WP:RS such as the USA's CDC page on the topic [1], or the UK's HPA [2]. If you prefer an NGO perspective, check out info from Doctors without borders here [3], [4]
- Since this is "breaking news" in the field of epidemiology, there are very few peer-reviewed papers published on the current outbreak, and reports from those agencies are probably the best sources. If you want to see some of the science of Ebola-specific control and mitigation, here's a paper on an Ebola vaccine and vaccination strategies [5]. For the general topic of control strategies for multiple outbreaks, see this nice paper on epidemic control in a more disease-agnostic sense [6]. SemanticMantis (talk) 19:53, 2 July 2014 (UTC)
- Hygiene, communication and medicine are advanced enough today to contain any epidemic virus that spreads through physical contact and/or exchange of body fluids. Only airborne contagious Vira have a high potential to cause a Global pandemic. Thus not Ebola. --Kharon (talk) 21:24, 2 July 2014 (UTC)
- "Viruses" if you're writing in English, "virus" (plural) if you're writing in Latin. "Vira" is entirely beyond the pale, as there is no Latin declension that has "-us" in the singular and "-a" in the plural. "Virii" (the most common error) has at least the excuse of being plausible. Tevildo (talk) 21:59, 2 July 2014 (UTC)
- Virus isn't originally Latin. It is originally Greek. I don't know Greek plurals. However, Tevildo is right about the Latin. If virus is fully naturalized in Latin, it is fifth declension, and its plural is virus. So if it is fully naturalized in English, and it is, its plural is viruses. Any other plural is just a misguided attempt to use a classical language without understanding the classical language. Fortunately, this has not been litigated, so that no one needs to be warned of discretionary sanctions. But do not edit war over the plural. It is viruses, unless the rest of the article is in Latin. Robert McClenon (talk) 03:22, 3 July 2014 (UTC)
- No and no. Vīrus is not from Greek (how could it be? - classical Greek had lost its w sound before the Roman period). It's a native Latin word. The equivalent Greek word is īos, both the Latin and Greek forms representing regular development from something like *wīsos. And it's not fifth declension. The genitive singular is documented as vīrī, making it second declension. In fact it's an almost unique example of a second declension neuter noun in -us: nominative and accusative both vīrus, dative and ablative apparently not found, and more relevant to the current question, it never occurs in the plural. --rossb (talk) 16:14, 3 July 2014 (UTC)
- Robert may have been led astray by the modern barbarism "virion", which is no better and no worse than "television", when it comes down to it. Did we not at one time have a very long section on the etymology of "virus" in our article, or did I read it elsewhere? Tevildo (talk) 21:56, 3 July 2014 (UTC)
- To answer the original question, it's not as if the locals are doing nothing; in Liberia, the legislature appropriated a huge portion of the national budget to fight it (if I remember rightly), and although the news outlets currently aren't talking as much about Ebola as they were a few months ago, it still sometimes makes the headlines; this story, for example, is on the main page of the Heritage website, while the Liberian Observer hasn't been running any above-the-fold stories about it in the last few days. A big contributor to the spread of such a disease is ignorance — people who are aware of it will be more careful, and that alone will reduce transmission rates, especially with a disease like this that requires some sort of contact with a patient. Nyttend (talk) 02:06, 3 July 2014 (UTC)
Air conditioner settings
I have a window air conditioner that has only two dials. One is a dial that can be turned from max cooling (7) to min cooling (1). The other dial has four settings, labeled "low cool", "high cool", "low fan", and "high fan". What is the difference between "low cool" and "high cool" (the manual is slim and gives no useful information on this)?
- One of my friends argues that the words "low" and "high" in "low cool" and "high cool" refer to the fan, so that compressor usage is the same in both settings, with only fan speed differing. In other words, he argues that the settings mean:
- "low cool" - compressor on; low fan speed
- "high cool" - compressor on; high fan speed
- "low fan" - compressor off; low fan speed
- "high fan" - compressor off; high fan speed
- Another of my friends argues that the words "low" and "high" in "low cool" and "high cool" refer to compressor usage, and that "high cool" uses the compressor more than "low cool" does. In other words, he argues that the settings mean:
- "low cool" - compressor on less; fan off
- "high cool" - compressor on more; fan off
- "low fan" - compressor off; low fan speed
- "high fan" - compressor off; high fan speed
Who is right?
—SeekingAnswers (reply) 21:54, 2 July 2014 (UTC)
- We can't predict the response of an unknown model of air conditioner. You could specify the model, but I don't understand why you can't just listen to what the fan sounds like (surely it is more than loud enough, and even a deaf person would feel it blowing and also feel the vibration of the compressor) Wnt (talk) 22:14, 2 July 2014 (UTC)
- Re: "We can't predict the response of an unknown model of air conditioner." Googling suggests these setting labels are fairly standard for many air conditioners. There would be a standard meaning to them, then, no? And no matter what the setting, there is sound, there is air, and there is vibration, so I can't tell the difference between them that way. —SeekingAnswers (reply) 22:20, 2 July 2014 (UTC)
- (e/c) Top option. The compressor is only 'On' or 'Off'; the fan has 2 speeds in this case. (In my case, it is set so that when the compressor is on, the fan is high, and when the compressor is off, the fan is low; for circulation, air filtering and even distribution of temp/humidity). This link provides some general info: [7] (A web search can easily find plenty of other links). BTW, the fan doesn't use much electricity, especially when compared to the compressor. —71.20.250.51 (talk) 22:26, 2 July 2014 (UTC)
- Agreed on top option. However, on a car, I might suspect that "High cool" would mean recirculate mode, while "Low cool" means fresh air. But, I believe most window A/C units only have recirc mode, as fresh air would require screens/filters, etc. StuRat (talk) 02:23, 3 July 2014 (UTC)
Okay, after changing around the settings and listening carefully as User:Wnt suggested, I agree with User:71.20.250.51 and User:StuRat that top option is correct: when set to "high cool", I can hear the compressor periodically automatically turning off, presumably due to the thermostat detecting that it has reached the desired temperature set by the other dial, at which point it sounds exactly like "high fan". But now I have a new question. When initially turning the air conditioner on, the manual recommends "high cool" to cool down the room to the desired temperature as quickly as possible. My question, however, is about what to do once the room cools down to the desired temperature: given that the compressor automatically turns off when it detects that the desired temperature has been reached, at that point, would it be more energy-efficient / cheaper for my electricity bill to leave the air conditioner on "high cool" or to switch to "low cool"? First of all, everyone agrees that the compressor comes on or off automatically based on the thermostat, and that the compressor costs far more electricity than the fan. However, that still leads my friends to different conclusions:
- Friend #1 argues that once you only want to maintain a temperature rather than cool further, "high cool" is more energy-efficient and cheaper than "low cool", because the fan uses so little electricity in comparison to the compressor that all that matters is how often the compressor comes on. He argues that "high cool", with the fan running at high speed, will do more to circulate the cool air, therefore doing more to maintain the cool temperature, and therefore causing the compressor to automatically turn on less often and thereby saving more electricity.
- Friend #2 argues that once you only want to maintain a temperature rather than cool further, "low cool" is more energy-efficient and cheaper than "high cool", because the fan is running at a lower speed, which obviously consumes less electricity. He rejects friend #1's argument about the fan running at high speed doing anything to maintain a cool temperature to cause the compressor to turn on less often. In other words, friend #2 believes how often the compressor automatically turns on/off is entirely unaffected by fan speed.
- Friend #3 agrees with friend #2 that once you only want to maintain a temperature rather than cool further, "low cool" is more energy-efficient and cheaper than "high cool". However, he differs from friend #2 in also accepting friend #1's argument that the fan running at high speed will cause the compressor to turn on less often -- but he ultimately still agrees with friend #2 that "low cool" is more energy-efficient and cheaper than "high cool" because he believes the effect of greater circulation is very small and he does not believe that the compressor will turn on less often enough to compensate for the higher cost from running the fan at high speed.
Now who is right among these 3 viewpoints?
—SeekingAnswers (reply) 04:07, 3 July 2014 (UTC)
- It depends on the size and layout of your room. In a small square room, with the A/C in the center of one wall, and nothing in the way, the low fan is probably adequate. But in a large, irregular shaped room with obstacles, like a big-screen TV in the center, more fan speed might be required to distribute the "coolth". I've even supplemented the anemic fan in my window A/C unit with a box fan blowing across the vents.
- I'd try "low", and if it isn't getting the job done, then switch to "high". If that still results in a cold spot by the A/C and it being hot in the rest of the room, then you might want to consider my solution. StuRat (talk) 04:29, 3 July 2014 (UTC)
- My experience based on a lot of motels with that sort of air conditioner is that "high" is usually too damn noisy, and that's enough reason to prefer "low" if possible. --50.100.189.160 (talk) 19:47, 3 July 2014 (UTC)
- I'm not asking "which gets the job done" -- both "low cool" and "high cool" successfully maintain the cool temperature of the room -- I'm asking which of them does it with less electricity, as several arguments have been given as to why either might use less electricity. —SeekingAnswers (reply) 02:08, 4 July 2014 (UTC)
- I think this depends on where the thermostat actually is (the part taking the measurement) and if it is drawing in outside air, the temperature of the air. If the air outside is hotter and being drawn in, the more you have the fan on the more you're drawing in the heat from outside. If the air is from inside, the metal is probably still transmitting more heat in from outside when more air is blowing through it but I doubt it is much. Now when the air passes over the compressor, it warms it up, and a warmer compressor should be able to cool by 1 degree using less energy than a cold compressor, per Carnot cycle. OTOH who knows what really is going on inside it? But if the thermostat measures the exact temperature of the compressor gas then the compressor will be the same temperature when on no matter how much air is passing over it; it just will be warmer in the room when the setting is to a lower fan. (I doubt that's the case, but again, who really knows?) That said, the choice of fan setting for many might have more to do with the amount of noise it produces (some of us don't even use those things because the heat is less annoying), or what it feels like to stand directly in front of it, or whatever. Wnt (talk) 04:38, 3 July 2014 (UTC)
- While the compressor is running, its efficiency depends (in part) on how many air molecules contact the heat exchanger (coils w/fins, like a car's radiator) –which, of course, depends on the fan speed. So, while the compressor is running, the fan should be 'high'. I'd suggest finding the manual for your specific thermostat model, which is probably online if you don't have a copy. There might be an automatic mode where the fan continuously runs at low speed and switches to high speed when the compressor comes on. Something like 'Auto low'; ('Auto high' would have the fan always on high). The "best" setting depends on a lot of things. For example, if you live in a very hot location (e.g. Phoenix) and you have an older home with non-insulated ducts in a very hot attic, then having the fan run continuously would not be recommended -at least not during mid-day. However, when the compressor turns off, the heat exchanger is still cold (and probably has some ice on it) -and if the fan shuts off at the same time, then this "thermal mass" is wasted, and literally goes down the drain as the ice melts. This effect only lasts for a few (5~10?) minutes with the fan on, so efficiency also depends on how often the compressor turns on/off. —I hope this helps, ~E:71.20.250.51 (talk) 05:35, 3 July 2014 (UTC)
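The three friends' positions can be framed with a toy duty-cycle model: average power = fan watts + compressor watts × fraction of time the compressor runs. All wattages and duty cycles below are illustrative assumptions, not measurements of any real unit:

```python
# Toy model of the "low cool" vs "high cool" energy debate.
# COMPRESSOR_W, fan wattages and duty cycles are assumed, illustrative numbers.
COMPRESSOR_W = 900               # window-unit compressor draw (assumed)
FAN_LOW_W, FAN_HIGH_W = 40, 70   # fan draw at each speed (assumed)

def avg_watts(duty_cycle, fan_w):
    # Fan runs continuously; compressor runs duty_cycle of the time.
    return fan_w + duty_cycle * COMPRESSOR_W

# Friend #2's premise: fan speed doesn't change the duty cycle -> low cool wins.
print(avg_watts(0.30, FAN_LOW_W), avg_watts(0.30, FAN_HIGH_W))

# Friend #1's premise: better circulation trims the duty cycle on high.
# High cool wins only if the trim exceeds (70 - 40) / 900, about 3.3 points.
print(avg_watts(0.26, FAN_HIGH_W))
```

Under these assumed numbers, "low cool" averages 310 W against 340 W, but a duty-cycle reduction of more than about 3.3 percentage points on high speed (here, 0.30 down to 0.26 gives 304 W) flips the result; whether the circulation effect is that large is exactly the disagreement between friends #1 and #3, and it depends on the room, as noted above.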
Iron as essential nutrient, rust, and water pipes
Two parts to this question:
- If iron is an essential nutrient, does that mean people can just eat things like iron bolts? Yet no one eats iron bolts, and eating pieces of iron instinctively seems to me to be a very bad and unhealthy idea. Similarly, rust is just iron and oxygen, the former of which is an essential nutrient and the latter of which we need to breathe. So, again, wouldn't that mean it would be a good idea to eat rust? But again, no one eats rust, and eating rust instinctively seems to me to be a very bad and unhealthy idea.
- If water pipes are made of iron, doesn't that mean they would rust? And wouldn't that mean our drinking water is full of rust? Wouldn't that be bad for people's health, if indeed the answer to the previous question is that eating rust is a bad/unhealthy idea?
—SeekingAnswers (reply) 22:09, 2 July 2014 (UTC)
- One could get dietary iron by eating iron or rust in bulk. But it will likely damage the person. Consult your doctor about whether you need more iron. Some people need to consume less iron. But you don't need so much that you need a bolt's worth. If you get a breakfast cereal wheat biscuit and run a magnet through it you should be able to extract the iron added by the manufacturer. Too much iron degrades the water quality, leading to bad taste and staining. Graeme Bartlett (talk) 22:59, 2 July 2014 (UTC)
- (1) Generally, eating pieces of metallic iron is a bad idea because of the physical damage they can cause to interior organs. However, some prepared foods contain "reduced iron", which is essentially just very finely powdered iron that dissolves rapidly in stomach acid.
- (2) Yes, steel water pipes DO rust (sometimes pretty rapidly, if used for very hot water), so your drinking water MAY be full of rust. This, however, does NOT impact people's health, because the rust particles are very small and dissolve in the stomach -- but it DOES impact your home repair bill when the pipe eventually rusts right through and floods your whole house with scalding hot water (as happened to me on one occasion). 24.5.122.13 (talk) 23:05, 2 July 2014 (UTC)
- I don't know how common the practice actually was at the time, but at least in the "Law of the Land" episode of Dr. Quinn, Medicine Woman, Dr. Quinn tells someone suffering from iron-deficiency anemia to boil rusty nails in water and then drink the rusty water.[8] Red Act (talk) 00:36, 3 July 2014 (UTC)
- Note that an Iron overload can be quite dangerous, so eating an entire iron bolt could be very bad for that reason, too. Also, an iron bolt probably contains additives that are toxic. And if you've ever tasted rust, you probably spit it right back out. It tastes horrid.
- You might also be interested in pica (disorder)#Causes, a condition in which people eat strange things, like bolts, at times due to a deficiency in some mineral like iron.
- As for pipes rusting, they often accumulate scale (minerals that come out of solution from the water and stick to the pipes), so that can protect the iron from the water and slow down rusting. But, if your water is shut off, when it comes back on you may very well notice a rusty color for a bit. StuRat (talk) 01:09, 3 July 2014 (UTC)
- The Iron poisoning article talks about how much iron you can eat before it has toxic effects. From the numbers that article gives, it sounds like even a rather small iron bolt, finely ground up to aid in digestion and to avoid poking a hole in your throat or something, would be too much to safely eat at once. Monsieur Mangetout ate a lot of bolts during his lifetime, but he was a rather special case who shouldn't be emulated. Please don't eat bolts, kids.
- See also Human iron metabolism. Red Act (talk) 02:24, 3 July 2014 (UTC)
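The "even a small bolt is too much" point can be sketched with back-of-envelope arithmetic. The toxicity threshold (roughly 20 mg of elemental iron per kg of body weight) and the bolt mass are assumed round figures for illustration; see the Iron poisoning article for the actual ranges:

```python
# Back-of-envelope: even a small bolt dwarfs a toxic iron dose.
# The 20 mg/kg threshold and the 10 g bolt are assumed, illustrative figures.
TOXIC_MG_PER_KG = 20        # rough onset of iron toxicity (assumed round figure)
body_kg = 70                # adult body weight
bolt_mg = 10_000            # a small ~10 g steel bolt, mostly iron

toxic_dose_mg = TOXIC_MG_PER_KG * body_kg   # 1,400 mg for a 70 kg adult
print(f"bolt is {bolt_mg / toxic_dose_mg:.1f}x the toxic threshold")
```

Under these assumptions a 10 g bolt carries about seven times the threshold dose in one sitting, which is why ground-up bolts would still be a poisoning risk even without the mechanical damage.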
- As much of the body's iron goes to the blood, blood is a good nutritional source, or for the more Mosaic palate, liver (food) provides a more solid relative. I dare say even in the wild west they must have had liver now and then when they shot something. Wnt (talk) 05:46, 3 July 2014 (UTC)
As strange as it may seem, placing a piece of iron in cooking vessels does actually help fight iron deficiency. See this article about an iron fish used in Cambodia for that purpose [9]. --Xuxl (talk) 08:22, 3 July 2014 (UTC)
- Yeah, cast-iron cookware has been known for some time to contribute nutritionally significant quantities of iron to food, which is why people thought to create the lucky iron fish to provide the same benefit. Red Act (talk) 08:52, 3 July 2014 (UTC)
Why the hell haven't bacteria evolved to eat plastic by now anyway?
The question above about plastic degradation and microplastic water pollution reminds me of an old mystery: why haven't bacteria evolved to eat this stuff? I mean, polyethylene is basically a fatty acid, minus the ends. And living organisms routinely chow down fatty acids two carbons at a time (beta oxidation) for variable distances; there's no special mechanism for just a certain length (well, OK, I'm sure there is if you look hard enough but it's not the main metabolic pathway). Discarded in every possible environment, all the necessary nutrients are close at hand to at least some of the pieces of plastic. I've been expecting to see news reports of bacteria dissolving plastic in landfills for decades now, and it dawns on me suddenly that I'm still waiting. Why? Wnt (talk) 22:40, 2 July 2014 (UTC)
- You may be interested in Nylon-eating bacteria, although it's actually byproducts of nylon manufacture that that bacterium eats. Red Act (talk) 22:50, 2 July 2014 (UTC)
- You are probably more interested in [10]. EllenCT (talk) 22:54, 2 July 2014 (UTC)
- There are plenty of search examples for "plastic eating bacteria", many referencing: Nature —71.20.250.51 (talk) 23:02, 2 July 2014 (UTC)
- That "minus the ends" bit is pretty critical. Polyethylene in the form of plastic isn't water-soluble, and beta oxidation, like almost every chemical reaction, takes place in solution. In order to digest it, a bacterium would first need to evolve a way to dissolve it. --Carnildo (talk) 00:11, 3 July 2014 (UTC)
- Hmmmm... there's some merit to this; although I'd think that in some sense the plastic is already dissolved in plasticizer, I'm getting the impression that truly dissolving polyethylene requires elevated temperatures. [11] Though I know that polyethylene terephthalate (bottle cap) doesn't need to be all that warm to dissolve into olive oil (it's amazing what I've had to resort to during power outages). Bacteria clearly can form biofilm on fat globules and somehow use them for energy; but the devil may indeed be in the details. To begin with I should admit that upon RTFA I note that beta oxidation of lipids over 22 carbons actually has to take place in peroxisomes for some reason (which may be a clue?). But also, there's the question of whether a fatty acid has to get into the cell before beta-oxidation in order to be used for energy. It doesn't seem like that ought to be an inflexible requirement, but it is hard to picture how to separate the cycle from the external environment otherwise. And for all the marvels of bacterial design I have to admit I've never read about one with teeth to bite off and chew little bits of plastic. Wnt (talk) 05:24, 3 July 2014 (UTC)
- Maybe they already have, and we just don't know it yet? There are a lot of landfills, and prospecting is slow and tedious. Granted, this is "just" a science fair project, and I haven't looked yet for follow-up since 2008, but this kid thinks he's found a bacterium (from a landfill) that can speed plastic degradation [12] [13]. Also, recall that it took fungi a long time to figure out how to eat dead wood; that's why we had the Carboniferous period. So like you, I've expected to hear more about this by now, but I also think such a discovery and serious scientific investigation will take place long after bacteria have started munching our plastic bags. SemanticMantis (talk) 16:37, 3 July 2014 (UTC)
- That is indeed heartening to read, and I think that it's great that a teenager saw this and saw fit to carry out the work. I don't think that reducing the weight of the plastic by 43% in three months is the limit, though. Apparently there are Pseudomonas and Sphingomonas involved, but now some pros need to get involved, repeating this in many landfills, tracking down the genes involved, and assembling them all in a few organisms. Or maybe some other kids around the world will just repeat the experiment and swap samples on the internet relying on old-fashioned bacterial conjugation. :) In any case, progress! Wnt (talk) 02:36, 4 July 2014 (UTC)
- There are in fact some fungi capable of decomposing plastic. As far as I know they are the decomposers par excellence in nature. --Kharon (talk) 21:01, 3 July 2014 (UTC)
- Thanks, I hadn't heard of this. Some brief news blurbs here [14] [15]. SemanticMantis (talk) 15:09, 4 July 2014 (UTC)
A related question: Why are there apparently no bacteria that eat amber? A few tree resins still seem to be indigestible to everyone. Icek (talk) 15:48, 5 July 2014 (UTC)