Wikipedia:Reference desk/Archives/Science/2010 January 17
Welcome to the Wikipedia Science Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.
January 17
Any limit to bandwidth?
Could everyone in the world have as much wireless or radio bandwidth as they want, or is there some theoretical limit? And how near are we to it currently in urban areas? 78.147.229.52 (talk) 01:23, 17 January 2010 (UTC)
- The bandwidth of over-the-air broadcasts depends on a lot of things, but in general the carrier frequency has a lot to do with how much information you can fit into a single "channel" or radio frequency. Bandwidth (signal processing) has a technical explanation of how it works. --Jayron32 01:42, 17 January 2010 (UTC)
- The trouble is that more bandwidth generally requires a higher carrier frequency (for practical circuit design reasons). Unfortunately, high frequencies don't transmit as well over long distances and through walls. There's a really awesome band, down in the hundreds of kilohertz to few megahertz, which propagates so well that long wave radio can be transmitted across oceans and continents. But, there is no way to modulate megabits of data on a kilohertz carrier signal. The next window of opportunity is the VHF and UHF spectrum - which had previously been set aside especially for television, because it really goes pretty far (tens or hundreds of miles) and also goes straight through walls (so you can watch TV indoors, which is good for practical reasons). That spectrum has recently been freed up by government/commercial administrative action, and is now being reallocated for mobile telephones and mobile data systems. But that shared channel still only has a few tens or hundreds of megabits of total digital data, which must be shared between everyone on the air. The next higher frequency of practical significance is probably L-band through S-band, which you use for everything from satellite TV/internet to WiFi ("802.11" style wireless data). The satellite spectrum is very carefully allocated, because only a few select frequencies propagate well through the atmosphere to space. At these higher frequencies, channels can be modulated with bandwidths of tens or hundreds of megabits - but the propagation parameters are beginning to get ugly. "802.11" (or rather, 2.4 GHz) doesn't go through walls very well - and it doesn't really travel that nicely through air (not compared to, say, 700 MHz). So, city-wide WWANs will probably never use "802.11"-protocols, even if the power is boosted up really high - because the signal just fades quickly. As we get into the K-band and higher frequencies, it becomes prohibitively expensive to mass-produce circuits - so although the millimeter-wave and microwave bands are very appealing for their transmission and data capacity, they are at least a few decades off from mainstream wireless data systems. Finally, as we start peaking above terahertz radio and into the optical regime, the transmission becomes line-of-sight. So, while optical data transfer is great for bandwidth, it's terrible for wireless - unless you can physically see the laser or "transceiver" with your antenna, you won't be able to get data from it. This means no walls, no trees, no rain ... all these sorts of obstacles which radio frequency wireless systems are fairly immune to. Nimur (talk) 02:59, 17 January 2010 (UTC)
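To put a rough number on the frequency-versus-range trade-off described above, here is a minimal free-space path loss sketch in Python. It models only the ideal spreading loss (a Friis-style formula), so the 1 km distance and the two carrier frequencies are purely illustrative, and real urban or indoor links at 2.4 GHz lose considerably more to walls and multipath:

```python
# A back-of-the-envelope look at why 2.4 GHz "doesn't travel as nicely" as 700 MHz:
# free-space path loss grows with frequency. Distances and frequencies below are
# illustrative only; real links also lose energy to walls, rain and multipath.
import math

def free_space_path_loss_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: 20*log10(4*pi*d*f / c)."""
    c = 3.0e8  # speed of light in m/s
    return 20.0 * math.log10(4.0 * math.pi * distance_m * freq_hz / c)

for freq in (700e6, 2.4e9):
    loss = free_space_path_loss_db(1000.0, freq)   # a 1 km link
    print(f"{freq / 1e6:6.0f} MHz over 1 km: {loss:5.1f} dB")

# 2.4 GHz loses roughly 10.7 dB more than 700 MHz at any given distance
# (20*log10(2400/700)), i.e. over a factor of ten in received power, before
# obstacles are even counted.
```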
How close to saturation in urban areas are we currently? 10%, 20%? 92.29.80.215 (talk) 12:32, 17 January 2010 (UTC)
- It's a matter of bandwidth - but also of range. Everyone can have as many bluetooth connections as they like because the range of bluetooth radio is just a few meters. But if everyone was using the AM radio band and trying to transmit halfway around the world - things would get chaotic very quickly! But in the end, from around 100 kHz up to maybe 30GHz is usable. The total number of bits you can transmit over that waveband is half of the maximum frequency - so at most 15Gbits/second is available. That's quite a lot - but I have 5 computers in my house with 1Gbit/s ethernet strung between them - if I switched to some futuristic radio technology, then I could imagine that I might personally consume a third of the world's radio frequency spectrum! Or, for example, a 3G phone can send about 6Mbits/sec and receive at about 14Mbits/sec - so at 20Mbits per phone, we're only able to support about 1000 phones before we run out of radio bandwidth! Clearly there are a lot more than 1000 3G phones in the world. So certainly if we allowed free use of any frequency range over any distance, the answer to your question would be a very clear "No!!". (This rough arithmetic is sketched in the first code snippet after this post.)
- But it's not that simple. Each cellphone can only transmit and receive over a relatively short range - far enough to get to the nearest cellphone tower - but not much further. So we could actually (theoretically) support 1000 phones within the range of each tower - and those can be as close as 1km apart and allow everyone to use 20Mbits/s of bandwidth at the same time. But I'm sure that if you picked someplace like a city center or a sports stadium - there would be VASTLY more than 1000 cell phones within range of a single tower. Fortunately, not all 1000 phones are in use at the same time - and even when they are, not all of them are downloading movies from the Internet - most of them are sending about 10kbits/s of speech data in a phone call.
- So by taking advantage of the usage patterns and limited range of each kind of device, we can make much better use of the available radio spectrum.
- But still, I don't believe there is enough radio bandwidth to give everyone everything they need with the present technologies that we're using. You could imagine something much better though. Suppose we made teeny-tiny nanotechnology radio transmitters and receivers that were so cheap that we could put them into every surface of every building, every road, every car - and drop them from aircraft in vast quantities over open spaces of all kinds. We could imagine a situation where you were never more than (say) 30 meters from the nearest one. All our radio devices could be designed to have a maximum range of (say) 30 meters. Let's have those nanotech devices talk to each other - so when you send a message, the nearest one figures out the route to the destination of your message - using a chain of thousands or tens of thousands of these gadgets - each one transmitting at such low power that it can only reach the next one in the chain, typically about 10 to 20 meters away. This is like the present cellphone system - except instead of the towers being 1km to 30km apart - they are 1m to 30m apart. With such short ranges, you'd have only at most a dozen or so people within range of each device. Now, each person could use up to maybe 1/10th of the entire radio spectrum - more if not everyone is using it all the time or we aren't all using it at full bandwidth all the time. We could have a personal 1.5 Gbit/sec always-on connection.
- We could go one step further and do what the "One Laptop Per Child" system does. In that system, you don't send messages from one unit to "the network" directly - you have every unit acting as a "cell tower" - so my laptop talks to the WiFi system - but your laptop is out of range of the WiFi system - so it talks to my laptop and my laptop passes the message along to the WiFi. A network of these laptops can cover a large area - so long as there is always a chain of them reaching back to the Internet.
- If we made cellphones do that then the system would be magically self-scaling. Each phone would have to try transmitting at very low power to see if there is another phone that's in contact with "the network" and very close by. If not, it would gradually ramp up its transmit power until it could find either another connected phone - or a cell tower. (A toy version of this ramp-up loop is sketched in the second code snippet after this post.) So long as all phones have plenty of bandwidth - not only for their own user - but also for passing on messages from other phones - then radio ranges could be kept short. This would save battery power too. We could imagine our sports stadium with half a dozen wired super-high bandwidth landlines - and 50,000 people all using a ton of bandwidth at the same time - but each one is transmitting only to the cellphone of the guy in the seat next to them...who transmits the data on to the next guy and so on. The amount of simultaneous bandwidth would be phenomenal because it's very efficiently shared. On the other hand, if you were out in the desert, 20 miles from civilisation, your cellphone would gradually increase its power until it could reach the nearest cell tower. By transmitting with that much power, it's monopolising the bandwidth over hundreds of square miles - but that's OK because nobody else is using it.
- But even that isn't really the upper limit. We're still talking about systems that broadcast their radio in all directions. You could imagine a system where the phone can send highly directional radio beams and aim them at the nearest contact point. That would open up the number of phones that could share the same radio frequencies tremendously.
- There are some severe technological issues with what I've described. The latency in transmitting through all of those intermediate devices would be huge (although you could still have less closely spaced gizmos connected with optical fibre or something to handle the longer distance traffic). The power requirements would be tough to handle - you couldn't have all of these many devices be battery operated - who would change the batteries? They'd need to be solar powered or something.
- But I think that with enough clever technology, we could come close to giving everyone up to maybe 5Gbits/sec of wireless bandwidth. More than that starts to become very difficult.
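Taking the figures in the post above at face value, its arithmetic can be reproduced in a few lines. This is a sketch of the post's own simplification only: the ~15 Gbit/s "total usable spectrum" number ignores modulation and signal-to-noise ratio, which the Shannon–Hartley reply further down the thread addresses:

```python
# Reproducing the rough numbers from the post above (its own simplification:
# ~15 Gbit/s of total usable radio spectrum, ~20 Mbit/s per 3G phone).
TOTAL_SPECTRUM_BPS = 15e9            # the post's estimate for ~100 kHz .. 30 GHz
PHONE_BPS = 6e6 + 14e6               # ~6 Mbit/s up + ~14 Mbit/s down per phone

phones_sharing_everything = TOTAL_SPECTRUM_BPS / PHONE_BPS
print(phones_sharing_everything)     # ~750 phones -- "about 1000" in the post

# The same budget is reusable in every cell, which is why limiting transmit
# range (smaller cells) multiplies the number of users that can be served.
```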
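And here is a toy sketch of the "ramp up transmit power until something connected answers" idea from the same post. Every name in it (probe_at_power, the dBm limits and step size) is invented purely for illustration and does not correspond to any real phone or radio API:

```python
# Toy version of the self-scaling idea: start quiet, shout a little louder each
# step, and stop at the lowest power at which some already-connected relay
# (another phone or a tower) answers. All helper names here are hypothetical.
MIN_DBM, MAX_DBM, STEP_DBM = -10, 23, 3

def find_uplink(probe_at_power):
    """probe_at_power(dbm) should return a relay id if anything connected replies, else None."""
    power_dbm = MIN_DBM
    while power_dbm <= MAX_DBM:
        relay = probe_at_power(power_dbm)
        if relay is not None:
            return power_dbm, relay   # lowest power that reaches the mesh
        power_dbm += STEP_DBM         # nobody heard us; try a bit louder
    return None                       # out of range even at full power (the desert case)
```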
- The total bandwidth of a mesh network is limited by the connection from the mesh to the internet. If you have a long chain of users, with just one connected to the internet, then they are still sharing one connection between everyone (this isn't usually an issue because most people won't be using lots of bandwidth at the same time, but that same thing means that sharing the radio spectrum isn't a problem yet). Shortening the range of radio transmissions is a good idea, but you need to shorten the total distance of the radio transmission, not the length of each step. Those mini-cell towers every 30m would each need some other method of connecting to the internet (that might be where directional antennas come in - getting a phone to transmit in the right direction is a challenge because it is always moving, getting a mini-cell tower to do so would be easy). --Tango (talk) 18:44, 17 January 2010 (UTC)
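A minimal sketch of that backhaul bottleneck, using made-up numbers: however short the individual radio hops are, everyone behind a single uplink still shares that uplink:

```python
# Everyone behind one backhaul link shares it, no matter how short each hop is.
def per_user_rate_bps(backhaul_bps: float, n_users: int, demand_bps: float) -> float:
    """Each user gets at most an equal share of the backhaul, capped by their own demand."""
    return min(demand_bps, backhaul_bps / n_users)

print(per_user_rate_bps(100e6, 50, 20e6) / 1e6)  # 50 busy users on a 100 Mbit/s uplink: 2 Mbit/s each
print(per_user_rate_bps(100e6, 3, 20e6) / 1e6)   # lightly loaded: each gets the full 20 Mbit/s
```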
- Steve's suggestion sounds great (I'm a huge fan of wireless peer to peer routing schemes). The immense benefits that a short-range mesh network may yield are awesome for per-transaction transmit power. But Steve forgets to mention the dark and dirty secret about mesh networks - the aggregate overhead to power and bandwidth that is incurred, because every device must operate in an "always-on", always transceiving mode! The total power consumed may be much higher - e.g. much worse for battery life - because the device is constantly relaying other people's data. Whether this actually reduces the total power consumed by the entire network (e.g. whether the integral of the distributed, but constant, low-power draw is less than the integral of the short, high-power transmission for a centralized, point-to-point scheme), is an active area of research. It seems to be highly dependent on usage-case. A lot of commercial vendors are now producing microcell and "picocell" mobile telephony transceivers, stepping towards a more mesh-network style arrangement. This is especially useful in densely populated locales, like inside a shopping mall or one seating area of a sports stadium. Finally, femtocells (you got to hand it to these marketing guys' creative capability...) allow a really mesh-style, single-room connectivity. I think in general we are moving toward an Optical carrier fibre ("wired") back-end with localized point-size wireless micro-transmitters, rather than a true, completely wireless mesh-style peer-to-peer scheme. But, as more devices become wireless-data-enabled, Steve's description of a true mesh may become more economic (both in terms of infrastructure cost and operational bandwidth/power/battery-life considerations). In the meantime, hobbyists, survivalists, military users, and HAMs make use of technology like AX.25, military Command, control, and communications radio-networks, and HAM packet radio. These are true wireless mesh schemes with no reliance on a wired backbone. Some of these networks are even connected to the internet and can serve as an underlying transport for TCP/IP networking as if you were on an 802.11 system. I have "heard anecdotally" that EA-6B Prowler fighter jets overflying ground troops in Afghanistan now broadcast a point-to-point VHF signal to the ground; they dynamically locate the troops' radio transceivers and mesh up a wireless network for the few seconds that they are within visual range, and perform bidirectional data dumps to the base stations using aviation bands, when the troops are out of range of their ground-based VHF and UHF systems. Nimur (talk) 20:21, 17 January 2010 (UTC)
- Indeed, mesh is fantastic for increasing coverage (in a warzone, in the poverty stricken areas targeted by OLPC, etc.), but it does nothing to increase bandwidth (it might allow for more efficient usage of bandwidth in some circumstances, I suppose), which is the topic of this thread. --Tango (talk) 20:41, 17 January 2010 (UTC)
- Right. The current status-quo is that the mesh network is a sub-network, and it connects to the internet. In the limiting case, the entire internet is a mesh network, and there is no point bottleneck for the bandwidth. It seems unlikely that we are evolving in this direction, as most of the content that people now access on the internet comes from a centralized server, with a disproportionate amount of bandwidth between "everybody" and one or two companies (Google/YouTube, for example). Theoretically, an optimal use of bandwidth would require everybody to have a random probability of accessing content from anybody else - but this is not really economically effective for the internet. Conglomerates operate central servers, and most people downlink data from those, in a mostly "uni-directional" mode. Even accounting for user-generated content, the total quantity of data transferred is usually very disproportionate and centralized when it is hosted on third-party servers. Nimur (talk) 22:16, 17 January 2010 (UTC)
- Speaking of WiFi, it appears that some people may be "allergic" to it. [1] ~AH1(TCU) 23:38, 17 January 2010 (UTC)
- As far as I know, there is no reliable evidence of any such sensitivities. --Tango (talk) 00:11, 18 January 2010 (UTC)
- Electrosensitivity? ~AH1(TCU) 00:33, 18 January 2010 (UTC)
- What's the question mark for? --Tango (talk) 01:03, 18 January 2010 (UTC)
- As our article says, "The majority of provocation trials to date have found that self-described sufferers of electromagnetic hypersensitivity are unable to distinguish between exposure to real and sham electromagnetic fields". Was there a particular reason to bring it up here? 86.178.229.168 (talk) 01:50, 18 January 2010 (UTC)
Going back to the original question, let's talk a bit more about the theoretical limits to the information bandwidth (i.e., the number of bits of information that can be communicated per second). As already mentioned, the limit depends upon the channel bandwidth (measured in Hertz) that is available, but it also depends upon the channel signal to noise ratio. For instance, for point-to-point communication in an additive white Gaussian noise (AWGN) channel, such as wired communication or short-range/open field wireless communication, this limit, known as the channel capacity C, is given by the famous Shannon–Hartley theorem:
C = B log2(1 + S/N), where B is the bandwidth (in Hz), S is the signal power (i.e., how much power in watts you have available to transmit the signal) and N is the noise power in the channel. From this formula you can see that one can (theoretically) transmit an unbounded amount of information over even a narrow channel, as long as one is ready to use a more powerful transmitter (a short numeric example follows this post).
The analysis becomes much more complex when one considers wireless channels in urban environments (for example due to multipath fading), and multiuser systems (in which case one needs to account for signal interference, spatial diversity etc, and differentiate between overall throughput, average throughput etc), but the general intuition remains the same - in theory, one can arbitrarily increase the information rate, by either using wider bandwidth channels, or by throwing more power in (although the marginal returns of the latter keep getting smaller). Any standard textbook on wireless communications should be able to provide you more information on the basics of this topic, although the general multiuser information theory is still an open problem, and an active area of research. Abecedare (talk) 01:55, 18 January 2010 (UTC)
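To make those diminishing returns concrete, here is a short numeric illustration of the Shannon–Hartley formula; the 20 MHz channel width and 20 dB signal-to-noise ratio are arbitrary example values:

```python
# Shannon-Hartley capacity C = B * log2(1 + S/N): doubling bandwidth doubles
# capacity, while doubling transmit power only adds a little under one bit per
# symbol. The channel width and SNR below are arbitrary example values.
import math

def capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
    return bandwidth_hz * math.log2(1.0 + snr_linear)

b_hz, snr = 20e6, 100.0                           # a 20 MHz channel at 20 dB SNR
print(capacity_bps(b_hz, snr) / 1e6)              # ~133 Mbit/s
print(capacity_bps(2 * b_hz, snr) / 1e6)          # double the bandwidth: ~266 Mbit/s
print(capacity_bps(b_hz, 2 * snr) / 1e6)          # double the power instead: only ~153 Mbit/s
```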
Could a religious person please specify the maximum available bandwidth of the prayer channel to God? Cuddlyable3 (talk) 15:25, 18 January 2010 (UTC)
That limit is precisely what current technology can do. If we knew a precise limit then we'd also have the first step towards surpassing it. Vranak (talk) 20:08, 21 January 2010 (UTC)
Red dwarfs proportion to all stars
According to the best current estimates (the current consensus), what percentage (range of percentage) of the stars in the universe are red dwarfs? —Preceding unsigned comment added by 63.17.79.94 (talk) 11:58, 17 January 2010 (UTC)
Old school cathode ray television superior?
I have a monstrous standard definition Philips 32PW9586/05 widescreen CRT television. Is it just my imagination or sour grapes at not having a modern flat HDTV plasma or LCD television, or is there something much more film-like about "old fashioned" sets? The plasma and LCD televisions I have seen in shops seem either dull or cartoon-like in comparison, despite the much-vaunted extra detail. Does anyone make High Definition CRT televisions? Perhaps it's merely sentimentality over an "antique"? Trevor Loughlin (talk) 13:16, 17 January 2010 (UTC)
- I also prefer CRTs. My desktop uses dual CRT monitors. They have traditionally provided a higher color depth. For example, some LCDs have trouble displaying pure black. They also have a faster response time, making motion look more lifelike. They also have a wider field of view. However, the latest high-end LCD screens have almost caught up to CRTs in color depth and response rates. At least, our eyes can barely tell the difference. But cheaper LCDs are certainly still far inferior to cheaper (and higher end) CRTs. Further, LCDs still lag significantly in viewing angle. You will notice a difference at an angle. Some LCD screens are unviewable at even slight angles, whereas others are wider. If you want an LCD TV that looks like your old CRT, you will have to pay much more than what you paid for the CRT. Unfortunately, I know of no high-end CRTs that are being manufactured. People like LCDs because they are more stylish, easier to carry, brighter, and consume less energy. I can understand why you'd be concerned about weight if the screen was for a laptop, but you only have to move a TV or monitor once. So, I think LCDs are over-rated.--Drknkn (talk) 13:39, 17 January 2010 (UTC)
- I also view a 32-inch widescreen CRT TV (Panasonic) and relative to a plasma or LCD replacement I judge that I have better brightness and contrast and much better sound thanks to the monstrous cabinet size, which hardly matters when the TV stands in the corner of the room. However my set cannot match the "perfect" picture linearity, colour convergence and low power consumption of flat panel displays. The latest pricey Organic LED panel TVs may give the best displays yet, and they are expected to become available in larger sizes than are practical for CRTs. Cuddlyable3 (talk) 20:03, 17 January 2010 (UTC)
- One feature of CRTs that I am still waiting for in LCD/Plasma/OLED/DLP is a true black. A CRT could drive the electron beam to essentially zero amplitude, so if I had a little text on a black background, the tube would not light up a dark room. An LCD has a backlight, so the best it can do is dim the screen to a "gray" or "purple" color. All the progress in contrast ratios for LEDs and their ilk seems to be about making bright whites even brighter - but I'd really like to see a darker black the way a CRT can operate. Nimur (talk) 22:21, 17 January 2010 (UTC)
- I have checked this only on older CRT TVs: there is long-term faint persistence of the image that is visible in a dark room to dark-accustomed eyes. Cuddlyable3 (talk) 12:52, 20 January 2010 (UTC)
- Well, as our article mentions, OLED TVs don't use a backlight, so they do (or should) achieve a good black. I haven't actually tested it myself, but you can always buy a Zune HD or something if you want to see, or one of the displays linked below. If it doesn't do it for you, feel free to send it to me so I can see for myself... Nil Einne (talk) 10:07, 18 January 2010 (UTC)
- What? The largest commercial OLED TV (or OLED display of any kind) I'm aware of is the LG 15-inch [2]. Indeed the 11 inch Sony XEL-1 is one of the only other relatively commercial OLED TVs (or OLED displays) I'm aware of. Sure, Samsung have demonstrated a 40 inch OLED TV and other companies have demonstrated other large OLED displays, and sure, people keep promising large OLED displays and perhaps we'll actually get them this year, but I wouldn't exactly consider them 'available' when they've just been shown off at tech shows but not sold in the marketplace. Nil Einne (talk) 10:05, 18 January 2010 (UTC)
- Philips sell a 40", LG sell 42" and 47", Samsung sell 40" and 46" LED-backlit LCD TVs. Corrected. Cuddlyable3 (talk) 12:48, 20 January 2010 (UTC)
- Yes, but they aren't OLED, which is what you mentioned/linked to. And an LCD TV with an LED backlight is quite different tech from an OLED TV, so it doesn't make much sense to discuss them as the same thing even if some vendors call them LED TVs. Personally, I admit I find OLED, or really any true LED display (i.e. displays which use LEDs for the display itself rather than just the backlighting), much more interesting than the so-called other LED displays, although I won't likely be getting either anytime soon.
- And in case you thought those were OLED TVs, well, you weren't the only one. I only realised last month that they didn't exist in large sizes, despite all the hype and confusing marketing, when taking part in some silly Christmas wish list competition (which I suspect they didn't really care about the list, but had it so it wouldn't be considered gambling). I decided to put a bunch of crazy stuff on it, including of course a Bugatti Veyron and a large OLED TV. I intended to put an expensive one (brand and model number) until I realised they didn't exist, so I ended up just putting a large size (52" I think) and mentioning they didn't exist yet.
- Nil Einne (talk) 16:10, 18 January 2010 (UTC)
- Right - I was much disappointed with all the hype. In the US, it is legal to advertise such devices as "LED TVs" even though they are actually LED backlights for regular LCD screens. I believe that such advertising tactics are not permitted in the UK, which more strictly regulates the terminology that vendors can use. LED-backlit LCD television is the relevant article for more details. Nimur (talk) 18:44, 18 January 2010 (UTC)
- Hmm, our article says "In terms of the use of the term 'LED TV' in the UK, the ASA (Advertising Standards Authority) has made it clear in prior correspondence that it does not object to the use of the term, but does require it to be clarified in any advertising", in other words you can call them LED TVs provided it's clear you're referring to an LCD TV with LED backlighting or edgelighting. It would seem Samsung was found to have misleading advertising [3] [4]. I've seen some ads before in NZ, but never really looked carefully; I would presume they've now adopted the same policy since our advertising standards tend to be fairly similar. The ruling doesn't of course specifically say what sort of clarification is required and how it must be shown. I would guess there are quite a few case histories, however, not specifically referring to TVs but to other stuff. For example, while using some form of small print may be okay, I doubt they could get away with putting up a large ad and then clarifying in a font which is unreadable from a large distance. This ad, which looks like it's from after the ruling, is I guess an example of what Samsung thinks complies [5]. Nil Einne (talk) 08:41, 2 February 2010 (UTC)
Polarization of radio waves for extra signal capacity?
Light can be polarized, and it is possible to imagine a line-of-sight communication device using a laser that exploits not only bandwidth but also modulates polarization to increase the data-carrying rate (has this ever been done? - I just thought it up), but could this also be applied in more crowded areas of the EM spectrum, such as the microwave or even radio bands? —Preceding unsigned comment added by Trevor Loughlin (talk • contribs) 13:32, 17 January 2010 (UTC)
- Have a look at this: [6]. --VanBurenen (talk) 13:40, 17 January 2010 (UTC)
- Do you mean that link to 13 million search hits for "Sound focusing"? Cuddlyable3 (talk) 19:02, 17 January 2010 (UTC)
- Well, the first few might be useful already. --VanBurenen (talk) 16:10, 18 January 2010 (UTC)
- It's possible in principle - but when a radio or light signal is reflected off of something, that messes up the polarization - so there would be quite a lot of cross-talk between channels. Certainly with a radio antenna, lining it up with the plane of polarization of the transmitter greatly enhances reception. SteveBaker (talk) 14:17, 17 January 2010 (UTC)
- Use of different polarizations (horizontal and vertical) increases the isolation between TV transmissions, which allows more efficient sharing of channel frequencies since less distance is needed between co-channel transmitters. When TV broadcasting in Britain was at VHF one could see Yagi receiver antennas on many house roofs, mounted according to whichever was the local polarization. Cuddlyable3 (talk) 19:44, 17 January 2010 (UTC)
- This is certainly used on satellite TV and transponders to increase the number of channels available. In this application there are no obstacles affecting the signal from transmitter to receiving dish, so no mix up occurs. In MIMO this could well be used to double transmission throughput. The MIMO system does not care what the polarisation is, only that there is a difference in the signal between the two pairs of antennas. Graeme Bartlett (talk) 21:54, 17 January 2010 (UTC)
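As a rough illustration of why two orthogonal polarizations help, here is an idealised Shannon-capacity comparison. The 36 MHz transponder bandwidth and ~17 dB signal-to-noise ratio are example values, and a real link would lose part of the gain to cross-polar leakage:

```python
# Idealised comparison: splitting power across two orthogonal polarizations
# (two independent channels in the same band) vs. putting all power on one.
import math

def capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
    return bandwidth_hz * math.log2(1.0 + snr_linear)

b_hz, snr = 36e6, 50.0                       # e.g. a 36 MHz transponder, ~17 dB SNR per polarization
single_pol = capacity_bps(b_hz, 2 * snr)     # all power on one polarization
dual_pol = 2 * capacity_bps(b_hz, snr)       # power split across the two polarizations
print(single_pol / 1e6, dual_pol / 1e6)      # ~240 vs ~408 Mbit/s: close to double the throughput
```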
- As an aside, I'll note that in Carl Sagan's novel Contact part of the information in the alien message was carried through polarization modulation of the radio signal. (I think this was mentioned in the film as well, but I'm not certain.) TenOfAllTrades(talk) 22:18, 17 January 2010 (UTC)
How warm should the water used with yeast in bread machines be?
I use a bread machine; it takes nearly four hours to make a loaf. This includes waiting for the dough to rise. Unfortunately the end result is not as fluffy as shop-bought bread. (Yes, I do use strong flour). What is the best temperature for the water I put in the machine? If the water is too cool, the dried Baker's yeast will be underactive. If it is too hot, it will be detrimental to the yeast also. What temperature would the yeast like best? And is there any evidence that it would be worth adding some vinegar to the water - would the yeast prefer it to be slightly acidic? 92.29.80.215 (talk) 14:46, 17 January 2010 (UTC)
- Around 35 to 40 degrees Celsius will do fine. I don't know where you live, but typically bread from a bread machine will be of the more chewy type, not like US toast. You can try a variety of recipes, though. --Stephan Schulz (talk) 15:30, 17 January 2010 (UTC)
- (EC with below) I personally don't believe the water temperature matters too much provided you don't let it be too hot (i.e. above 40 degrees C). AFAIK, most bread makers are designed to heat up the mixture during the rise cycle to an appropriate temperature so at best it's only going to make a minor improvement. Also while I don't use a bread maker much, I've had some experience with bread maker loaves and buns that are fairly fluffy. I agree with Stephan Schulz that you should try a variety of recipes and vary your recipes. Also make sure you are using new, high quality yeast and following the recipe instructions, particularly about the order, and measuring your ingredients properly (perhaps using weight where possible to avoid issues like the flour settling). If your yeast doesn't already include it, make sure you use bread improver. In fact even if it does you may want to try adding more too and perhaps some more gluten. If you still can't achieve something you like, perhaps try using the bread maker for kneading and rising the dough (most should have some sort of dough function) and then shaping buns or using a loaf tin and allowing the bread to rise again before cooking in an oven (presuming you have one). It will take longer and obviously be more effort but may achieve what you want and will still be easier than e.g. kneading by hand. In between perhaps you could try doing a double rise in the bread maker, i.e. put it on the dough setting and then when it's done put it on the normal bread setting so it'll knead and then go through a rise cycle again before cooking (I don't actually know if this will work well, this is not something I've heard of from anywhere). P.S. You may also want to consider stuff like a brioche loaf or other things which, even if not something you'd expect from a store, may be an interesting and pleasant variety. Nil Einne (talk) 16:13, 17 January 2010 (UTC)
- If the temperature is truly unimportant then I would use water straight from the tap, which at this time of year is rather cold. 92.29.80.215 (talk) 16:57, 17 January 2010 (UTC)
- The reason shop-bought bread is so fluffy is it's steam-baked. You can approximate this by putting a dish of boiling water at the bottom of the oven you're baking the bread in. However, obviously you can't do this if you have a breadmaker. It's the reason I gave up making bread in a breadmaker - I really don't like the texture of the stuff that comes out. --TammyMoet (talk) 15:58, 17 January 2010 (UTC)
- It's not clear to me what "shop-bought bread" is supposed to mean here -- a Parisian baguette, or Wonder Bread? If you're talking about high-quality bread, there is more to it than yeast and steam. To get really high-quality bread, you need the right flour, and you need to start with a sponge and let the bread rise slowly for hours, rather than a simple yeast rise. Looie496 (talk) 17:00, 17 January 2010 (UTC)
- I've used it to mean both. The Mrs Beeton recipe for Parisian baguette I use entails the dish of boiling water, and I toured a RHM factory some years ago in which the process was explained in detail. --TammyMoet (talk) 19:10, 17 January 2010 (UTC)
Any shop-bought bread I can remember buying. I'm now eating some bread I've (er, I mean the machine) just made. The reason why it is not so fluffy may be that the inside is too damp or somewhat soggy. Perhaps I should use less water or leave it in the machine for the further hour of warming that it does automatically. What do you mean by 'sponge' please? 92.29.80.215 (talk) 17:22, 17 January 2010 (UTC)
- The sponge method of making bread involves mixing a small amount of yeast (perhaps 1/4 teaspoon) with flour and water, and giving it about 24 hours to form a sponge. The sponge is then mixed with the rest of the ingredients, and allowed to rise. This will take longer than the method often used by home cooks (starting with a large amount of yeast). One sees claims that this method gives better flavor, but I only tried it once and didn't notice a difference. --Jc3s5h (talk) 19:28, 17 January 2010 (UTC)
You are fortunate in the USA not to have the Chorleywood Bread Process that is almost universal in UK shop bread, and gives it all the same "pasty" taste, whatever the ingredients! Dbfirs 21:03, 17 January 2010 (UTC)
Just feel the tapwater with your hand and when it seems like the right temperature for the yeast, you're good to go. Use your intuition, not a thermometer. You don't want to become a slave to gadgets. Vranak (talk) 20:07, 21 January 2010 (UTC)
Effects on mood: a) stuffy atmosphere, b) gloom or time of day
Due to the unusually cold snowy weather here recently I had all the windows in my house closed for many days, I stayed indoors nearly all the time, and it felt rather stuffy. After opening a window I soon felt a lift in mood. Could this be any more than just imaginary, such as the effects of negative ions? (Wikipedia does not seem to have an article on ionisation machines that were supposed to be good for you by increasing the number of negative (or was it positive?) ions in the air, like waterfalls). I understand that the sensation of stuffiness has nothing to do with the concentrations of oxygen or CO2, which do not change appreciably. Sunsets and twilights are commonly regarded as being mournful. People tend to dwell on the past during the evening or when waking up in the middle of the night. Dull days are dull, sunny days are joyful. Is there any physiological explanation for any of this? 92.29.80.215 (talk) 17:12, 17 January 2010 (UTC)
- The effects of light on mood are very well known; our circadian rhythm article talks about them a bit. But regarding stuffiness, it's possible that a change in humidity, or simply a drop in temperature, has something to do with it. Looie496 (talk) 17:37, 17 January 2010 (UTC)
- I'm not sure why you think the oxygen concentration would not change. If the house was airtight you would soon notice a change in oxygen levels. Could oxygen starvation be relevant?--Shantavira|feed me 18:09, 17 January 2010 (UTC)
- Well there are some problems with a closed system like a house: everything in it leeches molecules that aren't necessarily bad for you but certainly aren't as good as outdoor air. Having just a little airflow can make a big difference over hours and days. Moreover your body is constantly shedding skin cells, aka dust, so even if you lived in a stone and wood building with no outgassing there would still be some benefit from having a little draught of cool air pass through occasionally, just to carry out a few specks of dust and bring oxygen levels back up to snuff. That said if you feel cold, the windows are open too wide! Vranak (talk) 18:20, 17 January 2010 (UTC)
- Pedantic correction: from context, you probably meant leaks, exudes or something similar - leeches or leaches would imply the opposite :-). 87.81.230.195 (talk) 19:03, 18 January 2010 (UTC)
- I see your point but you can use the verb 'leech' to describe outward flow. For instance, "The doctor leeched 3 quarts of blood from me with his Madagascar blacks." Vranak (talk) 20:04, 21 January 2010 (UTC)
I understand that there is still some considerable airflow in houses even with all doors and windows shut. My house certainly has draughts. I read the circadian rhythm article and while it briefly mentioned bipolar disorder it did not say anything about mood for non-bipolar people. 92.29.80.215 (talk) 18:30, 17 January 2010 (UTC)
- Seasonal affective disorder may be of interest. ~AH1(TCU) 20:06, 17 January 2010 (UTC)
- IMHO, you should give yourself credit for having part of the answer correct at the time you asked the question!
- For Christmas, I received one of those little books of factoids (not exactly a glowing reference, I know), one of which is:
- If you were sealed up in an airtight space, you would die of carbon dioxide poisoning long before you would suffer the effects of oxygen deprivation.
- This makes it entirely plausible that replacing the "stuffy" air is far more important than bringing oxygen levels up to snuff.
- HTH, DaHorsesMouth (talk) 22:12, 17 January 2010 (UTC)
- Remember, however, that your house is not airtight. There are plenty of cracks and leaks in your home that allow air to travel between the inside and outside. ~AH1(TCU) 23:35, 17 January 2010 (UTC)
- This is true but if the air seems stuffy perhaps those cracks aren't quite enough. Vranak (talk) 01:20, 18 January 2010 (UTC)
liquid diamond
The Wikipedia main page today links to Diamond#Material properties and states that researchers succeeded in melting some diamond, and they believe that solid diamond would float on liquid diamond. But I thought the only attribute that distinguishes diamond from any other mass of carbon was the crystalline structure of the diamond, which, I would think, would be destroyed utterly when the diamond melts. One of the articles linked to by that section states that in previous attempts, the diamond had turned into graphite, which had then melted. But I would think the two resulting pools of liquid carbon molecules would be indistinguishable. What am I missing here? Comet Tuttle (talk) 18:47, 17 January 2010 (UTC)
- I am guessing, but I suspect that a pool of liquid carbon is called liquid diamond if it is near a phase transition such that it would naturally freeze into diamond. Under atmospheric pressure if you freeze liquid carbon you get graphite, but under high enough pressure the natural state becomes diamond and that seems like the likely difference. The two pools presumably aren't that different if observed on their own. As liquids, they still don't have structure, even though one would start out much more compressed due to the enormous pressure. Dragons flight (talk) 19:06, 17 January 2010 (UTC)
- The Discovery News article states "The trick for the scientists was to heat the diamond up while simultaneously stopping it from transforming into graphite." The original report is behind a subscription wall but one could ask the author (email redacted) how one can know a transformation to liquid graphite did not occur. Cuddlyable3 (talk) 19:25, 17 January 2010 (UTC)
- Just as we don't post OP's emails, we shouldn't post paper authors' email addresses. The corresponding author can be found in the linked Nature article, here. Nimur (talk) 20:09, 17 January 2010 (UTC)
- Isn't it a little silly to worry about protecting someone from spam when their email is already posted on major websites? Dragons flight (talk) 20:43, 17 January 2010 (UTC)
- I think it is perfectly ok to protect casual readers from thinking it was ok to post anyone's email address. 93.132.150.199 (talk) 14:09, 18 January 2010 (UTC)
- So if the two pools are identical, our "In the News" blurb about it seems to have some incorrect information: "The first experimental measurement of the melting point of diamond (pictured) indicates that the solid form floats on the liquid form at ultrahigh pressures." We knew that before, because it's easy to melt graphite and test it in the same way. (Or, we "could have known that" without having had to directly melt diamond.) Comet Tuttle (talk) 19:46, 17 January 2010 (UTC)
- I see they're struggling with it, too; the first version of the blurb (which prompted my question) was "The first experimental measurement of the melting point of diamond (pictured) indicates that the solid floats on the liquid." Comet Tuttle (talk) 19:48, 17 January 2010 (UTC)
- No, it's not the same. Liquid graphite has a density of ~1.2 g/cm3 at low pressures, much less than the 3.5 g/cm3 for diamond. The observation is that at 10 Mbar liquid graphite has a density greater than diamond at the same pressure, which is a highly non-trivial finding and implies much greater compression in the liquid phase than in the solid. Dragons flight (talk) 20:43, 17 January 2010 (UTC)
- Carbon is a funny element. The sp hybridization phenomenon makes it possible for the liquid carbon atoms to be in a variety of hybrid states, each state producing different interactions with neighbor atoms and therefore a liquid with a different density for a given temperature T and pressure P. In parts of the (T,P) range one liquid state is stable, in other parts another. Boundaries between the stability domains are called coexistence curves, as for those (T,P) the stable state is a mixture of two different liquid states with different densities. It is entirely possible that one of the liquid phases will have diamond-like short-range order and bonding, and another one will have graphite-like short-range order and bonding. There have been papers published for quite a while about liquid-liquid phase transition(s) in carbon; see, e.g., Phys. Rev. Lett. 82, 4659–4662 (1999). It was tempting to put a reference to Tori Amos here, too, but I am afraid that her song is much harder to understand than the science involved... --Dr Dima (talk) 00:01, 18 January 2010 (UTC)
- So to rephrase my original question, how do the 2 pools of melted pure carbon differ? I had thought each pool would be a pool of loose carbon molecules with no bonds between them. Comet Tuttle (talk) 18:54, 18 January 2010 (UTC)
- There would be bonds in a liquid; what happens is that the energy of the atoms is great enough to stop them from being rigid, and they either break and reform, or move around. If there were no bonds at all, you would get gaseous monatomic carbon. However, even in a gas you could expect C2 molecules. Graeme Bartlett (talk) 20:57, 18 January 2010 (UTC)
Coconut water
About how much coconut water is there in a coconut? Mac Davis (talk) 21:59, 17 January 2010 (UTC)
- 200 ml (personal experience). Dauto (talk) 23:15, 17 January 2010 (UTC)
- I would say it's more like 6 3/4 ozs. DRosenbach (Talk | Contribs) 02:20, 18 January 2010 (UTC)
- I would say it depends very greatly on the age of the coconut (among other factors) Nil Einne (talk) 09:45, 18 January 2010 (UTC)
why don't diamonds oxidise?
I get the idea that oxidation of solid diamond may be kinetically slow cuz of the crystal lattice stability. But ... shouldn't that make liquid diamond prone to oxidation, especially at such high temperatures and pressures? I mean, carbon dioxide can't form a protective layer, unlike zinc oxide or magnesium oxide, cuz the oxide is a gas. What happens when you wash diamonds with hydrogen peroxide? Isn't a C=O bond stronger than 2 C-C bonds? 2 C=O bonds should be even stronger than 4 C-C bonds... John Riemann Soong (talk) 22:09, 17 January 2010 (UTC)
- I assume you are talking about the recent finding -- Nature Physics 6 pp 9-10 and 40-43 (2009/2010) -- of diamond-like liquid carbon state. The authors use Omega laser to send a really, really strong shockwave into a diamond sample, and then optically determine the temperature in the shocked material. From a peculiar behavior of the temperature the authors infer that a first-order phase transition takes place. The liquid phase turned out to be denser than the solid diamond phase (the melting curve has a "wrong" slope). This has nothing to do with oxidation. The whole experiment is extremely brief, involves no free surfaces exposed to air, and occurs at temperatures of 5000 - 50000 K and pressures of 500 - 4000 GPa. --Dr Dima (talk) 00:18, 18 January 2010 (UTC)
- That's true, but he does raise a fair point. CO2 is a much lower energy state than diamond and so diamond ought to spontaneously oxidize in air at some rate. By observation it is presumably a very low rate, but it would be interesting to know what the rate actually is. Dragons flight (talk) 00:57, 18 January 2010 (UTC)
- Well if you heat up a diamond enough in air, it burns. Graeme Bartlett (talk) 01:02, 18 January 2010 (UTC)
- Why can't I find a video, a picture, or even just a damn paper about the combustion of diamond? I mean, people seem to regularly put thousands of dollars worth of caesium into water for fun, so burning diamond can't be that far off economically for a "wow!" moment? John Riemann Soong (talk) 14:13, 18 January 2010 (UTC)
- Youtube. Look for [burn diamond]. First two hits. Come on, man. DMacks (talk) 16:56, 18 January 2010 (UTC)
- That's funny, such a prominent first youtube result would normally show up on google ... John Riemann Soong (talk) 17:16, 18 January 2010 (UTC)
On a related note, does fluorine gas, F2, react with diamond?
Ben (talk) 01:56, 18 January 2010 (UTC)
- (WARNING: Do not try this at home.) Diamonds have a reputation of not burning. But Michael Faraday and Humphry Davy burned a diamond back in 1814. They put it in a sealed globe full of air and heated it via the sun and a big magnifying glass to get it hot enough for the combustion to start. Once combustion started, it was self-supporting. The burning diamond gave off a brilliant scarlet light. The "carbonic acid gas" found in the globe containing the diamond after combustion indicated that diamond was carbon. They repeated the experiment with various other gases in the globe. Edison (talk) 20:04, 18 January 2010 (UTC)
- Yeah curious what the product of diamond and fluorine is ... do you get a big chunk of solid teflon, or just lots of CF4 gas? Just how strong of an oxidiser do you have to use to get diamond to react at room temperature? Would aqua regia dissolve diamond? HF + Antimony pentafluoride? John Riemann Soong (talk) 20:41, 19 January 2010 (UTC)