Wikipedia:Reference desk/Archives/Science/2013 June 3
Welcome to the Wikipedia Science Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.
June 3
Is stereophonic sound really needed for future virtual reality technology?
For example, when House of Wax was released in 1953, it was in stereophonic sound. So does this mean that stereophonic sound is intended to be indistinguishable from real life sounds? Like if a phone rings in a movie, and it's indistinguishable from a real phone ringing, can't mono sound be indistinguishable from real life sounds as well? Can't virtual reality use mono/monaural sound? Mattdillon87 (talk) 01:01, 3 June 2013 (UTC)
- How many ears do you have? ←Baseball Bugs What's up, Doc? carrots→ 02:12, 3 June 2013 (UTC)
- Well yes, I understand that it's because we have two ears, but I was watching tv once and a phone on tv rang. It sounded exactly like a real phone. I thought it was my phone. Does this mean mono sound is not capable of sounding real (i.e. phone on tv/phone in real life)? Mattdillon87 (talk) 02:19, 3 June 2013 (UTC)
- It's not realism per se; it's that monophonic sound is not directional. The advantage of stereophonic sound (or its descendants Quadraphonic sound and 5.1 surround sound) is that they add direction and dimension to sound. The difference between them is not unlike the difference between a normal flat movie and a 3D movie. When you hear a phone ring through a monophonic TV system, it can sound like a real phone ringing from the TV. When you hear a phone ring through a properly tuned surround sound system, it can sound like a phone ringing 4 feet to your left. To really get the true difference between stereo and mono sound, the best way is to listen to late 1960s-1970s era rock music on headphones. Get a copy of the album Led Zeppelin II, especially the tracks "Whole Lotta Love" or "What Is and What Should Never Be", and listen to them on headphones first in mono AND then in stereo, and you'll hear what stereo sound does for adding directionality and motion to sound. The swirling guitars in "Whole Lotta Love"'s instrumental break, and the way the guitars in "What Is And What Should Never Be" seem to jump from place to place around the room, really show what stereophonic sound can do for you. That's an album that was made to be appreciated on good headphones. --Jayron32 02:46, 3 June 2013 (UTC)
- A simple example: The beginning of the Beatles song "Back in the USSR" is the sound of a jet plane landing. It moves from the left headphone to both and then to the right headphone. Hard to achieve that little effect with monaural. ←Baseball Bugs What's up, Doc? carrots→ 02:52, 3 June 2013 (UTC)
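A left-to-right sweep like that jet-plane effect comes down to crossfading one sound between two channels. As a minimal sketch (not how the original recording was made - the tone, duration, and filename here are all illustrative assumptions), a stereo WAV with a linear pan can be generated with only the Python standard library:

```python
import math
import struct
import wave

RATE = 44100        # samples per second
DURATION = 3.0      # seconds for the full left-to-right sweep
FREQ = 220.0        # an arbitrary test tone standing in for the jet sound

n = int(RATE * DURATION)
frames = bytearray()
for i in range(n):
    t = i / RATE
    pan = i / n                              # 0.0 = hard left, 1.0 = hard right
    sample = math.sin(2 * math.pi * FREQ * t)
    left = int(32767 * sample * (1.0 - pan))   # fades out as the sweep proceeds
    right = int(32767 * sample * pan)          # fades in on the other side
    frames += struct.pack('<hh', left, right)  # interleaved 16-bit L/R pair

with wave.open('pan_demo.wav', 'wb') as w:
    w.setnchannels(2)    # stereo: this effect is impossible with one channel
    w.setsampwidth(2)    # 16-bit samples
    w.setframerate(RATE)
    w.writeframes(bytes(frames))
```

Played on headphones, the tone appears to move across the head; collapsing the two channels to one (mono) leaves a tone at constant loudness with no sense of motion at all.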
- Not only is it needed, it really isn't even enough. For one thing, we hear partly with our bodies (particularly low-frequencies); for another, we have some ability to tell whether sounds come from above or below as well as left or right. Even quadraphonic sound is not really the full story. Looie496 (talk) 03:12, 3 June 2013 (UTC)
- As most people only have two working ears, one can indeed simulate any directionality with only two loudspeakers. See Gardner, William G. "Transaural 3-D Audio", MIT Media Laboratory Perceptual Computing Section Technical Report No. 342 (July 20, 1995) for an example. --Atethnekos (Discussion, Contributions) 04:03, 3 June 2013 (UTC)
- It may be possible in principle, but it certainly can't be done yet in general for all listeners and all soundfields. See the problems with HRTFs in the literature, specifically with sounds close to the symmetry plane of the head, level with the eyes and above.
- So far as the OP goes, stereo sound adds greatly to the perception of immersion for a relatively small cost in bandwidth and processing power, compared with visual information, therefore it makes sense to use it. Greglocock (talk) 23:46, 3 June 2013 (UTC)
- Did that study permit the listeners to turn their heads? Wnt (talk) 16:41, 5 June 2013 (UTC)
- A few comments:
- 1) I find it much more difficult to pinpoint the direction of high frequency sounds than low. (This is particularly annoying when a fire alarm battery is low and it gives off a high-pitched beep, making it quite difficult to figure out which alarm it is.) If the phone sound was high frequency, this might explain why its direction wasn't clear to you.
- 2) To really fool you, the sounds should still sound like they are coming from the desired direction if you move your head. Headphones don't really work for this, since, when you turn your head 90 degrees, it sounds like the sources of the sound also moved 90 degrees. Perhaps we could develop the technology soon to adjust for this. A stereo system with fixed speakers is a bit better, but still only fools you when you're in the "sweet spot" between the two. StuRat (talk) 06:05, 3 June 2013 (UTC)
- It's not really an issue of high versus low. You can't localize very low frequencies either -- that's why a woofer can be placed anywhere in a room. What matters is relation of the wavelength of the sound to the distance between your ears. The optimal frequency is around 1000 Hz, which gives about one half-cycle of phase difference between the two ears when the sound comes from one side. Looie496 (talk) 13:54, 3 June 2013 (UTC)
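Looie's half-cycle figure is easy to check with a back-of-the-envelope calculation. This sketch assumes a typical ear-to-ear distance of about 0.2 m (an approximation; heads vary) and the speed of sound in air at room temperature:

```python
SPEED_OF_SOUND = 343.0   # m/s in air at about 20 C
HEAD_WIDTH = 0.20        # m, rough ear-to-ear distance (an assumption)

# Maximum interaural time difference: sound arriving from directly to one
# side travels roughly an extra head-width to reach the far ear.
itd = HEAD_WIDTH / SPEED_OF_SOUND        # seconds, about 0.58 ms

freq = 1000.0                            # Hz, the frequency discussed above
phase_difference_cycles = itd * freq     # fraction of one cycle

print(f"Max ITD: {itd * 1000:.2f} ms")
print(f"Phase difference at {freq:.0f} Hz: {phase_difference_cycles:.2f} cycles")
```

At 1000 Hz the result is a little over half a cycle between the two ears, matching the claim; at much lower frequencies the fraction shrinks toward zero, and at much higher frequencies it exceeds a full cycle and the phase cue becomes ambiguous.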
- What matters more is the spectrum. You can easily discern the direction of a broad spectrum noise (like a twig snapping). A square wave also has a fairly broad spectrum. Pure sinewave tones aren't as easy to pinpoint even if they have a short wavelength. This is a known problem with sirens on emergency vehicles. I remember reading a news article some years back about proposals to build ambulance sirens that produce a richer spectral tone to aid human directional detection, but the idea fell on deaf ears, so to speak, due to cultural expectations on what a siren sounds like. ~Amatulić (talk) 00:09, 4 June 2013 (UTC)
Sun's curve
I'm in the UK, and the Sun appears to travel in a curve. At sunrise it is visible from the north of my home. At noon/midday it is visible only to the south. Then at sunset it again appears in the north. What causes this? Pass a Method talk 11:05, 3 June 2013 (UTC)
- It's the summer, and the UK is in a high northern latitude. You'll find that in the morning, the Sun is in the north-east, and in the evening, it's in the north-west. This is because (at this time of year) the North Pole is tilted towards the Sun, making the Sun appear higher in the sky overall for people in the Northern Hemisphere. Imagine heading north on a day close to midsummer: somewhere beyond the Arctic Circle, you'd come to a place where the sun didn't set at all, and could be seen due north at midnight. We're not that far north, but as we get closer to 21 June, you'll see the Sun for longer and longer periods, rising further and further in the north. AlexTiefling (talk) 11:26, 3 June 2013 (UTC)
- I caught myself wondering the same thing several weeks ago, but the answer is actually quite obvious once you think about it. You can basically imagine the scenario as somewhat akin to looking south toward a place where the sun is directly overhead and moves in a straight line from sunrise to sunset. As you get further and further from this point, the trajectory of the sun will appear to bow with respect to the horizon. The sun simply can't travel in a straight line from every vantage point. Evanh2008 (talk|contribs) 11:38, 3 June 2013 (UTC)
- I still don't get it. At sunset and sunrise the sun shines into my house from a northern angle. At midday it shines from a southern angle. What the fuck? Pass a Method talk 11:43, 3 June 2013 (UTC)
- Which part of my explanation was unclear? AlexTiefling (talk) 11:45, 3 June 2013 (UTC)
- (edit conflict) I'm not sure how else to describe it. You're assuming (or have assumed until now, at least) that the sun moves directly overhead in a straight line that is perfectly parallel to the east and west marks on a compass rose. That's a faulty premise. In truth, the sun moves along a line (curved to a degree determined by your latitude) that is almost always slightly off from the parallel. At the mid-point of the curve the light enters your house from the south, while both "ends" of the line fall to the north. Evanh2008 (talk|contribs) 11:47, 3 June 2013 (UTC)
- Are you talking about analemma? --TammyMoet (talk) 11:56, 3 June 2013 (UTC)
- Position of the Sun explains it, but is mainly math. Sun path is in simpler terms, but is focused on solar panels, and my quick skim of it makes it seem like it describes the path, but not why it follows the path. 209.131.76.183 (talk) 12:56, 3 June 2013 (UTC)
- Draw a picture of the Earth, as a circle. Draw the equator, as a line crossing the center of the circle at an angle of 23 degrees. Draw the Sun, as a dot far away from the circle horizontally, in the direction above the equator -- that's where it is at midsummer. Draw your location on the Earth at sunrise and midday. If you look at the picture this gives you, I think you'll see why the Sun is to the north at sunrise. Looie496 (talk) 13:45, 3 June 2013 (UTC)
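That geometry can also be checked numerically. A standard spherical-astronomy result gives the sunrise azimuth A (measured clockwise from due north) as cos A = sin(declination) / cos(latitude), ignoring refraction and the sun's angular size. This sketch assumes roughly London's latitude and the sun's declination at the June solstice:

```python
import math

LATITUDE = 51.5        # degrees north, roughly London (an assumption)
DECLINATION = 23.44    # degrees, solar declination at the June solstice

# Sunrise azimuth from due north, for a spherical Earth with no
# atmospheric refraction (simplifying assumptions).
cos_az = math.sin(math.radians(DECLINATION)) / math.cos(math.radians(LATITUDE))
azimuth = math.degrees(math.acos(cos_az))

print(f"Sunrise azimuth: {azimuth:.1f} degrees east of north")
# Anything under 90 degrees means the sun rises north of due east.
```

The result comes out around 50 degrees: well north of due east, which is exactly what the questioner is seeing from the north side of the house. By symmetry, sunset is the same angle west of north.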
- Google Earth has a function that shows the sun and the illuminated part of the globe relative to the time of the day/year. I've found that for these kinds of questions, visualizing can often be more helpful than any written explanation. One way to put it, however, is that in summer at sunset/sunrise, the sun shines directly on a spot that is more than 90 degrees east/west from the observer. In effect, one sees the sun as if it were shining from over the Earth, thus creating the illusion that the sun is north of us. - Lindert (talk) 14:08, 3 June 2013 (UTC)
- In the UK, the sun will always be exactly south at 12 noon Greenwich mean time or 1 pm British Summer Time. It will never be directly north. Alansplodge (talk) 16:42, 3 June 2013 (UTC)
- Not entirely true, because the length of solar days varies with the time of year, but indeed the sun will never appear exactly north for those living between the Tropic of Cancer and the Arctic Circle. However, in summer it will appear closer to north than to south at sunset/rise, which is what the OP meant with 'in the north'. - Lindert (talk) 16:52, 3 June 2013 (UTC)
- But we're talking a few seconds of difference. Also there will be a time difference if the OP lives well to the west of the Greenwich Meridian. Each major port in the UK used to calculate its own time based on the sun's zenith (noon is when the sun is at its highest point), but it caused all kinds of problems when people started travelling by train, hence the introduction of GMT. Alansplodge (talk) 17:02, 3 June 2013 (UTC)
- ". . . few seconds of difference." Actually up to about 16 minutes; see Equation of Time. {The poster formerly known as 87.81.230.195} 212.95.237.92 (talk) 17:16, 3 June 2013 (UTC)
- I stand corrected. That's the last time I set my watch with a sundial ;-) Alansplodge (talk) 20:38, 3 June 2013 (UTC)
- You were correct in saying that the sun will never rise or set in the north in the UK, though it gets fairly close to north in the Shetlands. As you also mentioned, until the days of railway travel, each town and village used to set its clock by the sun, and this was less confusing in some ways because earliest sunrise and latest sunset then occurred on the same date (the solstice). Dbfirs 22:01, 3 June 2013 (UTC)
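The "up to about 16 minutes" figure mentioned above can be reproduced with a commonly used fitted approximation for the equation of time (a curve fit, not an exact ephemeris, so treat the output as approximate):

```python
import math

def equation_of_time(day_of_year):
    """Approximate sundial-minus-clock offset in minutes.

    Uses the common three-term fitted approximation; it is accurate to
    well under a minute, which is plenty for this comparison.
    """
    b = math.radians(360.0 * (day_of_year - 81) / 365.0)
    return 9.87 * math.sin(2 * b) - 7.53 * math.cos(b) - 1.5 * math.sin(b)

offsets = [equation_of_time(day) for day in range(1, 366)]
print(f"Largest sundial error: {max(offsets):.1f} minutes")    # around early November
print(f"Most negative error: {min(offsets):.1f} minutes")      # around mid February
```

The maximum comes out a bit over 16 minutes fast (early November) and roughly 14 minutes slow (mid February), which is why solar noon and clock noon rarely coincide exactly even on the Greenwich meridian.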
- Just to clarify, Alan. Greenwich Mean Time (GMT) was established in 1675, when the Royal Observatory was built, as an aid to English mariners to determine longitude at sea. It provided a standard maritime reference time, at a time when each city in England kept a different local time. The use of GMT as the standard terrestrial time zone for Great Britain – and this is what you're talking about @ "the introduction of GMT" – stems from 1880 (officially, but it had been gaining traction in practice for some time prior to that).
- Remember, the very idea of having the world divided into standard time zones was new at that time. In 1884, the International Meridian Conference adopted a universal day of 24 hours beginning at Greenwich midnight, but specified that it "shall not interfere with the use of local or standard time where desirable". Most countries still operated under local time from town to town. And even where a country adopted one or more standard time zones, they were not necessarily related to GMT by an offset of a neat number of hours or half-hours. The system was not really generally entrenched until 1929, and Nepal didn't become GMT-compliant until 1986! -- Jack of Oz [Talk] 21:55, 3 June 2013 (UTC)
- You are at a point on the surface of the spherical(ish) Earth, and this point is moving in a curved line. The sun is, for the purpose of this exercise, 'still'. You are the one moving in a curve, but your local reference points are also moving in the same curve, which makes it appear that the sun is moving - and in a curved pathway. (And I would check those compass bearings if I were you.) Richard Avery (talk) 07:08, 4 June 2013 (UTC)
- I think I get it now. In my mind I pictured a mini equator over the UK - on a sphere, it would curve upwards when looking to the east. Simple language for simple people :) Pass a Method talk 08:29, 4 June 2013 (UTC)
Particle decelerator, is it possible?
There is such a thing as a particle accelerator that converts electricity into electron kinetic energy, but is there the other part - a thing that can convert electron kinetic energy into electricity? — Preceding unsigned comment added by 118.136.5.235 (talk) 11:19, 3 June 2013 (UTC)
- Yes, it is possible to create a device that works in reverse to a particle accelerator. Plasmic Physics (talk) 11:22, 3 June 2013 (UTC)
- Well, I have done a small amount of googling about particle decelerators, but nothing good has come up. 118.136.5.235 (talk) 11:30, 3 June 2013 (UTC)
- That would be because a device under that name does not exist. Plasmic Physics (talk) 12:07, 3 June 2013 (UTC)
- So what is the proper term for particle decelerator? http://uncyclopedia.wikia.com/wiki/UnNews:Particle_Decelerator_test_run_proves_successful 118.136.5.235 (talk) 12:22, 3 June 2013 (UTC)
- Slowing particles is called Particle beam cooling. There's not much call for generating power this way as there's no cheap source of energetic particles though I suppose one could call a nuclear power station a very simple version of it. Dmcq (talk) 13:14, 3 June 2013 (UTC)
- Energetic electrons are also termed "beta radiation". You can block (i.e. bring to zero speed, or "decelerate") beta radiation with just a few millimeters of metal - although you might need more for very high speed electrons. So a good "electron decelerator" consists of a half-inch-thick slab of lead.
- The energy deposited would heat up the metal and gradually give it a negative charge. So if you had a ready source of high-speed electrons, a slab of metal, and a means to convert the heat into power, generating electricity wouldn't be hard. The small amount of negative charge produced could be used to produce electricity just by hooking up a wire.
- But as Dmcq says - without a source of such particles, there isn't going to be much call for it. SteveBaker (talk) 14:01, 3 June 2013 (UTC)
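To get a feel for the numbers involved, the power carried by a beam of charged particles is simply beam current times the accelerating voltage, since one electronvolt per electron corresponds to one volt of potential. The beam parameters below are purely illustrative assumptions, not taken from any real machine:

```python
BEAM_CURRENT = 1e-3      # amperes (1 mA, an illustrative figure)
BEAM_ENERGY_EV = 1e6     # kinetic energy per electron in eV (1 MeV)

# Beam power in watts = current (A) x energy per unit charge (V).
# A 1 mA, 1 MeV beam therefore delivers a kilowatt to whatever stops it.
power_watts = BEAM_CURRENT * BEAM_ENERGY_EV

print(f"Power deposited in the stopping slab: {power_watts:.0f} W")
```

A kilowatt dumped into a lead slab mostly becomes heat (plus bremsstrahlung, as noted below in the thread), which is why recovering it as electricity is a thermodynamics problem rather than a direct-conversion one in this setup.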
- If you decelerate energetic electrons in that way - by stopping them with a slab of impenetrable metal or ceramic - you get bremsstrahlung radiation, in addition to any residual charge collected on your device. It's not a very efficient or safe way to convert kinetic energy to useful electrical energy. Nimur (talk) 15:13, 3 June 2013 (UTC)
- However, it IS an efficient way of converting kinetic energy to X-ray radiation -- which is why most X-ray machines use precisely this method. 24.23.196.85 (talk) 00:51, 4 June 2013 (UTC)
- ...And why on Earth would you want to generate electricity using charged particles, other than a proof of concept? Plasmic Physics (talk) 14:33, 3 June 2013 (UTC)
- Well, not all electricity is generated as a source of motive power. You might (for example) measure the electricity produced as a means to determine how fast the particle stream is moving - or what direction it's coming from. Yes, of course there are better ways to do that - but there are certainly multiple reasons why one might want to create electricity from a particle beam that are not related to power production. SteveBaker (talk) 18:50, 3 June 2013 (UTC)
- I meant generating electricity for the sake of generating electricity. If that is not what the OP meant, then of course, you're right. Plasmic Physics (talk) 02:51, 4 June 2013 (UTC)
- Well, if you ask me why I want electricity from particles, it's because I want to recover energy from the electron output of a free-electron laser. They say it can make the laser more efficient, instead of just slamming the beam into a beam dump. 118.136.5.235 (talk) 08:20, 4 June 2013 (UTC)
- Why not just collect the electrons directly by completing the circuit? See Crookes tube. Plasmic Physics (talk) 14:03, 4 June 2013 (UTC)
Let me refer to Direct conversion and Aneutronic fusion. 81.11.190.37 (talk) 05:24, 6 June 2013 (UTC)
- Well, if we use it to decelerate electrons, surely we just need to change the anode voltage to negative, right? 118.136.5.235 (talk) 07:14, 7 June 2013 (UTC)
NICU
I am researching the number of neonatal intensive care units (NICU) around the world and would appreciate any leads. Melisse May (talk) 14:54, 3 June 2013 (UTC)Melisse_May
- The World Health Organization freely publishes a large repository of data and statistics. You might find some of their data and estimates useful. Nimur (talk) 16:04, 3 June 2013 (UTC)
Personality trait
Is there a name for people (children especially) who need to know exactly what is going to happen next and get anxious when they can't find out? It came up in conversation - I'm not seeking medical advice. Alansplodge (talk) 16:53, 3 June 2013 (UTC)
- I think there might not be a better name than "insecurity". Looie496 (talk) 17:06, 3 June 2013 (UTC)
- The DSM is the authoritative reference for standard terminology regarding personality disorders. The new edition, DSM-5, was just published last week, providing updated diagnostic terminology since the previous revision (DSM-IV-TR) in 2000. Perhaps a psychology expert can comment whether there has been any change in this area. Nimur (talk) 17:29, 3 June 2013 (UTC)
- How about "practical" - less likely to be taken in by salesmen and other con artists, because they would grow impatient with the "spiel". ←Baseball Bugs What's up, Doc? carrots→ 20:03, 3 June 2013 (UTC)
- Nonsense. Wikipedia defines "practical" (in this sense) as "Of a person, having skills or knowledge that are practical" - which by the other definition of "practical" means "Being likely to be effective and applicable to a real situation; able to be put to use."...or..."Based on practice or action rather than theory or hypothesis" - neither of which has remotely anything to do with people who are overly concerned with what's going to happen next. Please, at least try to reference some actual information before posting this kind of nonsense! Just guessing isn't acceptable here. SteveBaker (talk) 20:52, 3 June 2013 (UTC)
- I resent being called "nonsense". But feel free to try to answer the OP's question, rather than attacking other users. ←Baseball Bugs What's up, Doc? carrots→ 22:51, 3 June 2013 (UTC)
- Inability to cope with changes to routine tends to be a feature of autistic spectrum disorders. --TammyMoet (talk) 20:15, 3 June 2013 (UTC)
- Is that what the OP is describing? It doesn't sound like the same thing. ←Baseball Bugs What's up, Doc? carrots→ 20:24, 3 June 2013 (UTC)
- I know it can be an autistic spectrum trait - I wondered if it had a "label" when it occurred in isolation, which seems to me to be fairly common. Alansplodge (talk) 20:31, 3 June 2013 (UTC)
- This thread comes dangerously close to offering a diagnosis. There isn't any term for the precise symptom you're asking about - but describing someone with this symptom as "autistic" or anything else is to propose a diagnosis based on a symptom - and we're not allowed to do that here. SteveBaker (talk) 20:52, 3 June 2013 (UTC)
- If it's common, maybe the right term is "normal". ←Baseball Bugs What's up, Doc? carrots→ 20:38, 3 June 2013 (UTC)
- Another nonsense answer. It's "common" to catch a cold - but it's not "normal". Besides, what evidence are you presenting that this desire to continually know what's about to happen is in any way common? If you don't actually know the answer - just find another question. Thanks. SteveBaker (talk) 20:52, 3 June 2013 (UTC)
- I assume you're addressing Alansplodge, since it was he who labeled it "fairly common". And I dispute the notion that catching a cold is somehow not normal - it happens to a large percentage of the public. I do agree that we run the risk of meandering into the diagnosis area. ←Baseball Bugs What's up, Doc? carrots→ 22:49, 3 June 2013 (UTC)
- How about Risk aversion? or Ambiguity aversion--Digrpat (talk) 20:36, 3 June 2013 (UTC)
- In vox populi, you call that a control freak. Titoxd(?!? - cool stuff) 07:18, 4 June 2013 (UTC)
- Thank you all. The top answer (in my humble opinion) was "uncertainty aversion" linked by Digrpat above. Alansplodge (talk) 12:27, 4 June 2013 (UTC)
Why don't chocolate coated nuts have a flat spot?
I'm sitting here contemplating the perfection of a dark-chocolate coated almond, with a perfectly smooth compound curve at every point on the chocolate's surface, unmarred by any flat spot that would have formed by the almond resting on a surface as the chocolate coating cooled and hardened.
How was this feat accomplished?
Mind you, I wrote the enrober article. It was a subject that interested me at the time. But in my research for that article, all examples of chocolate enrobers involved the coated item resting on a flat surface while the chocolate hardens. This is true of all candy bars and other chocolate coated confections. They all have a flat side.
Except these almonds.
I also have a bag of chocolate coated cashews from the same store. Rather than having an ellipsoid almond shape, these nuts are more crescent-shaped with a saddle point of two opposing curvatures. At the saddle point I see a little crease in the chocolate, which could be a clue, almost as if the nut was resting on a blade as the chocolate cooled. More likely, I think, is that this crease formed naturally as a stress fracture in the chocolate surface as it shrank at the saddle point.
I hope someone can shed some light. ~Amatulić (talk) 23:59, 3 June 2013 (UTC)
- It is rather simple to roll them and polish off the edges (video). μηδείς (talk) 00:27, 4 June 2013 (UTC)