Wikipedia:Reference desk/Archives/Mathematics/2011 May 2
Welcome to the Wikipedia Mathematics Reference Desk Archives. The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.
May 2
Product of series of a special form
When you have $\left(\sum_{k=0}^{n}\frac{x^k}{k!}\right)\left(\sum_{k=0}^{n}\frac{(-x)^k}{k!}\right)$ for even n and you expand it out, the result always appears to be a polynomial of the form $1 + c_{n+2}x^{n+2} + c_{n+4}x^{n+4} + \cdots + c_{2n}x^{2n}$, i.e. all the polynomial terms of degree up to n (except the 1) cancel out, as do the odd terms. What is the explanation for this?
Incidentally, if you take n = 2 then this becomes $1 + \frac{x^4}{4}$, bolstering my suspicion that all the polynomial terms of degree up to n cancel out (except the 1). Widener (talk) 13:34, 2 May 2011 (UTC)
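A worked check of the smallest even case (assuming the product above is the pair of degree-n partial sums of $e^x$ and $e^{-x}$): for n = 2,

$$\left(1 + x + \tfrac{x^2}{2}\right)\left(1 - x + \tfrac{x^2}{2}\right) = 1 + (-x + x) + \left(\tfrac{x^2}{2} - x^2 + \tfrac{x^2}{2}\right) + \left(\tfrac{x^3}{2} - \tfrac{x^3}{2}\right) + \tfrac{x^4}{4} = 1 + \tfrac{x^4}{4},$$

so every term of degree 1, 2 and 3 cancels and only the constant 1 and the top-degree term remain.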
- It's true for all n, not just even n. It's because this product is an even function. Sławomir Biały (talk) 13:55, 2 May 2011 (UTC)
- As for why the exponents less than n+1 cancel: The first sum is $e^x + O(x^{n+1})$ and the second is $e^{-x} + O(x^{n+1})$ as $x \to 0$. So their product is $1 + O(x^{n+1})$.
- See also big O notation. In intuitive terms, the sums are very close to e^x and e^(-x) respectively, so their product needs to be very close to 1. -- Meni Rosenfeld (talk) 14:09, 2 May 2011 (UTC)
- Let me see if I understand. What you've written is basically: $\left(e^x + O(x^{n+1})\right)\left(e^{-x} + O(x^{n+1})\right) = e^x e^{-x} + \left(e^x + e^{-x}\right)O(x^{n+1}) + O(x^{n+1})^2 = 1 + \left(e^x + e^{-x}\right)O(x^{n+1}) + O(x^{2n+2}).$
- And now it is obvious that all the polynomial terms in that expression have degree greater than or equal to n+1 (except the 1). Widener (talk) 21:50, 2 May 2011 (UTC)
- That's correct. -- Meni Rosenfeld (talk) 08:16, 3 May 2011 (UTC)
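To make the combined argument concrete: swapping x for −x just exchanges the two factors, so the product is even, and the O-estimate above shows every term of degree 1 through n vanishes; together, only the constant 1 and even-degree terms of degree at least n+2 can survive. Here is a minimal SymPy sketch of that check, assuming the product of degree-n partial sums of $e^x$ and $e^{-x}$ as reconstructed above (the helper name partial_sum is just for illustration):

```python
# Sketch: check that the product of the degree-n partial sums of e^x and e^(-x)
# expands to 1 plus terms of degree >= n+2 (for even n), as argued above.
import sympy as sp

x = sp.symbols('x')

def partial_sum(n, sign):
    # Degree-n Taylor partial sum of e^(sign*x).
    return sum((sign * x)**k / sp.factorial(k) for k in range(n + 1))

for n in (2, 4, 6):
    product = sp.expand(partial_sum(n, 1) * partial_sum(n, -1))
    surviving = sorted(sp.degree(term, x) for term in product.as_ordered_terms())
    print(f"n = {n}: surviving degrees {surviving}")
    # Everything below degree n+2 cancels except the constant 1.
```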
Statistics and Mutual Information
I have a couple of very short questions regarding mutual information, about which I know nothing. In fact they are two questions that a friend emailed me, hoping that I'd be able to answer. So, I'm not even too sure I understand the questions myself in their entirety.
- Has anybody already solved the problem of MI significance estimation?
- If not, is there a good way to do it; particularly to generate a null distribution?
I'd appreciate as much help as you can give. References to books and articles would be great too. Thanks in advance. — Fly by Night (talk) 16:25, 2 May 2011 (UTC)
- This problem is a son of a bitch, because it is very difficult to come up with an unbiased estimator for MI for use with limited samples. The most common approach, I believe, is bootstrapping, i.e., estimating the distribution by repeatedly simulating the null hypothesis. I have seen papers claiming that a more systematic approach is possible (in particular this paper and several that followed it), but I have never understood the calculations well enough to be confident in them. It's a bit of a shame, because there are many problems in neuroscience that are naturally approached using MI, if only one were able to do decent statistics with the results. Looie496 (talk) 23:24, 2 May 2011 (UTC)
- Perhaps better, here is a recent review of the approaches that various people have tried, at least in the context of neuroscience (but really the factors that come into play should be the same for data from any field). Looie496 (talk) 23:34, 2 May 2011 (UTC)
- Thanks a lot for that Looie. I've forwarded those papers to my friend. If you have any more suggestions then please drop me a line on my user talk page. Thanks again… — Fly by Night (talk) 00:03, 3 May 2011 (UTC)
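For readers who land here later, a minimal sketch of the shuffling idea behind the bootstrapping approach Looie496 describes above: build a null distribution for MI by repeatedly permuting one variable. The binned plug-in MI estimator and all names below are illustrative assumptions, not taken from the linked papers.

```python
# Sketch: permutation-based null distribution for mutual information (MI).
# The binning, estimator, and toy data are illustrative assumptions only.
import numpy as np

def mutual_information(x, y, bins=10):
    """Naive plug-in MI estimate (in nats) from a 2D histogram."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nonzero = pxy > 0
    return np.sum(pxy[nonzero] * np.log(pxy[nonzero] / (px @ py)[nonzero]))

def permutation_pvalue(x, y, n_perm=1000, seed=None):
    """Compare observed MI to a null built by shuffling y, which breaks any dependence."""
    rng = np.random.default_rng(seed)
    observed = mutual_information(x, y)
    null = np.array([mutual_information(x, rng.permutation(y)) for _ in range(n_perm)])
    p = (1 + np.sum(null >= observed)) / (1 + n_perm)
    return observed, null, p

# Toy usage: weakly dependent data.
rng = np.random.default_rng(0)
x = rng.normal(size=500)
y = 0.3 * x + rng.normal(size=500)
mi, null, p = permutation_pvalue(x, y, n_perm=500, seed=1)
print(f"MI = {mi:.3f} nats, null mean = {null.mean():.3f}, p = {p:.3f}")
```

Note that the naive plug-in estimator is biased upward on finite samples, which is exactly why comparing against a shuffled null (rather than against zero) is the usual workaround.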
Organization Membership Pattern
If between 1950 and 1970 an organization's membership grew by 297%, between 1970 and 1990 membership grew by 170%, and between 1990 and 2009 (n.b. NOT 2010) membership grew by 82%, in what year will membership growth reach 0%? Thanks Wikipedians! Schyler (one language) 22:50, 2 May 2011 (UTC)
- You cannot do this kind of prediction without making some rather improbable assumptions, so the result is not to be trusted. That being said, you could assume that the number of members is some simple function of time, $N_t$. The data are that $N_{1970} = 3.97\,N_{1950}$, $N_{1990} = 2.70\,N_{1970} = 2.70 \cdot 3.97\,N_{1950} = 10.7\,N_{1950}$, and $N_{2009} = 1.82 \cdot 2.70 \cdot 3.97\,N_{1950} = 19.5\,N_{1950}$. Counting years from 1950 in order to avoid too-big numbers, we get $N_{20} = 3.97\,N_0$, $N_{40} = 10.7\,N_0$, and $N_{59} = 19.5\,N_0$. To ensure that no negative membership counts occur, choose the model $N_t = N_0 e^{at^2 + bt}$. It remains to select the constants a and b such that the model fits the data. Taking the logarithm of the data equations gives $400a + 20b - 1.38 = 1600a + 40b - 2.37 = 3481a + 59b - 2.97 = 0$. Here are three equations in only two unknowns, so there is probably no exact solution, but we can use the third equation as a check that the model is reasonable. Subtracting twice the first equation from the second gives $800a + 0.39 = 0$, giving $a = -0.000488$. Subtracting four times the first equation from the second, and dividing by 40, gives $b = 0.0788$. The result fits nicely into the third equation, $3481a + 59b = 2.95$, so the model seems to be satisfactory. The membership growth vanishes when $2at + b = 0$, that is for $t = -b/(2a) = 80.7$, and the year is $1950 + 80 \approx 2030$. Bo Jacoby (talk) 17:01, 3 May 2011 (UTC).
Wow!! Thanks for your effort and time. This is for you! Schyler (one language) 20:37, 3 May 2011 (UTC)
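As a closing note, here is a short numerical sketch of Bo Jacoby's calculation above; it solves the first two log-equations exactly and uses the third as a consistency check, so small differences from the hand calculation (which used two-decimal logarithms) are expected.

```python
# Sketch: reproduce the log-quadratic model N_t = N_0 * exp(a*t^2 + b*t) fitted above.
# Growth factors come from the question: +297%, +170%, +82% over the three periods.
import numpy as np

# Cumulative growth factors N_t / N_1950 at t years after 1950.
factors = {20: 3.97, 40: 3.97 * 2.70, 59: 3.97 * 2.70 * 1.82}

# Solve a*t^2 + b*t = ln(N_t / N_0) exactly from the first two data points (t = 20, 40).
A = np.array([[20.0**2, 20.0],
              [40.0**2, 40.0]])
rhs = np.array([np.log(factors[20]), np.log(factors[40])])
a, b = np.linalg.solve(A, rhs)
print(f"a = {a:.6f}, b = {b:.4f}")

# Use the third data point (t = 59) as a consistency check, as in the thread.
print(f"model at t = 59: {a * 59**2 + b * 59:.2f}  vs  data: {np.log(factors[59]):.2f}")

# The instantaneous relative growth rate d(ln N)/dt = 2*a*t + b vanishes at the vertex.
# Bo Jacoby's two-decimal logs give t near 80.7; full-precision logs land near 81.5.
t_peak = -b / (2 * a)
print(f"growth reaches 0% at t = {t_peak:.1f}, i.e. around the year {1950 + t_peak:.0f}")
```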