Wikipedia:Reference desk/Archives/Mathematics/2017 November 11
Welcome to the Wikipedia Mathematics Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.
November 11
Twin prime conjecture with possibly (?) trivial proof
I'm looking for a proof of the following: For all positive integers k, there exists some positive integer n such that n-1, n+1, k*n+1, and k*n-1 are all primes. I'm not sure if there's an easy proof or if this depends on some unproven conjecture. 68.0.147.114 (talk) 06:29, 11 November 2017 (UTC)
- This statement implies the twin prime conjecture, so if it's true, a proof would be at least as difficult as one for the TPC. --Deacon Vorbis (talk) 06:56, 11 November 2017 (UTC)
- And it's a special case of Dickson's conjecture. OEIS:A185145 gives the smallest n value for each k and says it has been verified for k up to 10^7. PrimeHunter (talk) 02:56, 15 November 2017 (UTC)
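The search described above can be sketched numerically. The function name below (`smallest_n`) is mine, not from the thread or from OEIS; this is a brute-force illustration for small k only, not a serious verification:

```python
def is_prime(m):
    """Deterministic trial division; adequate for the small values used here."""
    if m < 2:
        return False
    i = 2
    while i * i <= m:
        if m % i == 0:
            return False
        i += 1
    return True

def smallest_n(k, limit=10**6):
    """Smallest n with n-1, n+1, k*n-1, k*n+1 all prime, or None if none found below limit."""
    for n in range(2, limit):
        if (is_prime(n - 1) and is_prime(n + 1)
                and is_prime(k * n - 1) and is_prime(k * n + 1)):
            return n
    return None
```

For example, k = 1 gives n = 4 (twin primes 3 and 5 counted twice), k = 2 gives n = 6 (5, 7, 11, 13), and k = 3 gives n = 4 (3, 5, 11, 13).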
Fourier transform of unit step function
The Fourier transform of the unit step function by the direct formula gives 1/jw, but this is not correct. Why can't I directly use the basic definition of the Fourier transform? 223.184.47.150 (talk) 06:34, 11 November 2017 (UTC)
- I am not sure what you mean by "direct formula". If you mean lim_{L→∞} ∫_0^L e^(−jwt) dt = lim_{L→∞} (1 − e^(−jwL))/(jw), this limit does not exist. Ruslik_Zero 08:27, 12 November 2017 (UTC)
- I want to know why the limit does not exist. On putting L = infinity in the final result, it gives 1/jw. Please correct me. 14.139.241.85 (talk) 10:15, 15 November 2017 (UTC)
- The limit isn't defined by plugging in infinity and seeing what comes out. In fact, the definition of a limit at a point never references the value of the function at that point. Furthermore, infinity is not a real number that you can plug in anyway. You have to actually look at the behaviour of the function as L increases without bound. Now, does it "settle down" on anything? Evidently not, so the limit does not exist. Double sharp (talk) 23:49, 15 November 2017 (UTC)
- As far as I can tell though, if j and w are constants then we can make the expression arbitrarily close to 1/(jw) by choosing L sufficiently large (in the sense of the formal definition of a limit). Or are j and w not constants?--Jasper Deng (talk) 09:53, 16 November 2017 (UTC)
- From context, I assume that j is the imaginary unit. It is most commonly written as i, but j is a less common alternative notation. Dragons flight (talk) 10:39, 16 November 2017 (UTC)
- In which case I side with the notion that the limit is nonexistent. I had thought j was real.--Jasper Deng (talk) 11:06, 16 November 2017 (UTC)
- To the OP, a simple application of Euler's formula shows that what you really have is sine and cosine, which keep oscillating without approaching any limit as their arguments tend to infinity.--Jasper Deng (talk) 11:26, 16 November 2017 (UTC)
- What I learned from this discussion is that the limit doesn't exist because j is imaginary; if j were real, then the limit would exist. 14.139.241.85 (talk) 07:13, 17 November 2017 (UTC)
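The non-convergence discussed above can be checked numerically: the truncated integral ∫_0^L e^(−jwt) dt equals (1 − e^(−jwL))/(jw) in closed form, and sampling it at growing L shows it never settles. A small sketch assuming w = 1 (the function name and sample points are mine):

```python
import cmath
import math

def truncated_transform(L, w=1.0):
    """Closed form of the integral of exp(-1j*w*t) for t from 0 to L."""
    return (1 - cmath.exp(-1j * w * L)) / (1j * w)

# At L = 2*pi*n the value is 0; at L = (2n+1)*pi its magnitude is 2.
# The oscillation between these never damps out, so no limit exists.
zeros = [abs(truncated_transform(2 * n * math.pi)) for n in range(10, 13)]
peaks = [abs(truncated_transform((2 * n + 1) * math.pi)) for n in range(10, 13)]
```

The magnitude swings between 0 and 2 forever, illustrating Double sharp's point: the value 1/jw only appears if one wrongly pretends e^(−jwL) vanishes at "L = infinity".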
"Basis" solutions for nonlinear partial differential equations
[edit]Oftentimes, we can't hope to write down closed-form solutions for nonlinear partial differential equations like the Navier-Stokes equations, but is it possible that there exists a set of solutions from which all other solutions can be obtained via application of finitely many elementary functions? In particular, if that set is finite, then in principle, one could numerically precompute values for those solutions and then just transform from them to get any other solution.--Jasper Deng (talk) 07:11, 11 November 2017 (UTC)
A sequence of probabilities
In a game I'm playing, your chance of success at each stage is (n − stage number) / n. n is 9, so it's 9/9, 8/9, 7/9, ..., 1/9. I'm wondering if this kind of sequence has a name, whether it's discussed anywhere in WP, and what the expected number of attempts is to complete the game. Scourge of Empires (talk) 14:57, 11 November 2017 (UTC)
- Well, that is an arithmetic progression, but there could be a more specific term.
- The chance of successfully completing all n stages is the product of the probability of success at each stage (assuming independence), hence is n!/n^n. Stirling's approximation suggests that (for large n) this is close to sqrt(2πn)·e^(−n), which decreases quite fast as n increases. The expected (mean) number of attempts to finish the game is simply the inverse of that probability. TigraanClick here to contact me 15:10, 11 November 2017 (UTC)
- (edit conflict) It's a special case of a Markov chain. Stage k has an (n − k)/n chance of proceeding to stage k + 1, and a k/n probability of going back to stage 0 (how I'm interpreting what you wrote). As for the expected number of attempts, that depends on whether you count an attempt each time you start from the beginning (so you could potentially finish in 1 attempt), or count each stage as an attempt (so it would take at least 9). For the former, it's easy: playing the game is a Bernoulli random variable where the probability, p, of success is just the product of the success probabilities at each stage (assuming stages are independent). This is just 9!/9^9, or about 0.000937. The expected number of attempts before a success for a Bernoulli random variable with success probability p is just 1/p, so in this case, it would be about 1068 attempts (rounding to the nearest whole number). For the latter case, it's a bit more complicated, but one way (possibly not the easiest way) is to solve the system of equations
- E_k = 1 + ((n − k)/n)·E_{k+1} + (k/n)·E_0 for k = 0, 1, ..., n − 1, with E_n = 0. Here, E_k is the expected number of stages to attempt (starting at stage k) before success. --Deacon Vorbis (talk) 15:27, 11 November 2017 (UTC)
- This turns out to have a fairly nice solution in general, and in this case in particular (for n stages): E_0 = n/1! + n^2/2! + ... + n^n/n!. So for n = 9, this gives about 4759 expected stages to play. Given the niceness of the solution, I suspect there's some sort of slick probabilistic argument, but I don't see one. --Deacon Vorbis (talk) 19:10, 11 November 2017 (UTC)
- You have 9/1! expected attempts at the final stage, 9^2/2! expected attempts at the 8th stage, 9^3/3! at the 7th stage, etc., and each time you finish k − 1 stages and bounce out at the kth, that means you made k attempts, so there's the 1:1 correspondence between number of attempts and number of moves. 93.136.61.217 (talk) 20:12, 12 November 2017 (UTC)
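The linear system above can be solved directly and compared against the closed-form sum n/1! + n^2/2! + ... + n^n/n!. A sketch of the computation (the function names are mine, not from the thread; the recursion writes each E_k as a_k + b_k·E_0 and peels down from k = n):

```python
from math import factorial

def expected_stages(n):
    """Solve E_k = 1 + ((n-k)/n)*E_{k+1} + (k/n)*E_0 with E_n = 0 for E_0,
    by writing E_k = a_k + b_k*E_0 and recursing downward from k = n."""
    a, b = 0.0, 0.0              # a_n = b_n = 0 since E_n = 0
    for k in range(n - 1, -1, -1):
        p = (n - k) / n          # probability of passing stage k
        a = 1 + p * a            # a_k = 1 + p_k * a_{k+1}
        b = p * b + k / n        # b_k = p_k * b_{k+1} + k/n
    return a / (1 - b)           # E_0 = a_0 + b_0 * E_0

def closed_form(n):
    """The sum n^m/m! for m = 1..n."""
    return sum(n**m / factorial(m) for m in range(1, n + 1))
```

For n = 9 both agree at about 4758.8 expected stages (rounding to 4759), and the whole-game attempt count 9^9/9! rounds to 1068, matching the figures in the thread.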