Talk:Taylor expansions for the moments of functions of random variables
This article is rated Stub-class on Wikipedia's content assessment scale.
It is not obvious to me why truncating the Taylor expansion would give a good approximation. In fact, the error seems to be of the order of the third moment, which could be huge. Thus, the approximation would only hold when the first or second order dominates, or am I missing something? Filur 10:19, 14 February 2006 (UTC)
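To make the point about the dropped third-moment term concrete, here is a small numerical sketch; the choices X ~ Exponential(1) and f(x) = x^3 are purely illustrative (not from the discussion), picked because the expansion terminates at third order so the error of the second-order truncation is exactly the third-moment term:

```python
# Second-order Taylor approximation of E[f(X)] about mu = E[X]:
#   E[f(X)] ≈ f(mu) + f''(mu) * sigma^2 / 2
# The first dropped term is f'''(mu) * mu3 / 6, where mu3 is the
# third central moment -- the quantity in question above.

# Illustrative choice: X ~ Exponential(1), f(x) = x^3.
mu, sigma2, mu3 = 1.0, 1.0, 2.0   # mean, variance, third central moment of Exp(1)
f   = lambda x: x ** 3
d2f = lambda x: 6 * x             # f''(x)
d3f = lambda x: 6.0               # f'''(x), constant

second_order = f(mu) + d2f(mu) * sigma2 / 2       # = 4.0
third_order  = second_order + d3f(mu) * mu3 / 6   # = 6.0
exact        = 6.0                                # E[X^3] = 3! for Exp(1)

print(second_order, third_order, exact)
```

Here the second-order truncation gives 4 against an exact value of 6, and the entire gap is the third-central-moment term, so the approximation is only good when that term is small relative to the lower-order ones.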
- I've been unable to find a reference on this, but if you are talking about the third central moment, then I believe you are indeed correct. I do know that the number of terms needed for a good approximation does have something to do with the behavior of the moments. Feel free to add this to the article. By the way, I'm still unhappy with the name; I welcome any suggestions. Btyner 03:05, 23 February 2006 (UTC)
You may well be missing something (but I'm going to look more closely....). What I'm thinking is that the random variable may be a sample average, in which case effects of higher moments might go away as the sample size grows. But for the moment I'm shooting off my mouth without having looked closely, so more later.... Michael Hardy 02:35, 26 August 2007 (UTC)
I think this kind of Taylor series depends on the function itself as well. If one had the function f(x) = x^2, I'd imagine things would go wrong for a Gaussian-distributed variable x as well, as the outcome of the function no longer has a symmetric distribution. I'm using this theory on my experiments, and as they are nice linear functions everything goes well. I would like to properly derive all the equations, though, so I can state which conditions both x and f need to fulfill for the approximation to be applicable. Does anyone have a good reference with a proper discussion of the Taylor series expansion used to calculate Var(f(x))? Thx in advance. —Preceding unsigned comment added by Ddeklerk (talk • contribs) 07:26, 3 September 2007
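The f(x) = x^2 case can be checked directly: for Gaussian x, the first-order (delta-method) variance approximation Var(f(X)) ≈ f'(μ)²σ² drops a 2σ⁴ term, which dominates when σ is large relative to μ. A sketch (the particular values of mu and sigma are illustrative choices, and the Monte Carlo sample is just a sanity check against the closed form):

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 1.0, 2.0                    # illustrative values
x = rng.normal(mu, sigma, size=1_000_000)

# First-order (delta-method) approximation: Var(f(X)) ≈ f'(mu)^2 * sigma^2.
# For f(x) = x^2, f'(mu) = 2*mu.
var_taylor = (2 * mu) ** 2 * sigma ** 2                 # = 16.0

# Exact value for X ~ N(mu, sigma^2): Var(X^2) = 4*mu^2*sigma^2 + 2*sigma^4.
# The first-order approximation misses the 2*sigma^4 term entirely.
var_exact = 4 * mu ** 2 * sigma ** 2 + 2 * sigma ** 4   # = 48.0

print(var_taylor, var_exact, np.var(x ** 2))
```

At mu = 0 the approximation degenerates completely, predicting zero variance while the true variance is 2σ⁴, which illustrates the point about the asymmetry of the distribution of x² mattering.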
The only thing I can find is this StackExchange post, which is quite useful, and brings up the points raised here: https://stats.stackexchange.com/questions/70490/taking-the-expectation-of-taylor-series-especially-the-remainder If I find time I'll try to integrate some of these points into the page.... Trust but verify (talk) 03:35, 4 January 2019 (UTC)
The central limit theorem is often used to support approximation using Gaussian/normal distributions. The CDF of the normal distribution is very cubic. The PDF is its derivative and is a quadratic. The pdf space can be the sample space; consider the empirical pdf (aka histogram). This means that a Taylor series truncated to second order is exact in the limit for many useful cases. Also keep in mind the definition of the working domain: probability density functions. The maximum probability (aka the integral from -inf to inf of f(x)) is, by axiom, one. Your f(x) = x^2 is not a pdf. EngrStudent 15 June 2010 (UTC)
This article looks very sketchy to me. For a start, the "approximations" are never formally discussed: what is the rigorous meaning of the "approximation" here? The article is basically a mix of vaguely-rigorous-looking "assumptions", and statements that are not detailed at all and could mean many different things, without a proof; the reference [2] is a "Technical Report from Automatic Control" (not peer-reviewed, not published?), so hardly a reference at all... Clément Canonne (talk) 00:29, 2 June 2019 (UTC)