Talk:Bias of an estimator/Archive 1


How to derive the bias of the sample skewness and kurtosis given a certain probability distribution?

You very nicely showed that the expected value of the uncorrected sample variance does not equal the population variance σ², unless multiplied by a normalization factor. My question: could you give a similar derivation of the bias for the third and fourth central moments?
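For the variance part, a quick Monte Carlo check in Python makes the normalization factor visible. This is a minimal sketch; the values of n, trials and sigma2 are arbitrary choices for illustration:

    import numpy as np

    # Sketch: the uncorrected sample variance underestimates sigma^2
    # by a factor of (n-1)/n, which the n/(n-1) correction undoes.
    rng = np.random.default_rng(0)
    n, trials, sigma2 = 5, 200_000, 4.0
    x = rng.normal(0.0, np.sqrt(sigma2), size=(trials, n))

    v_biased = ((x - x.mean(axis=1, keepdims=True)) ** 2).sum(axis=1) / n
    print(v_biased.mean())                 # ~ sigma2 * (n-1)/n = 3.2
    print(v_biased.mean() * n / (n - 1))   # ~ sigma2 = 4.0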

Work needed on 'Estimating a Poisson probability' example

I was looking over this article, and the section on estimating a Poisson probability appears quite messy and disjointed. I do not currently have the time to clean this up, but I wanted to bring it to others' attention. Wolf87 23:53, 11 February 2007 (UTC)

The only thing that I immediately notice that needs improvement in that section is that it's a bit silly to make an issue of mean squared error being large after a far more damaging case against unbiasedness (in this case) has been presented. So I wonder if you can be more specific? Michael Hardy 00:57, 12 February 2007 (UTC)

Bayesian critique ?

Should this article include a Bayesian critique of unbiasedness?

If I understand it correctly, the test for an unbiased estimator is that

$\operatorname{E}(\hat{\theta} \mid \theta) = \theta .$

But that is quite a different thing from being able to say that

$\operatorname{E}(\theta \mid \hat{\theta}) = \hat{\theta} .$

Of course for frequentists the latter proposition is difficult, because evaluating it requires the specification of a prior probability for θ.
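A toy numerical illustration of the distinction, under an assumed setup (prior θ ~ N(0, 1), data X ~ N(θ, 1), estimator θ̂ = X) that is mine and not from the thread: the estimator is unbiased, E(θ̂ | θ) = θ, yet E(θ | θ̂) = θ̂/2, not θ̂.

    import numpy as np

    # Sketch (assumed setup): prior theta ~ N(0,1), X ~ N(theta,1), theta_hat = X.
    rng = np.random.default_rng(1)
    theta = rng.normal(0.0, 1.0, size=500_000)
    x = rng.normal(theta, 1.0)       # theta_hat = x is unbiased given theta

    # E(theta_hat | theta): condition on a thin slice of theta values.
    sl = np.abs(theta - 1.0) < 0.05
    print(x[sl].mean())              # ~ 1.0, so unbiased at theta ~ 1

    # E(theta | theta_hat): condition on a thin slice of observed x values.
    sl = np.abs(x - 1.0) < 0.05
    print(theta[sl].mean())          # ~ 0.5 = x/2, not x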

But, for example, in some stuff I'm working through at the moment there's a case where an unbiased estimator is known and widely trumpeted as such; yet on every reasonable prior for the system, including a uniform prior and a Jeffreys prior, the so-called unbiased estimator is guaranteed to come out below the mean, every time.

And yet people treat unbiasedness as a fetish, or a wonderful comfort blanket. The word 'unbiased' is repeated again and again in the literature I'm reading for the problem. The word seems to block any understanding of the systematic error that an 'unbiased' estimator can lead to. Jheald 19:39, 26 February 2007 (UTC)

unbiased ?

An estimator or decision rule having nonzero bias is said to be biased. Should the second sentence in the first paragraph read 'unbiased' rather than 'biased'?

No. If the bias is nonzero, then it's biased. If the bias is zero then it's unbiased. Michael Hardy 21:59, 11 June 2007 (UTC)

two problems

1. The variance of a population given in the examples section is only correct for a uniformly distributed population.

2. The same example section suggests that the biased estimator of the variance is "more useful" than the unbiased estimator merely because we divide by a bigger number. This seems absurd. Am I missing something? 98.224.223.201 (talk) 00:03, 27 January 2008 (UTC)
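On point 2, the usual argument is about mean squared error rather than the size of the divisor as such. A minimal sketch (assuming normal data; n and the trial count are arbitrary) comparing the divisors n − 1, n and n + 1:

    import numpy as np

    # Sketch: MSE of sum-of-squared-deviations divided by n-1, n, n+1.
    rng = np.random.default_rng(2)
    n, trials, sigma2 = 5, 400_000, 1.0
    x = rng.normal(0.0, 1.0, size=(trials, n))
    ss = ((x - x.mean(axis=1, keepdims=True)) ** 2).sum(axis=1)

    for d in (n - 1, n, n + 1):
        est = ss / d
        print(d, ((est - sigma2) ** 2).mean())  # smallest MSE at d = n+1

For normal samples the MSE is in fact minimized by dividing by n + 1, which is presumably the point the example is making.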

Now I see you did this (see the comment below, 'removal of some text'). Could someone rewrite whatever part of it is still needed? —Preceding unsigned comment added by Ecov (talkcontribs) 17:10, 9 December 2008 (UTC)

removal of some text

This was removed (difference) without a reason being given. I think a link to Bessel's correction is appropriate. Ecov (talk) 11:36, 11 November 2008 (UTC)

Question/Remark on example section

It seems to me that two of the expressions there are mixed up. —Preceding unsigned comment added by 194.95.63.248 (talk) 08:50, 8 December 2008 (UTC)

Explanation needed

I'm looking at this:

$\operatorname{E}[S^2] = \frac{n-1}{n}\,\sigma^2 ,$

...because

$\operatorname{E}\!\left[\sum_{i=1}^n (X_i - \overline{X})^2\right] = (n-1)\,\sigma^2 .$

I don't see where the factor n − 1 comes from. Could someone explain? —Ben FrantzDale (talk) 15:55, 19 February 2009 (UTC)

There are probably 100 references for this claim (it does not need citing). PDBailey (talk) 02:15, 20 February 2009 (UTC)
It might not need citing, but as Ben said, it does need explaining, because right now it looks like either a non sequitur or silly. Unless you already know where it comes from, in which case of course you don't need further explanation.
68.7.44.17 (talk) 09:27, 26 February 2009 (UTC)
Exactly. I don't doubt it, but I'm trying to remember why it is true and to me, it isn't obvious right now. —Ben FrantzDale (talk) 21:56, 26 February 2009 (UTC)

The way I think about it is this: Project the vector

$(X_1, \ldots, X_n)$

orthogonally onto the space

$\{(x, x, \ldots, x) : x \in \mathbb{R}\},$

getting

$(\overline{X}, \overline{X}, \ldots, \overline{X}),$

where

$\overline{X} = (X_1 + \cdots + X_n)/n .$

Now represent all this with respect to a different orthonormal basis of the vector space: a unit vector in the direction of

$(1, 1, \ldots, 1),$

and n − 1 unit vectors orthogonal to it. Then look at all the vectors involved, in both coordinate systems: in the new basis, the residual vector $(X_1 - \overline{X}, \ldots, X_n - \overline{X})$ has zero in the first coordinate and n − 1 free coordinates, each with expected square σ², so $\operatorname{E}\left[\sum_{i=1}^n (X_i - \overline{X})^2\right] = (n-1)\sigma^2$.

Michael Hardy (talk) 15:22, 27 February 2009 (UTC)
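A numeric companion to this picture (a sketch; the orthonormal basis is built with QR rather than written out by hand): rotate (X1, ..., Xn) into a basis whose first vector points along (1, ..., 1); the residual lives in the remaining n − 1 coordinates.

    import numpy as np

    # Sketch: first basis vector along (1,...,1)/sqrt(n), the rest from QR.
    rng = np.random.default_rng(3)
    n = 4
    ones = np.ones((n, 1)) / np.sqrt(n)
    q, _ = np.linalg.qr(np.hstack([ones, rng.normal(size=(n, n - 1))]))

    x = rng.normal(0.0, 1.0, size=(100_000, n))
    coords = x @ q                              # coordinates in the new basis
    resid_sq = (coords[:, 1:] ** 2).sum(axis=1) # drop the (1,...,1) direction
    print(resid_sq.mean())                      # ~ (n-1) * sigma^2 = 3
    print(((x - x.mean(axis=1, keepdims=True)) ** 2).sum(axis=1).mean())  # same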

Michael, that is the best explanation of this I have ever seen! Thank you! I think the very first step could be rephrased. (I had trouble understanding what you meant by "projecting" X.) But this explanation, along with pictures, would be a big help for one of the pages that discusses this issue, either this page or sample standard deviation.
If I read this right, if we knew a priori that E(X) = μ, would the best unbiased estimate of the standard deviation be

$\sqrt{\tfrac{1}{n}\sum_{i=1}^n (X_i - \mu)^2}$

as opposed to the same sum divided by n − 1 under the root, since the coordinates in the new basis wouldn't include a zero?
Thanks again! —Ben FrantzDale (talk) 19:34, 27 February 2009 (UTC)

If we knew that E(X) = μ, then

$\frac{1}{n}\sum_{i=1}^n (X_i - \mu)^2$

would be an unbiased estimate of the variance. Not of the standard deviation. I've seldom seen anyone mention unbiased estimation of standard deviations in this context. Michael Hardy (talk) 22:13, 28 February 2009 (UTC)
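A one-line check of that statement (sketch; the values of μ and n are arbitrary):

    import numpy as np

    # Sketch: with mu known, dividing sum((x-mu)^2) by n is already unbiased.
    rng = np.random.default_rng(4)
    mu, n, trials = 2.0, 5, 300_000
    x = rng.normal(mu, 1.0, size=(trials, n))
    print((((x - mu) ** 2).sum(axis=1) / n).mean())  # ~ sigma^2 = 1.0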

...how about this rephrasing:
...orthogonally project the vector...
Michael Hardy (talk) 02:32, 1 March 2009 (UTC)
There is a proof on the OLS/Proofs page, in a slightly more general case of regression with p regressors. Sample variance is a case of regression onto a column of constants: X=ι. // Stpasha (talk) 23:45, 4 July 2009 (UTC)
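Numerically, the correspondence is a few lines (a sketch with toy data): regressing y on a single constant column ι gives fitted value ȳ, and the residual sum of squares divided by n − p with p = 1 is the usual unbiased sample variance.

    import numpy as np

    # Sketch: sample variance as OLS with a single constant regressor (p = 1).
    rng = np.random.default_rng(5)
    y = rng.normal(3.0, 2.0, size=50)
    X = np.ones((y.size, 1))              # the column iota of constants
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    print(resid @ resid / (y.size - 1))   # equals np.var(y, ddof=1)
    print(np.var(y, ddof=1))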

No bias function

There is no definition of the bias function here, something like Bias(x, x_0) = E[x...]. This page was linked to, yet there is no definition of the function.

Thouliha (talk) 18:03, 24 April 2009 (UTC)
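For reference, the definition being asked for is $\operatorname{Bias}(\hat{\theta}) = \operatorname{E}[\hat{\theta}] - \theta$. An empirical version in Python (a sketch; the function and variable names are my own):

    import numpy as np

    # Sketch: empirical Bias(theta_hat) = E[theta_hat] - theta.
    def empirical_bias(estimator, sampler, theta, trials=100_000):
        draws = np.array([estimator(sampler()) for _ in range(trials)])
        return draws.mean() - theta

    rng = np.random.default_rng(6)
    theta = 1.0   # true variance of N(0, 1)
    # Uncorrected sample variance of 10 draws: bias ~ -theta/10 = -0.1.
    print(empirical_bias(lambda s: s.var(ddof=0),
                         lambda: rng.normal(0, 1, 10), theta))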

Bias is welcome

It seems that this sentence was recently added at the beginning:

Although the term bias sounds pejorative, nonzero bias is tolerated and sometimes even welcome in statistics.

I can accept that nonzero bias can be tolerated, but I am hard pressed to see the point that bias is welcome. You can conceive of situations in which a biased estimator has a much smaller variance (as implied by the picture that was recently added). That does not make bias desirable per se. So I think this should be reworded.

Also, I think the graphic could be improved. It is my understanding that it should show two estimators, their expected values, and their variances. Right now it looks like some sort of Venn diagram. I believe a graph showing the expected values and standard errors in the form of distributions of two estimators (along with the true value) would be much more helpful. —Preceding unsigned comment added by 149.169.85.100 (talk) 05:34, 10 July 2009 (UTC)
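Something along the suggested lines takes only a few lines of matplotlib (a sketch; the two densities and their parameters are made up for illustration):

    import numpy as np
    import matplotlib.pyplot as plt

    # Sketch of the suggested figure: two estimator sampling distributions
    # (one unbiased/wide, one biased/narrow) plus the true parameter value.
    theta = 2.0
    t = np.linspace(0, 4, 400)
    norm = lambda x, m, s: np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))

    plt.plot(t, norm(t, theta, 0.8), label="unbiased, high variance")
    plt.plot(t, norm(t, theta - 0.3, 0.3), label="biased, low variance")
    plt.axvline(theta, linestyle="--", label="true value")
    plt.legend()
    plt.show()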

I could contribute such a graph (made in SAS), but don't have upload privileges in wiki yet. --Felize2000 (talk) 06:22, 10 July 2009 (UTC)

That's kind of you to offer. You're encouraged to upload user-created images to Wikimedia Commons, and Commons also has the advantage that you can upload images straight away after creating an account. Qwfp (talk) 10:05, 10 July 2009 (UTC)
I've just rewritten and expanded that sentence in the lead containing "sometimes even welcome" to try to make it a better summary of some of the later sections:
"Although the term bias sounds pejorative, there are situations in which a biased estimator is preferable to any unbiased estimator. In addition, the property of unbiasedness is not invariant to non-linear transformations."
Qwfp (talk) 10:28, 10 July 2009 (UTC)
I'm sorry that the original sentence was imperfect. The added content is objectionable, because median-unbiased estimators are invariant under monotone transformations. Kiefer.Wolfowitz (talk) 13:00, 10 July 2009 (UTC)
I updated the last paragraph. It would be useful to add a paragraph mentioning that sometimes Bayesian or maximum-likelihood estimators are preferable, as recognized since the works of Charles Stein in the 1950s (or Laplace even earlier, it is reported!). Kiefer.Wolfowitz (talk) 13:18, 10 July 2009 (UTC)

Biased picture against mean-unbiasedness

It's wrong to have a geometric picture in the introduction, giving an abstract example of where mean-bias is okay, before the civilian reader has a chance to understand the concepts. This picture should be moved below or just removed, the latter being my preference. (I hid this picture, allowing its easy restoration if I'm wrong. Kiefer.Wolfowitz (talk) 13:52, 10 July 2009 (UTC)) Kiefer.Wolfowitz (talk) 14:02, 10 July 2009 (UTC)

I agree that having that picture in introduction section is probably not the most appropriate. However the question of "why bias is not necessarily bad" is counterintuitive and thus deserves thorough explanation preferably in lay-person terms with illustrations and such. Since right now the only place this question is discussed is in the lead section, I had to put my picture there. I'd agree that it'll be better to move the picture below once the "below" emerges from nothingness :)
As for the previous comment that it would be more "obvious" to have two distribution functions of one-dimensional random variables, I have to disagree. With two distribution functions, one short, fat and unbiased, the other tall, thin and biased, it won't be obvious whether the latter is actually better or worse than the former. // Stpasha (talk) 05:01, 15 July 2009 (UTC)
I agree that the picture does not belong there, it is also very confusing. The estimator is an oval? How can an oval be biased or unbiased? Perhaps the text regarding it could be cleared up and this would not be a problem. PDBailey (talk) 17:33, 15 July 2009 (UTC)
Well, the idea was that the ellipses represent the region where the density function of the estimator is concentrated, assuming a normal distribution. It could be interpreted either as a confidence region at the standard 95% level, or as the variance-covariance matrix of the estimator (which corresponds to a 66% confidence region). In this case the center of the ellipse is the expected value of the estimator, and the fact that one of the ellipses lies strictly inside the other allows us to infer that the smaller one is "better" w.r.t. MSE... // Stpasha (talk) 22:30, 15 July 2009 (UTC)
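For what it's worth, such ellipses can be generated directly from a mean vector and covariance matrix (a sketch; the matrices below are made up):

    import numpy as np
    import matplotlib.pyplot as plt

    # Sketch: 1-sigma ellipse of a 2-d estimator from its mean and covariance.
    def ellipse(mean, cov, k=1.0, num=200):
        t = np.linspace(0, 2 * np.pi, num)
        circle = np.stack([np.cos(t), np.sin(t)])
        L = np.linalg.cholesky(cov)
        return mean[:, None] + k * L @ circle

    truth = np.array([0.0, 0.0])
    e1 = ellipse(truth, np.array([[1.0, 0.3], [0.3, 1.0]]))   # unbiased, wide
    e2 = ellipse(np.array([0.2, 0.1]),
                 np.array([[0.2, 0.0], [0.0, 0.2]]))          # biased, tight
    plt.plot(*e1, label="unbiased")
    plt.plot(*e2, label="biased")
    plt.scatter(*truth, marker="x", label="true value")
    plt.legend()
    plt.show()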
I applaud the initiative. However, I think this example would be tough for anybody who doesn't understand covariance matrices, and there are simpler examples to explain that mean-unbiasedness can sometimes be sacrificed. Thanks for replying thoughtfully, as usual! Best regards, Kiefer.Wolfowitz (talk) 20:53, 17 July 2009 (UTC)

Sign of maximum-likelihood estimator bias?

In "Estimating a Poisson Probability", the article states:

The bias of the maximum-likelihood estimator is:

$e^{-2\lambda} - e^{\lambda(1/e^2 - 1)} .$

From the definition of bias, I get:

$\operatorname{E}\left[e^{-2X}\right] - e^{-2\lambda} = e^{\lambda(1/e^2 - 1)} - e^{-2\lambda} ,$

which comes out with the opposite sign. Gnartbx (talk) 15:26, 7 April 2010 (UTC)
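A quick simulation settles the sign (sketch; λ is arbitrary). By the definition Bias = E[δ(X)] − e^{−2λ}, the value comes out positive, since E[e^{−2X}] = e^{λ(1/e² − 1)} > e^{−2λ}:

    import numpy as np

    # Sketch: sign of the bias of the MLE e^{-2X} for e^{-2*lambda},
    # with X ~ Poisson(lam).
    rng = np.random.default_rng(7)
    lam = 1.5
    x = rng.poisson(lam, size=1_000_000)
    bias = np.exp(-2 * x).mean() - np.exp(-2 * lam)
    print(bias)                                               # positive
    print(np.exp(lam * (np.exp(-2) - 1)) - np.exp(-2 * lam))  # closed form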

Proof of Sample Variance Biasedness

From the penultimate to the last line of the proof in that section: Doesn't the cross term disappear because the expectation enters the bracket? Leaving the expectation on the outside is confusing, I feel. JayMidas (talk) 13:25, 3 May 2016 (UTC)
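If the proof in question is the standard expansion of $\sum_i \left[(X_i - \mu) - (\overline{X} - \mu)\right]^2$, the cross term merges with the $(\overline{X} - \mu)^2$ term as an algebraic identity for every sample, before any expectation is taken. A numeric sanity check (sketch; μ and the sample are arbitrary):

    import numpy as np

    # Sketch: the cross term -2(xbar-mu)*sum(x_i - mu)/n equals
    # -2*(xbar-mu)^2 identically, for any sample; no expectation needed.
    rng = np.random.default_rng(8)
    mu = 1.0
    x = rng.normal(mu, 2.0, size=9)
    xbar = x.mean()
    cross = -2 * (xbar - mu) * (x - mu).sum() / len(x)
    print(cross, -2 * (xbar - mu) ** 2)   # identical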

External links modified

Hello fellow Wikipedians,

I have just modified one external link on Bias of an estimator. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ for additional information. I made the following changes:

When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.

This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 5 June 2024).

  • If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
  • If you found an error with any archives or the URLs themselves, you can fix them with this tool.

Cheers.—InternetArchiveBot (Report bug) 15:49, 19 July 2017 (UTC)