Point estimation
In statistics, point estimation involves the use of sample data to calculate a single value (known as a statistic) which is to serve as a "best guess" or "best estimate" of an unknown (fixed or random) population parameter.
More formally, it is the application of a point estimator to the data.
In general, point estimation should be contrasted with interval estimation: such interval estimates are typically either confidence intervals in the case of frequentist inference, or credible intervals in the case of Bayesian inference.
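The contrast can be made concrete for a normal mean. The following is a minimal sketch (the sample, seed, and parameter values are illustrative assumptions), using NumPy and SciPy:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
data = rng.normal(loc=5.0, scale=2.0, size=100)  # hypothetical sample

# Point estimate: a single "best guess" for the population mean.
point_estimate = data.mean()

# Interval estimate: a 95% confidence interval for the same parameter.
ci_low, ci_high = stats.t.interval(
    0.95, df=len(data) - 1, loc=data.mean(), scale=stats.sem(data)
)

print(f"point estimate: {point_estimate:.3f}")
print(f"95% confidence interval: ({ci_low:.3f}, {ci_high:.3f})")
```

The point estimate is a single number, while the interval estimate reports a range together with a coverage level.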
Point estimators
- minimum-variance mean-unbiased estimator (MVUE), which minimizes the risk (expected loss) of the squared-error loss function
- best linear unbiased estimator (BLUE)
- minimum mean squared error (MMSE)
- median-unbiased estimator, which minimizes the risk of the absolute-error loss function
- maximum likelihood (ML)
- method of moments, generalized method of moments (maximum likelihood and the method of moments are illustrated in the sketch after this list)
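As a concrete illustration of the last two entries, the sketch below fits a gamma model both by the method of moments and by maximum likelihood; the simulated data and parameter values are assumptions chosen for illustration:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
data = rng.gamma(shape=3.0, scale=2.0, size=5000)  # hypothetical sample

# Method of moments: match the sample mean and variance to the
# theoretical moments of the Gamma(shape, scale) family.
m, v = data.mean(), data.var()
mom_shape, mom_scale = m**2 / v, v / m

# Maximum likelihood: SciPy fits shape and scale numerically
# (floc=0 pins the location parameter so the families match).
ml_shape, _, ml_scale = stats.gamma.fit(data, floc=0)

print(f"method of moments: shape={mom_shape:.3f}, scale={mom_scale:.3f}")
print(f"maximum likelihood: shape={ml_shape:.3f}, scale={ml_scale:.3f}")
```

For this family the two methods give different estimators; both converge to the true parameters as the sample grows.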
Bayesian point-estimation
Bayesian inference is typically based on the posterior distribution. Many Bayesian point estimators are statistics of central tendency of the posterior distribution, e.g., its mean, median, or mode (each computed in the sketch following the list below):
- Posterior mean, which minimizes the (posterior) risk (expected loss) for a squared-error loss function; in Bayesian estimation, the risk is defined in terms of the posterior distribution, as observed by Gauss.[1]
- Posterior median, which minimizes the posterior risk for the absolute-value loss function, as observed by Laplace.[2][3]
- Maximum a posteriori (MAP), which finds a maximum of the posterior distribution; for a uniform prior, the MAP estimator coincides with the maximum-likelihood estimator.
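All three estimators are available in closed form for a conjugate model. The sketch below assumes a hypothetical Bernoulli sample with a Beta(2, 2) prior, so the posterior is again a Beta distribution:

```python
import numpy as np
from scipy import stats

# Hypothetical coin-flip data: k successes in n trials, Beta(2, 2) prior.
n, k = 20, 14
a, b = 2.0, 2.0                      # prior hyperparameters (assumed)
post = stats.beta(a + k, b + n - k)  # conjugate posterior is Beta

post_mean = post.mean()                    # minimizes squared-error loss
post_median = post.ppf(0.5)                # minimizes absolute-error loss
post_map = (a + k - 1) / (a + b + n - 2)   # posterior mode (MAP)

print(f"posterior mean:   {post_mean:.4f}")
print(f"posterior median: {post_median:.4f}")
print(f"MAP estimate:     {post_map:.4f}")
print(f"MLE (for comparison): {k / n:.4f}")
```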
The MAP estimator has good asymptotic properties, even for many difficult problems on which the maximum-likelihood estimator runs into difficulties. For regular problems, where the maximum-likelihood estimator is consistent, the maximum-likelihood estimator ultimately agrees with the MAP estimator.[4][5][6] Bayesian estimators are admissible, by Wald's theorem.[5][7]
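In the conjugate model above this agreement is visible directly: the MAP estimate (a + k - 1)/(a + b + n - 2) approaches the maximum-likelihood estimate k/n as n grows. A short numeric check, with assumed prior hyperparameters and success probability:

```python
import numpy as np

rng = np.random.default_rng(2)
a, b, p = 2.0, 2.0, 0.7  # assumed prior hyperparameters and true probability

for n in (10, 100, 10_000):
    k = rng.binomial(n, p)
    mle = k / n
    map_est = (a + k - 1) / (a + b + n - 2)
    print(f"n={n:6d}  MLE={mle:.4f}  MAP={map_est:.4f}  "
          f"|diff|={abs(mle - map_est):.5f}")
```

The gap between the two estimates shrinks on the order of 1/n, reflecting the vanishing influence of the prior.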
The Minimum Message Length (MML) point estimator is grounded in Bayesian information theory and is not as directly related to the posterior distribution.
Special cases of Bayesian filters, such as the Kalman filter and the particle filter, are important in sequential estimation.
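As an illustration, the following is a minimal one-dimensional Kalman filter; the random-walk state model and the noise variances are assumptions chosen for the sketch. At each step the filtered mean is the Bayesian point estimate of the current state:

```python
import numpy as np

rng = np.random.default_rng(3)

# Scalar random-walk model (assumed): x_t = x_{t-1} + w_t,  y_t = x_t + v_t
q, r = 0.01, 0.25  # process and measurement noise variances (assumed)
x_true = np.cumsum(rng.normal(0, np.sqrt(q), 100))
y = x_true + rng.normal(0, np.sqrt(r), 100)

mean, var = 0.0, 1.0  # prior on the initial state
for y_t in y:
    # Predict: propagate the posterior through the random-walk dynamics.
    var += q
    # Update: the Kalman gain blends prediction and measurement.
    gain = var / (var + r)
    mean += gain * (y_t - mean)
    var *= 1 - gain

print(f"filtered point estimate of final state: {mean:.3f}")
print(f"true final state:                       {x_true[-1]:.3f}")
```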
Several methods of computational statistics, such as Markov chain Monte Carlo (MCMC), have close connections with Bayesian analysis.
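For instance, MCMC can approximate the posterior mean when no closed form is available. The sketch below runs a random-walk Metropolis sampler for a normal mean with a standard-normal prior; the data, proposal scale, and burn-in length are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(4)
data = rng.normal(1.5, 1.0, 50)  # hypothetical sample; known unit variance

def log_posterior(mu):
    # Standard-normal prior on mu plus Gaussian log-likelihood (assumed model).
    return -0.5 * mu**2 - 0.5 * np.sum((data - mu) ** 2)

samples, mu = [], 0.0
for _ in range(20_000):
    proposal = mu + rng.normal(0, 0.5)  # random-walk proposal
    if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(mu):
        mu = proposal                   # accept
    samples.append(mu)

# Discard burn-in, then use the sample mean as the posterior-mean estimate.
print(f"MCMC posterior-mean estimate: {np.mean(samples[5000:]):.3f}")
```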
Notes
1. Dodge, Yadolah, ed. (1987). Statistical Data Analysis Based on the L1-Norm and Related Methods: Papers from the First International Conference held at Neuchâtel, August 31–September 4, 1987. Amsterdam: North-Holland Publishing Co.
2. Dodge, Yadolah, ed. (1987). Statistical Data Analysis Based on the L1-Norm and Related Methods: Papers from the First International Conference held at Neuchâtel, August 31–September 4, 1987. Amsterdam: North-Holland Publishing Co.
3. Jaynes, E. T. (2007). Probability Theory: The Logic of Science (5th printing). Cambridge: Cambridge University Press. p. 172. ISBN 978-0-521-59271-0.
4. Ferguson, Thomas S. (1996). A Course in Large Sample Theory. Chapman & Hall. ISBN 0-412-04371-8.
5. Le Cam, Lucien (1986). Asymptotic Methods in Statistical Decision Theory. Springer-Verlag. ISBN 0-387-96307-3.
6. Ferguson, Thomas S. (1982). "An inconsistent maximum likelihood estimate". Journal of the American Statistical Association. 77 (380): 831–834. doi:10.1080/01621459.1982.10477894. JSTOR 2287314.
7. Lehmann, E. L.; Casella, G. (1998). Theory of Point Estimation (2nd ed.). Springer. ISBN 0-387-98502-6.