  1. What is the difference between an estimator and a statistic?

    An "estimator" or "point estimate" is a statistic (that is, a function of the data) that is used to infer the value of an unknown parameter in a statistical model. So a statistic refers to the data itself …

  2. What is the relation between estimator and estimate?

    Feb 24, 2011 · In Lehmann's formulation, almost any formula can be an estimator of almost any property. There is no inherent mathematical link between an estimator and an estimand. …
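
    A hedged illustration of the usual terminology (not a quote from the answers): the estimator is the rule or formula, e.g. $\bar{X}_n = \frac{1}{n}\sum_{i=1}^{n} X_i$, while the estimate is the number that rule returns on a particular data set, e.g. $\bar{x} = 3.2$.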

  3. What is the difference between estimation and prediction?

    Sep 15, 2018 · An estimator uses data to guess at a parameter while a predictor uses the data to guess at some random value that is not part of the dataset. For those who are unfamiliar with …
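
    A standard way to make the split concrete (an added sketch, not the answer's own example): in the linear model $y = X\beta + \varepsilon$, computing $\hat{\beta} = (X^\top X)^{-1} X^\top y$ is estimation of the fixed but unknown parameter $\beta$, while forming $\hat{y}_0 = x_0^\top \hat{\beta}$ for a new point $x_0$ is prediction of the random quantity $Y_0 = x_0^\top \beta + \varepsilon_0$; the extra noise term $\varepsilon_0$ is why prediction intervals are wider than confidence intervals for the mean response.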

  4. Estimator for a binomial distribution - Cross Validated

    Oct 7, 2011 · For a Bernoulli distribution I can think of an estimator for the parameter p, but for a binomial distribution I can't see what parameter to estimate when n already characterizes the distribution. …
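
    One reading of the question (an assumption about its intent, not the accepted answer): if $n$ is known and we observe $X_1, \dots, X_k \sim \mathrm{Bin}(n, p)$ independently, the only parameter left to estimate is $p$, and the maximum-likelihood estimator is

        $\hat{p} = \frac{1}{kn}\sum_{i=1}^{k} X_i,$

    i.e. the overall proportion of successes, just as in the Bernoulli case.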

  5. Variance of sample median - Cross Validated

    Oct 31, 2012 · Given a sample of n units out of a population of N, the population median can be estimated by the sample median. How can we get the variance of this estimator?
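
    A common practical route (a bootstrap sketch written in Python, ignoring the finite-population correction, and not necessarily the method used in the answers):

        import numpy as np

        def bootstrap_median_variance(sample, n_boot=2000, seed=0):
            """Approximate Var(sample median) by resampling the data with replacement."""
            rng = np.random.default_rng(seed)
            n = len(sample)
            medians = np.empty(n_boot)
            for b in range(n_boot):
                resample = rng.choice(sample, size=n, replace=True)
                medians[b] = np.median(resample)
            return medians.var(ddof=1)

        # usage: variance of the median of 200 draws from a skewed population
        sample = np.random.default_rng(1).exponential(scale=2.0, size=200)
        print(bootstrap_median_variance(sample))

    Asymptotically the sample median has variance roughly $\frac{1}{4 n f(m)^2}$, where $f$ is the population density at the median $m$; the bootstrap sidesteps having to estimate $f$.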

  6. random variable - When is the median-of-means estimator better …

    May 22, 2023 · When is the median-of-means estimator better than the standard mean?
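
    For reference, a minimal sketch of the median-of-means estimator itself (illustrative code, not from the thread); the usual motivation is that it concentrates better than the plain mean when the data are heavy-tailed:

        import numpy as np

        def median_of_means(x, n_blocks=10, seed=0):
            """Shuffle x, split it into n_blocks groups, average each group,
            and return the median of those block means."""
            rng = np.random.default_rng(seed)
            x = rng.permutation(np.asarray(x))
            blocks = np.array_split(x, n_blocks)
            return np.median([b.mean() for b in blocks])

        # heavy-tailed example: compare against the ordinary sample mean
        data = np.random.default_rng(1).standard_t(df=2, size=10_000)
        print(data.mean(), median_of_means(data))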

  7. How to derive the ridge regression solution? - Cross Validated

    I have recently stumbled upon the same question in the context of P-splines, and as the concept is the same I want to give a more detailed answer on the derivation of the ridge …
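
    The closed-form solution the question is after (summarized here as the standard result, not a transcription of the linked answer): minimizing $\lVert y - X\beta \rVert^2 + \lambda \lVert \beta \rVert^2$ and setting the gradient $-2X^\top(y - X\beta) + 2\lambda\beta$ to zero gives $(X^\top X + \lambda I)\hat{\beta} = X^\top y$, hence

        $\hat{\beta}_{\text{ridge}} = (X^\top X + \lambda I)^{-1} X^\top y,$

    which exists for every $\lambda > 0$ because $X^\top X + \lambda I$ is positive definite.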

  8. What is the difference between a consistent estimator and an …

    An estimator is unbiased if, on average, it hits the true parameter value. That is, the mean of the sampling distribution of the estimator is equal to the true parameter value.
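
    In symbols (a standard rendering of the two definitions, not quoted from the thread): $\hat{\theta}_n$ is unbiased if $E[\hat{\theta}_n] = \theta$ for every sample size $n$, and consistent if $\hat{\theta}_n \xrightarrow{p} \theta$ as $n \to \infty$. Neither property implies the other; for instance, $X_1$ alone is an unbiased but inconsistent estimator of a population mean.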

  9. probability - Proving estimator consistency - Cross Validated

    Jan 27, 2023 · The estimator $\hat{\sigma}^2_N$ is consistent if it converges in probability to $\sigma^2$. To prove consistency it is sufficient to show that $E[(\hat{\sigma}^2_N - …
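
    The truncated condition is presumably the usual $L^2$ route (stated here as the standard argument, not a reconstruction of the exact answer): if $E[(\hat{\sigma}^2_N - \sigma^2)^2] \to 0$, then by Markov's inequality applied to $(\hat{\sigma}^2_N - \sigma^2)^2$ we get $P(|\hat{\sigma}^2_N - \sigma^2| > \epsilon) \le E[(\hat{\sigma}^2_N - \sigma^2)^2] / \epsilon^2 \to 0$ for every $\epsilon > 0$, which is exactly convergence in probability.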

  10. Cramer-Rao bound for biased estimators - Cross Validated

    Nov 16, 2023 · Now say you have an estimator, and you are able to do the bias-variance decomposition above in relation to some test data. How might you assess whether your …
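
    For context (standard statements, not quoted from the question): the decomposition referred to is $E[(\hat{\theta} - \theta)^2] = \mathrm{Var}(\hat{\theta}) + b(\theta)^2$ with bias $b(\theta) = E[\hat{\theta}] - \theta$, and the Cramér-Rao bound for a biased estimator becomes

        $\mathrm{Var}(\hat{\theta}) \ge \frac{\bigl(1 + b'(\theta)\bigr)^2}{I(\theta)},$

    where $I(\theta)$ is the Fisher information.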