# Estimators

## Estimators Topics

### Maximum likelihood

Maximum likelihood, also called the maximum likelihood method, is the procedure of finding the value of one or more parameters for a given statistic which makes the known likelihood distribution a maximum. The maximum likelihood estimate for a parameter $\mu$ is denoted $\hat\mu$.

For a Bernoulli distribution with known success probability $p$ (and $q=1-p$), setting

$$\frac{d}{d\theta}\left[\binom{N}{Np}\theta^{Np}(1-\theta)^{Nq}\right]=0\quad\Rightarrow\quad Np(1-\theta)-Nq\theta=0,\tag{1}$$

so maximum likelihood occurs for $\theta=p$. If $p$ is not known ahead of time, the likelihood function is

$$f(x_1,\ldots,x_n\mid p)=P(X_1=x_1,\ldots,X_n=x_n\mid p)\tag{2}$$
$$=p^{x_1}(1-p)^{1-x_1}\cdots p^{x_n}(1-p)^{1-x_n}\tag{3}$$
$$=p^{\sum x_i}(1-p)^{n-\sum x_i},\tag{4}$$

where $x_i=0$ or 1, and $i=1$, ..., $n$. Taking the logarithm and differentiating,

$$\ln f=\sum x_i\ln p+\left(n-\sum x_i\right)\ln(1-p)\tag{5}$$
$$\frac{\partial(\ln f)}{\partial p}=\frac{\sum x_i}{p}-\frac{n-\sum x_i}{1-p}=0.\tag{6}$$

Rearranging gives

$$\sum x_i-pn=0,\tag{7}$$

so

$$\hat p=\frac{\sum x_i}{n}.\tag{8}$$

For a normal distribution,

$$f(x_1,\ldots,x_n\mid\mu,\sigma)=\prod_{i=1}^n\frac{1}{\sigma\sqrt{2\pi}}\,e^{-(x_i-\mu)^2/(2\sigma^2)}\tag{9}$$
$$=(2\pi\sigma^2)^{-n/2}\exp\left[-\frac{\sum(x_i-\mu)^2}{2\sigma^2}\right],\tag{10}$$

so

$$\ln f=-\frac{n}{2}\ln(2\pi\sigma^2)-\frac{\sum(x_i-\mu)^2}{2\sigma^2}\tag{11}$$

and

$$\frac{\partial(\ln f)}{\partial\mu}=\frac{\sum(x_i-\mu)}{\sigma^2}=0,\tag{12}$$

giving

$$\hat\mu=\frac{\sum x_i}{n}.\tag{13}$$

Similarly,

$$\frac{\partial(\ln f)}{\partial\sigma}=-\frac{n}{\sigma}+\frac{\sum(x_i-\mu)^2}{\sigma^3}=0\tag{14}$$

gives

$$\hat\sigma=\sqrt{\frac{\sum(x_i-\hat\mu)^2}{n}}.\tag{15}$$

Note that in this case, the maximum likelihood standard deviation is the sample standard deviation, which is a biased estimator for the population standard deviation.

For a weighted normal distribution,

$$f=\prod_{i=1}^n\frac{1}{\sigma_i\sqrt{2\pi}}\,e^{-(x_i-\mu)^2/(2\sigma_i^2)}\tag{16}$$
$$\ln f=-\sum\ln\left(\sigma_i\sqrt{2\pi}\right)-\frac{1}{2}\sum\frac{(x_i-\mu)^2}{\sigma_i^2}\tag{17}$$
$$\frac{\partial(\ln f)}{\partial\mu}=\sum\frac{x_i-\mu}{\sigma_i^2}=0\tag{18}$$

gives

$$\hat\mu=\frac{\sum x_i/\sigma_i^2}{\sum 1/\sigma_i^2}.\tag{19}$$

The variance of the mean is then

$$\sigma_\mu^2=\sum\left(\frac{\partial\mu}{\partial x_i}\right)^2\sigma_i^2.\tag{20}$$

But

$$\frac{\partial\mu}{\partial x_i}=\frac{1/\sigma_i^2}{\sum 1/\sigma_j^2},\tag{21}$$

so

$$\sigma_\mu^2=\sum\sigma_i^2\frac{(1/\sigma_i^2)^2}{\left(\sum 1/\sigma_j^2\right)^2}\tag{22}$$
$$=\frac{\sum 1/\sigma_i^2}{\left(\sum 1/\sigma_j^2\right)^2}\tag{23}$$
$$=\frac{1}{\sum 1/\sigma_i^2}.\tag{24}$$

For a Poisson distribution,

$$f=\frac{e^{-\lambda}\lambda^{x_1}}{x_1!}\cdots\frac{e^{-\lambda}\lambda^{x_n}}{x_n!}=\frac{e^{-n\lambda}\lambda^{\sum x_i}}{x_1!\cdots x_n!}\tag{25}$$
$$\ln f=-n\lambda+\left(\sum x_i\right)\ln\lambda-\ln\left(x_1!\cdots x_n!\right)\tag{26}$$
$$\frac{d(\ln f)}{d\lambda}=-n+\frac{\sum x_i}{\lambda}=0\tag{27}$$
$$\hat\lambda=\frac{\sum x_i}{n}.\tag{28}$$
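As a numerical sanity check (a minimal sketch, not part of the derivation above; the sample size, the true parameter 0.3, and the grid resolution are all arbitrary choices), the closed-form Bernoulli estimate $\hat p=\sum x_i/n$ of equation (8) can be compared against a brute-force maximization of the log-likelihood of equation (5):

```python
import math
import random

random.seed(0)

# Hypothetical data: n = 1000 Bernoulli trials with true p = 0.3 (both arbitrary).
n = 1000
xs = [1 if random.random() < 0.3 else 0 for _ in range(n)]
s = sum(xs)

def log_likelihood(p):
    """Bernoulli log-likelihood, equation (5): s ln p + (n - s) ln(1 - p)."""
    return s * math.log(p) + (n - s) * math.log(1 - p)

# Brute-force maximization over a grid of candidate p values in (0, 1).
grid = [i / 10000 for i in range(1, 10000)]
p_numeric = max(grid, key=log_likelihood)

# Closed-form MLE from equation (8): p_hat = sum(x_i) / n.
p_closed = s / n

print(p_numeric, p_closed)
```

The two values agree to within the grid resolution, as the derivation predicts.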

### Estimator bias

The bias of an estimator $\hat\theta$ is defined as

$$B(\hat\theta)=\left\langle\hat\theta\right\rangle-\theta.\tag{1}$$

It is therefore true that

$$\hat\theta-\theta=\left(\hat\theta-\left\langle\hat\theta\right\rangle\right)+\left(\left\langle\hat\theta\right\rangle-\theta\right)\tag{2}$$
$$=\left(\hat\theta-\left\langle\hat\theta\right\rangle\right)+B(\hat\theta).\tag{3}$$

An estimator for which $B=0$ is said to be an unbiased estimator.
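Bias can be measured by simulation: average an estimator over many repeated samples and subtract the true parameter. The sketch below (sample size, trial count, and the standard-normal population are all arbitrary choices) does this for the divide-by-$n$ variance estimator, whose bias for normal samples is known to be $-\sigma^2/n$:

```python
import random

random.seed(1)

sigma2, n, trials = 1.0, 5, 20000  # arbitrary simulation settings

def sigma2_mle(xs):
    """Divide-by-n variance estimate (the biased, maximum-likelihood form)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

estimates = [sigma2_mle([random.gauss(0.0, 1.0) for _ in range(n)])
             for _ in range(trials)]
mean_est = sum(estimates) / trials

# B = <theta_hat> - theta; for this estimator theory gives B = -sigma^2/n = -0.2.
bias = mean_est - sigma2
print(bias)
```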

### Estimator

An estimator is a rule that tells how to calculate an estimate based on the measurements contained in a sample. For example, the sample mean $\bar x$ is an estimator for the population mean $\mu$.

The mean square error of an estimator $\hat\theta$ is defined by

$$\mathrm{MSE}=\left\langle\left(\hat\theta-\theta\right)^2\right\rangle.\tag{1}$$

Let $B$ be the estimator bias; then

$$\mathrm{MSE}=\left\langle\left(\hat\theta-\left\langle\hat\theta\right\rangle+\left\langle\hat\theta\right\rangle-\theta\right)^2\right\rangle\tag{2}$$
$$=\left\langle\left(\hat\theta-\left\langle\hat\theta\right\rangle\right)^2\right\rangle+2B\left\langle\hat\theta-\left\langle\hat\theta\right\rangle\right\rangle+B^2\tag{3}$$
$$=V\left(\hat\theta\right)+B^2,\tag{4}$$

where $V(\hat\theta)$ is the estimator variance (the cross term vanishes because $\left\langle\hat\theta-\left\langle\hat\theta\right\rangle\right\rangle=0$).
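The decomposition $\mathrm{MSE}=V+B^2$ of equation (4) can be checked empirically. In this sketch the estimator is a hypothetical shrunk sample mean, $\hat\theta=0.9\,\bar x$, chosen only so that both the bias and the variance are nonzero; all numeric settings are arbitrary:

```python
import random

random.seed(2)

theta, n, trials = 1.0, 10, 50000  # arbitrary simulation settings

def theta_hat(xs):
    """Hypothetical biased estimator: shrink the sample mean by 0.9."""
    return 0.9 * sum(xs) / len(xs)

ests = [theta_hat([random.gauss(theta, 1.0) for _ in range(n)])
        for _ in range(trials)]
mean_est = sum(ests) / trials

mse = sum((e - theta) ** 2 for e in ests) / trials       # equation (1)
var = sum((e - mean_est) ** 2 for e in ests) / trials    # V(theta_hat)
bias = mean_est - theta                                  # B

# Decomposition (4): MSE = V + B^2 holds exactly for these empirical averages.
print(mse, var + bias ** 2)
```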

### Wald's equation

Let $X_1$, ..., $X_N$ be a sequence of independent observations of a random variable $X$, and let the number of observations $N$ itself be chosen at random, independently of the $X_i$. Then Wald's equation states that the expectation value of the sum is equal to the expectation value of $N$ times the expectation value of $X$,

$$\left\langle\sum_{i=1}^{N}X_i\right\rangle=\langle N\rangle\langle X\rangle$$

(Wald 1945, Blackwell 1946, Wolfowitz 1947).
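Wald's equation is easy to verify by simulation. In this sketch (the distributions are arbitrary choices), $X$ is exponential with rate 2, so $\langle X\rangle=0.5$, and $N$ is uniform on $\{1,\ldots,10\}$ and drawn independently of the $X_i$, so $\langle N\rangle=5.5$:

```python
import random

random.seed(3)

def random_length_sum():
    """Draw N independently of the observations, then sum N draws of X."""
    n = random.randint(1, 10)
    return sum(random.expovariate(2.0) for _ in range(n))

trials = 200000
avg_sum = sum(random_length_sum() for _ in range(trials)) / trials

# Wald's equation predicts <sum X_i> = <N><X> = 5.5 * 0.5 = 2.75.
print(avg_sum)
```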

### Unbiased estimator

A quantity which does not exhibit estimator bias. An estimator $\hat\theta$ is an unbiased estimator of $\theta$ if

$$\left\langle\hat\theta\right\rangle=\theta.$$
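A standard example is the sample variance with Bessel's correction (divide by $n-1$ instead of $n$), which is an unbiased estimator of the population variance. The sketch below checks this by averaging over many simulated samples (all numeric settings are arbitrary):

```python
import random

random.seed(4)

sigma2, n, trials = 1.0, 5, 40000  # arbitrary simulation settings

def s2(xs):
    """Sample variance with Bessel's correction: divide by n - 1."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

avg = sum(s2([random.gauss(0.0, 1.0) for _ in range(n)])
          for _ in range(trials)) / trials

# <s^2> = sigma^2 = 1, so the estimator is unbiased.
print(avg)
```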

### Fisher's estimator inequality

Given an unbiased estimator $\hat\theta$ of $\theta$, so that $\langle\hat\theta\rangle=\theta$. Then

$$\mathrm{var}\left(\hat\theta\right)\geq\frac{1}{N\left\langle\left(\dfrac{\partial\ln f}{\partial\theta}\right)^2\right\rangle},$$

where var is the variance and $f$ is the probability density of a single observation.
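For a Bernoulli density, $\left\langle(\partial\ln f/\partial p)^2\right\rangle=1/\bigl(p(1-p)\bigr)$, so the bound on an unbiased estimator of $p$ is $p(1-p)/N$, and the sample mean is known to attain it. The sketch below (true $p$, sample size, and trial count are arbitrary choices) compares the simulated variance of $\hat p=\bar x$ with that bound:

```python
import random

random.seed(5)

p, N, trials = 0.3, 50, 40000  # arbitrary simulation settings

# Sampling distribution of p_hat = xbar over many repeated experiments.
ests = [sum(1 if random.random() < p else 0 for _ in range(N)) / N
        for _ in range(trials)]
m = sum(ests) / trials
var_hat = sum((e - m) ** 2 for e in ests) / trials

# Lower bound 1 / (N <(d ln f / d p)^2>) = p(1 - p) / N for Bernoulli data;
# the sample mean attains it exactly.
bound = p * (1 - p) / N
print(var_hat, bound)
```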

### Expectation value

The expectation value of a function $f(x)$ in a variable $x$ is denoted $\langle f(x)\rangle$ or $E\{f(x)\}$. For a single discrete variable, it is defined by

$$\langle f(x)\rangle=\sum_x f(x)P(x),\tag{1}$$

where $P(x)$ is the probability density function. For a single continuous variable, it is defined by

$$\langle f(x)\rangle=\int f(x)P(x)\,dx.\tag{2}$$

The expectation value satisfies

$$\langle ax+by\rangle=a\langle x\rangle+b\langle y\rangle\tag{3}$$
$$\langle a\rangle=a\tag{4}$$
$$\left\langle\sum x\right\rangle=\sum\langle x\rangle.\tag{5}$$

For multiple discrete variables,

$$\langle f(x_1,\ldots,x_n)\rangle=\sum_{x_1}\cdots\sum_{x_n}f(x_1,\ldots,x_n)P(x_1,\ldots,x_n).\tag{6}$$

For multiple continuous variables,

$$\langle f(x_1,\ldots,x_n)\rangle=\int\cdots\int f(x_1,\ldots,x_n)P(x_1,\ldots,x_n)\,dx_1\cdots dx_n.\tag{7}$$

The (multiple) expectation value satisfies

$$\langle(x-\mu_x)(y-\mu_y)\rangle=\langle xy-\mu_x y-x\mu_y+\mu_x\mu_y\rangle\tag{8}$$
$$=\langle xy\rangle-\mu_x\langle y\rangle-\langle x\rangle\mu_y+\mu_x\mu_y\tag{9}$$
$$=\langle xy\rangle-\mu_x\mu_y,\tag{10}$$

where $\mu_i$ is the mean for the variable $x_i$.
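The covariance identity of equation (10) can be checked on simulated data. In this sketch (the distributions and coefficients are arbitrary choices), $y=0.5x+\text{noise}$, so the true covariance is $0.5$:

```python
import random

random.seed(6)

trials = 200000
xs, ys = [], []
for _ in range(trials):
    x = random.gauss(1.0, 1.0)              # mu_x = 1
    y = 0.5 * x + random.gauss(2.0, 1.0)    # mu_y = 2.5, cov(x, y) = 0.5
    xs.append(x)
    ys.append(y)

mx = sum(xs) / trials
my = sum(ys) / trials
mxy = sum(x * y for x, y in zip(xs, ys)) / trials

# Identity (10): <(x - mu_x)(y - mu_y)> = <xy> - mu_x mu_y.
cov_direct = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / trials
cov_identity = mxy - mx * my
print(cov_direct, cov_identity)
```

The two computations agree exactly up to floating-point error, since the identity holds term by term for empirical averages as well.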