The Cramer-Rao Lower Bound for the Geometric Distribution
Introduction
Suppose that we have an unbiased estimator of a parameter of a distribution. The Cramer-Rao inequality provides a lower bound for the variance of any such estimator. After computing the Cramer-Rao Lower Bound (CRLB), we can check whether the unbiased estimator at hand attains the minimum variance among all unbiased estimators of the parameter.
The following is the Cramer-Rao Inequality.
Let $\hat{\theta}$ be an unbiased estimator of $\theta$, a parameter of a distribution with probability density function $f(x; \theta)$ and log-likelihood $\ell(\theta) = \ln f(x; \theta)$. Then the Cramer-Rao Inequality is given by:

$$\operatorname{Var}(\hat{\theta}) \geq \frac{1}{n I(\theta)},$$

where the function $I(\theta)$, known as the Fisher Information, is given by

$$I(\theta) = E\left[\left(\frac{\partial \ell(\theta)}{\partial \theta}\right)^{2}\right] = -E\left[\frac{\partial^{2} \ell(\theta)}{\partial \theta^{2}}\right],$$

and $n$ is the sample size from which the estimator is computed.

Thus the Cramer-Rao Lower Bound for the variance of $\hat{\theta}$ is given by $\mathrm{CRLB} = \frac{1}{n I(\theta)}$, that is, $\operatorname{Var}(\hat{\theta}) \geq \mathrm{CRLB}$.
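As a quick numerical illustration, the bound can be computed directly from the Fisher Information. This is a minimal sketch; the helper name crlb and the plugged-in values are illustrative, not from any particular library:

```python
def crlb(fisher_information: float, n: int) -> float:
    """Cramer-Rao Lower Bound: 1 / (n * I(theta))."""
    return 1.0 / (n * fisher_information)

# Hypothetical example: Fisher Information 4.0, sample size 50
print(crlb(4.0, 50))  # 0.005 -- no unbiased estimator can have smaller variance
```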
In this article, we will derive the Cramer-Rao Lower Bound for an estimator of the parameter $p$ of the geometric distribution.
The Geometric Distribution
The probability mass function of the geometric distribution (having parameter $p$, the probability of success on each trial) is given by:

$$f(x; p) = p(1 - p)^{x - 1}, \qquad x = 1, 2, 3, \ldots, \quad 0 < p \leq 1,$$

where $x$ counts the number of trials needed to obtain the first success.
Using standard probability theory, it follows that for the geometric distribution $E(X) = \frac{1}{p}$ and $\operatorname{Var}(X) = \frac{1 - p}{p^{2}}$.
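These moments are straightforward to verify by simulation. The following is a minimal sketch using NumPy, whose geometric sampler follows the same number-of-trials convention as the pmf above:

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.3
x = rng.geometric(p, size=1_000_000)  # samples on {1, 2, 3, ...}

print(x.mean(), 1 / p)            # sample mean vs. E(X) = 1/p, about 3.333
print(x.var(), (1 - p) / p**2)    # sample variance vs. Var(X) = (1-p)/p^2, about 7.778
```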
The Cramer-Rao Lower Bound for the Geometric Distribution
Consider first the Fisher Information for the geometric distribution with parameter $p$. The log-likelihood of a single observation is

$$\ell(p) = \ln f(x; p) = \ln p + (x - 1)\ln(1 - p),$$

so that

$$\frac{\partial \ell(p)}{\partial p} = \frac{1}{p} - \frac{x - 1}{1 - p} \qquad \text{and} \qquad \frac{\partial^{2} \ell(p)}{\partial p^{2}} = -\frac{1}{p^{2}} - \frac{x - 1}{(1 - p)^{2}}.$$

Taking expectations and using $E(X) = \frac{1}{p}$:

$$I(p) = -E\left[\frac{\partial^{2} \ell(p)}{\partial p^{2}}\right] = \frac{1}{p^{2}} + \frac{E(X) - 1}{(1 - p)^{2}} = \frac{1}{p^{2}} + \frac{\frac{1}{p} - 1}{(1 - p)^{2}} = \frac{1}{p^{2}} + \frac{1}{p(1 - p)} = \frac{1}{p^{2}(1 - p)}.$$
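As a sanity check on this closed form, the Fisher Information can also be estimated by Monte Carlo, since $I(p) = E\left[\left(\partial \ell / \partial p\right)^{2}\right]$. The sketch below (variable names are our own) compares the two:

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.3
x = rng.geometric(p, size=1_000_000)

# Score: derivative of ln p + (x - 1) ln(1 - p) with respect to p
score = 1 / p - (x - 1) / (1 - p)

print((score**2).mean())       # Monte Carlo estimate of I(p) = E[score^2]
print(1 / (p**2 * (1 - p)))    # closed form, about 15.873 for p = 0.3
```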
Thus, the Cramer-Rao Lower Bound for the variance of an estimator $\hat{p}$ of $p$ of the geometric distribution is given by:

$$\operatorname{Var}(\hat{p}) \geq \frac{1}{n I(p)} = \frac{p^{2}(1 - p)}{n}.$$
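To see the bound in action, one can simulate repeated samples and compare the variance of the maximum likelihood estimator $\hat{p} = 1/\bar{X}$ against the CRLB. This is a sketch only: the MLE is biased in finite samples, so its variance approaches the bound asymptotically rather than meeting it exactly.

```python
import numpy as np

rng = np.random.default_rng(0)
p, n, reps = 0.3, 200, 20_000

samples = rng.geometric(p, size=(reps, n))
p_hat = 1 / samples.mean(axis=1)   # MLE of p from each simulated sample

print(p_hat.var())                 # empirical variance of the estimator
print(p**2 * (1 - p) / n)          # CRLB = p^2 (1 - p) / n, about 0.000315
```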