
Bias of Maximum Likelihood Estimate of Mean and Variance of a Distribution

    Vikram Kamath

    1 MLE Estimates:

We know that the Maximum Likelihood Estimate for the Mean of a Distribution is:

    \hat{\mu} = \frac{1}{n} \sum_{i=1}^{n} x_i \qquad (1)

    And this is nothing but the sample mean.

    The Maximum Likelihood Estimate of the Variance is:

    \hat{\sigma}^2 = \frac{1}{n} \sum_{i=1}^{n} (x_i - \hat{\mu})^2 \qquad (2)

This is nothing but the sample variance.

We also know that the First Moment and Second Central Moment of a random variable x are given by:

    E[x] = \mu \quad \text{and} \quad E[(x - \mu)^2] = \sigma^2, \;\text{respectively} \qquad (3)
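As a concrete (and purely illustrative) addition, the two estimators in Eqs. (1) and (2) can be computed with a few lines of NumPy; note that numpy.var uses the 1/n normalization by default (ddof=0), so it matches Eq. (2):

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.normal(loc=5.0, scale=2.0, size=100)   # a sample of n = 100 draws

    mu_hat = x.mean()                         # Eq. (1): sample mean
    sigma2_hat = np.mean((x - mu_hat) ** 2)   # Eq. (2): MLE variance (divides by n)

    # numpy's default ddof=0 gives the same 1/n estimator
    assert np.isclose(sigma2_hat, x.var(ddof=0))
    print(mu_hat, sigma2_hat)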

    2 Bias of MLE Mean

    We can find the Bias of the MLE mean by calculating the value of:

E[\hat{\mu}]

    Which is given by:

    E[\hat{\mu}] = E\left[\frac{1}{n}\sum_{i=1}^{n} x_i\right] \quad \text{(from (1))}

    = \frac{1}{n} E\left[\sum_{i=1}^{n} x_i\right] = \frac{1}{n}\left(\sum_{i=1}^{n} E[x_i]\right)

    = \frac{1}{n}(n\mu) \quad \text{(from (3))}

    \Rightarrow E[\hat{\mu}] = \mu

    This shows that the MLE estimate of the mean of a distribution is not biased.
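As an added sanity check (not part of the original note), a small simulation: averaging \hat{\mu} over many repeated samples should reproduce the true mean \mu, consistent with E[\hat{\mu}] = \mu. The distribution and parameters below are arbitrary choices for illustration:

    import numpy as np

    rng = np.random.default_rng(1)
    mu, sigma, n, trials = 3.0, 1.5, 20, 200_000

    # `trials` independent samples of size n; one sample mean per row
    mu_hats = rng.normal(mu, sigma, size=(trials, n)).mean(axis=1)

    print(mu_hats.mean())   # close to mu = 3.0, as the unbiasedness result predicts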


    3 Bias of MLE Variance

    We can find the Bias of the MLE Variance by calculating the value of:

E[\hat{\sigma}^2]

    Which is given by

    E[\hat{\sigma}^2] = E\left[\frac{1}{n}\sum_{i=1}^{n} (x_i - \hat{\mu})^2\right] \quad \text{(from (2))}

    = \frac{1}{n} E\left[\sum_{i=1}^{n}\left(x_i^2 + \hat{\mu}^2 - 2\hat{\mu}x_i\right)\right]

    = \frac{1}{n} E\left[\sum_{i=1}^{n} x_i^2 + \sum_{i=1}^{n}\hat{\mu}^2 - 2\hat{\mu}\sum_{i=1}^{n} x_i\right]

    = \frac{1}{n} E\left[\sum_{i=1}^{n} x_i^2 + n\hat{\mu}^2 - 2\hat{\mu}(n\hat{\mu})\right] \quad \left(\text{from (1): } \hat{\mu} = \frac{1}{n}\sum_{i=1}^{n} x_i \;\Rightarrow\; \sum_{i=1}^{n} x_i = n\hat{\mu}\right)

    = \frac{1}{n} E\left[\sum_{i=1}^{n} x_i^2 - n\hat{\mu}^2\right]

    = \frac{1}{n}\left(\sum_{i=1}^{n} E[x_i^2] - E[n\hat{\mu}^2]\right)

    = \frac{1}{n}\left(\sum_{i=1}^{n} E[x_i^2] - n\,E[\hat{\mu}^2]\right)

    = \frac{1}{n}\sum_{i=1}^{n} E[x_i^2] - E[\hat{\mu}^2]


x_1, x_2, x_3, \ldots, x_n are identically distributed copies of the single random variable x, and hence E[x_i^2] can be replaced by E[x^2] above. The above hence becomes:

    = \frac{1}{n}\sum_{i=1}^{n} E[x^2] - E[\hat{\mu}^2]

    = \frac{1}{n}\, n\, E[x^2] - E[\hat{\mu}^2]

    = E[x^2] - E[\hat{\mu}^2] \qquad (4)

    We also know that (DIY Derivation or look it up):

    E[x^2] = E[(x - \mu)^2] + (E[x])^2

    \Rightarrow E[x^2] = \sigma^2 + \mu^2 \qquad (5)

    Similarly:

    E[\hat{\mu}^2] = E[(\hat{\mu} - \mu_{\hat{\mu}})^2] + (E[\hat{\mu}])^2

    \Rightarrow E[\hat{\mu}^2] = \sigma_{\hat{\mu}}^2 + \mu_{\hat{\mu}}^2 \qquad (6)

where \mu_{\hat{\mu}} = E[\hat{\mu}] and \sigma_{\hat{\mu}}^2 = \mathrm{Var}(\hat{\mu}) denote the mean and variance of the estimator \hat{\mu}.
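Since the note leaves these identities as an exercise, here is a one-line derivation of Eq. (5), added for completeness; it just expands the square inside the expectation:

    E[(x - \mu)^2] = E[x^2 - 2\mu x + \mu^2] = E[x^2] - 2\mu E[x] + \mu^2 = E[x^2] - \mu^2
    \;\Rightarrow\; E[x^2] = E[(x - \mu)^2] + (E[x])^2

The same algebra with \hat{\mu}, \mu_{\hat{\mu}} in place of x, \mu gives Eq. (6).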

Plugging the results of Eq. (5) and Eq. (6) into Eq. (4), we get:

    E[\hat{\sigma}^2] = (\sigma^2 + \mu^2) - (\sigma_{\hat{\mu}}^2 + \mu_{\hat{\mu}}^2) \qquad (7)

We will now use the following theorem (proof left as an exercise/DIY):

Theorem: If x_1, x_2, x_3, \ldots, x_n are n independent and identically distributed (i.i.d.) random variables, each with variance \sigma^2, then the variance of their sum is equal to n times that common variance. That is:

    \mathrm{Var}(x_1 + x_2 + x_3 + \cdots + x_n) = n\sigma^2 \qquad (8)
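The proof is left as an exercise in the note; a brief sketch (added here) uses the fact that independence makes every covariance term vanish:

    \mathrm{Var}\!\left(\sum_{i=1}^{n} x_i\right) = \sum_{i=1}^{n} \mathrm{Var}(x_i) + 2\sum_{i<j} \mathrm{Cov}(x_i, x_j) = n\sigma^2 + 0 = n\sigma^2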

We will now use another theorem (proof left as an exercise/DIY):

Theorem: If x is a random variable and c is a constant, then the variance of cx (denoted by \sigma_{cx}^2) is equal to c^2 times the variance of x. That is:

    \sigma_{cx}^2 = c^2 \sigma_x^2 \qquad (9)
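This one is also left as an exercise; a one-line sketch added for completeness, writing \mu_x = E[x]:

    \sigma_{cx}^2 = E\big[(cx - c\mu_x)^2\big] = c^2\, E\big[(x - \mu_x)^2\big] = c^2 \sigma_x^2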


Because \hat{\mu} is the sample mean, i.e.

    \hat{\mu} = \frac{1}{n}\sum_{i=1}^{n} x_i

    We can say that:

    \sigma_{\hat{\mu}}^2 = \mathrm{Var}(\hat{\mu}) = \mathrm{Var}\!\left(\frac{x_1 + x_2 + x_3 + \cdots + x_n}{n}\right)

Using Eq. (8) and Eq. (9) to evaluate the above, we get:

    \sigma_{\hat{\mu}}^2 = \mathrm{Var}\!\left(\frac{x_1 + x_2 + x_3 + \cdots + x_n}{n}\right) = \frac{n\sigma^2}{n^2} = \frac{\sigma^2}{n}

    \Rightarrow \sigma_{\hat{\mu}}^2 = \frac{\sigma^2}{n} \qquad (10)
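A quick numerical illustration of Eq. (10) (again an addition, with arbitrarily chosen parameters): the empirical variance of \hat{\mu} over many repeated samples should sit close to \sigma^2 / n:

    import numpy as np

    rng = np.random.default_rng(3)
    mu, sigma, n, trials = 0.0, 2.0, 10, 200_000

    mu_hats = rng.normal(mu, sigma, size=(trials, n)).mean(axis=1)
    print(mu_hats.var(ddof=0))   # close to sigma**2 / n = 0.4
    print(sigma**2 / n)          # 0.4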

We know that the MLE mean is not biased, so \mu_{\hat{\mu}} = E[\hat{\mu}] = \mu. Using this knowledge and Eq. (10), and substituting into Eq. (7), we get:

    E[\hat{\sigma}^2] = (\sigma^2 + \mu^2) - \left(\frac{\sigma^2}{n} + \mu^2\right)

    \Rightarrow E[\hat{\sigma}^2] = \sigma^2 - \frac{\sigma^2}{n}

    \Rightarrow E[\hat{\sigma}^2] = \frac{(n-1)\sigma^2}{n} \qquad (11)

This shows that the MLE Variance IS biased: on average it underestimates \sigma^2 by a factor of (n-1)/n.
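To close, a short simulation (an illustrative addition, with arbitrarily chosen parameters) that exhibits the bias of Eq. (11) and shows that the familiar n - 1 normalization removes it:

    import numpy as np

    rng = np.random.default_rng(2)
    mu, sigma, n, trials = 0.0, 2.0, 5, 200_000

    samples = rng.normal(mu, sigma, size=(trials, n))

    mle_vars = samples.var(axis=1, ddof=0)        # Eq. (2): divide by n
    print(mle_vars.mean())                        # close to (n-1)/n * sigma**2 = 3.2
    print((n - 1) / n * sigma**2)                 # 3.2, as predicted by Eq. (11)

    unbiased_vars = samples.var(axis=1, ddof=1)   # divide by n - 1 instead
    print(unbiased_vars.mean())                   # close to sigma**2 = 4.0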
