
Probability Limit and Convergence in Probability


Page 1: Probability Limit and Convergence in Probability

Probability Limit and Convergence in Probability

Page 2: Probability Limit and Convergence in Probability

Convergence Concepts

• This section treats the somewhat fanciful idea of allowing the sample size to approach infinity and investigates the behavior of certain sample quantities as this happens.

• We are mainly concerned with three types of convergence, and we treat them in varying amounts of detail.

Page 3: Probability Limit and Convergence in Probability

Sequences of Random Variables (X1, X2, …, Xn)

• Interested in behavior of functions of random variables such as means, variances, proportions

• For large samples, exact distributions can be difficult/impossible to obtain

• Limit theorems can be used to obtain properties of estimators as the sample size tends to infinity:
  – Convergence in Probability – limit of an estimator
  – Convergence in Distribution – limit of a CDF
  – Central Limit Theorem – large-sample distribution of the sample mean of a random sample

Page 4: Probability Limit and Convergence in Probability

The theoretical mean of the points shown by a fair die is

  1·(1/6) + 2·(1/6) + … + 6·(1/6) = 21/6 = 3.5

The average of the points shown after simulating 1,000 rolls of the die is

  (5 + 2 + … + 3 + … + 6)/1000 ≈ 21/6

Law of Large Numbers:

  lim n→∞ X̄n = E(X) = ∫ x f(x) dx
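
To make the die example concrete, here is a minimal Python sketch (my addition, not part of the original slides); the seed and the 1,000-roll sample size are arbitrary choices, and numpy is assumed to be available.

```python
import numpy as np

rng = np.random.default_rng(seed=1)          # arbitrary seed for reproducibility

# Theoretical mean of a fair die: 1*(1/6) + ... + 6*(1/6) = 21/6 = 3.5
theoretical_mean = sum(k / 6 for k in range(1, 7))

rolls = rng.integers(1, 7, size=1000)        # 1000 simulated rolls, values 1..6
simulated_mean = rolls.mean()                # (x1 + x2 + ... + x1000) / 1000

print(f"theoretical mean = {theoretical_mean:.4f}")
print(f"simulated mean   = {simulated_mean:.4f}")
# By the Law of Large Numbers, the simulated mean settles near 3.5 as the
# number of rolls grows.
```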

Page 5: Probability Limit and Convergence in Probability

Convergence in Probability

Page 6: Probability Limit and Convergence in Probability

Convergence in Probability

• The sequence of random variables X1, …, Xn is said to converge in probability to the constant c if, for every ε > 0,

  lim n→∞ P(|Xn − c| < ε) = 1

• Weak Law of Large Numbers (WLLN): Let X1, …, Xn be iid random variables with E(Xi) = μ and V(Xi) = σ² < ∞. Then the sample mean converges in probability to μ:

  lim n→∞ P(|X̄n − μ| ≥ ε) = 0  or  lim n→∞ P(|X̄n − μ| < ε) = 1,  where X̄n = (1/n) Σ_{i=1}^n Xi
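
As a rough illustration of the definition (my addition, not from the slides), the sketch below estimates P(|X̄n − μ| ≥ ε) by Monte Carlo for a Uniform(0,1) population; the choices of ε, the sample sizes, the seed, and the replication count are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(seed=2)
mu, eps, reps = 0.5, 0.05, 5_000             # Uniform(0,1) has mean mu = 0.5

for n in (10, 100, 1_000):
    xbar = rng.uniform(0.0, 1.0, size=(reps, n)).mean(axis=1)   # reps sample means of size n
    prob = np.mean(np.abs(xbar - mu) >= eps)                    # estimate of P(|Xbar_n - mu| >= eps)
    print(f"n = {n:5d}   P(|Xbar - mu| >= {eps}) ~= {prob:.4f}")
# The estimated probabilities shrink toward 0 as n grows, which is exactly
# what convergence in probability (and the WLLN) asserts.
```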

Page 7: Probability Limit and Convergence in Probability

Weak Law of Large Numbers (WLLN)

Page 8: Probability Limit and Convergence in Probability

Proof of WLLN

E(X̄) = μ and V(X̄) = σ²/n.

Chebyshev's Inequality: for any k > 0,

  P(μ − kσ < X < μ + kσ) ≥ 1 − 1/k²,  i.e.  P(|X − μ| ≥ kσ) ≤ 1/k²

Applying this to X̄, which has mean μ and standard deviation σ/√n:

  P(|X̄ − μ| ≥ kσ/√n) ≤ 1/k²

Let ε = kσ/√n, i.e. k = ε√n/σ. Then

  P(|X̄ − μ| ≥ ε) ≤ 1/k² = σ²/(nε²)

Therefore

  lim n→∞ P(|X̄ − μ| ≥ ε) ≤ lim n→∞ σ²/(nε²) = 0

so X̄ →Prob μ.
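
A small numeric check of the Chebyshev bound used in the proof (my addition). For a normal population X̄ is exactly normal, so the two-sided tail probability is available in closed form; the values of k are arbitrary and scipy is assumed to be available.

```python
from scipy.stats import norm

# For iid Normal(mu, sigma^2) data, Xbar ~ Normal(mu, sigma^2 / n) exactly, so
# P(|Xbar - mu| >= k * sigma / sqrt(n)) = 2 * (1 - Phi(k)) for every n.
for k in (1.5, 2.0, 3.0, 4.0):
    exact_tail = 2 * (1 - norm.cdf(k))       # exact two-sided tail probability
    chebyshev = 1 / k**2                     # Chebyshev's upper bound
    print(f"k = {k:3.1f}   exact tail = {exact_tail:.5f}   Chebyshev bound = {chebyshev:.5f}")
# The bound always holds (exact <= bound) even though it can be loose; with
# k = eps * sqrt(n) / sigma it gives P(|Xbar - mu| >= eps) <= sigma^2 / (n * eps^2) -> 0.
```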

Page 9: Probability Limit and Convergence in Probability

Other Cases / Rules

• Binomial Sample Proportions

  Xi = 1 if trial i is a success, 0 if trial i is a failure; the Xi are iid with E(Xi) = p and V(Xi) = p(1 − p).
  X = Σ_{i=1}^n Xi ~ Binomial(n, p), with E(X) = np and V(X) = np(1 − p).
  Let p̂ = X/n = X̄. Then E(p̂) = p and V(p̂) = p(1 − p)/n, so by the WLLN

    p̂ →Prob p

• Useful Generalizations: Suppose Xn →Prob x and Yn →Prob y. Then:
  1) Xn + Yn →Prob x + y
  2) Xn·Yn →Prob x·y
  3) Xn/Yn →Prob x/y   (provided y ≠ 0)
  4) √Xn →Prob √x   (provided P(Xn ≥ 0) = 1)
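
A short simulation sketch (my addition) of the sample proportion converging to p, plus the product rule from the generalizations above; p = 0.3, the sample sizes, and the seed are arbitrary, and numpy is assumed.

```python
import numpy as np

rng = np.random.default_rng(seed=3)
p = 0.3                                      # true success probability (arbitrary)

for n in (100, 10_000, 1_000_000):
    x = rng.binomial(1, p, size=n)           # Bernoulli trials: 1 = success, 0 = failure
    p_hat = x.mean()                         # sample proportion p_hat = X / n
    print(f"n = {n:8d}   p_hat = {p_hat:.4f}   p_hat*(1 - p_hat) = {p_hat * (1 - p_hat):.4f}")
# p_hat -> p = 0.3 in probability, and by the sum and product rules above
# p_hat * (1 - p_hat) -> p * (1 - p) = 0.21 in probability as well.
```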

Page 10: Probability Limit and Convergence in Probability

Convergence in Distribution

Page 11: Probability Limit and Convergence in Probability

Convergence in Distribution

• Let Yn be a random variable with CDF Fn(y).

• Let Y be a random variable with CDF F(y).

• If the limit as n → ∞ of Fn(y) equals F(y) for every point y where F(y) is continuous, then we say that Yn converges in distribution to Y

• F(y) is called the limiting distribution function of Yn

• If Mn(t) = E(e^(tYn)) converges to M(t) = E(e^(tY)), then Yn converges in distribution to Y

Page 12: Probability Limit and Convergence in Probability

Limiting Distribution

Page 13: Probability Limit and Convergence in Probability

Example – Binomial → Poisson

• Xn ~ Binomial(n, p). Let λ = np, so p = λ/n.

• Mn(t) = (pe^t + (1 − p))^n = (1 + p(e^t − 1))^n = (1 + λ(e^t − 1)/n)^n

• Aside: lim n→∞ (1 + a/n)^n = e^a

  ⇒ lim n→∞ Mn(t) = lim n→∞ (1 + λ(e^t − 1)/n)^n = exp(λ(e^t − 1))

• exp(λ(e^t − 1)) ≡ MGF of Poisson(λ)

⇒ Xn converges in distribution to Poisson(λ = np)
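
A quick numeric look at the same convergence (my addition): the Binomial(n, λ/n) pmf approaches the Poisson(λ) pmf as n grows. λ = 2, the grid of n values, and the use of scipy are my own choices.

```python
import numpy as np
from scipy.stats import binom, poisson

lam = 2.0                                    # lambda = n * p held fixed
ks = np.arange(0, 6)                         # compare P(Xn = k) for k = 0..5

for n in (10, 100, 10_000):
    p = lam / n
    gap = np.max(np.abs(binom.pmf(ks, n, p) - poisson.pmf(ks, lam)))
    print(f"n = {n:6d}   max |Binomial(n, lam/n) pmf - Poisson(lam) pmf| = {gap:.6f}")
# The gap shrinks toward 0, illustrating convergence in distribution of
# Binomial(n, lambda/n) to Poisson(lambda).
```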

Page 14: Probability Limit and Convergence in Probability

Example – Scaled Poisson → N(0,1)

X ~ Poisson(λ), with E(X) = V(X) = λ and M_X(t) = exp(λ(e^t − 1)).

Standardize: Y = (X − E(X))/√V(X) = (X − λ)/√λ, i.e. Y = aX + b with a = 1/√λ and b = −√λ.
For a linear transformation, M_{aX+b}(t) = e^(bt) M_X(at), so

  M_Y(t) = e^(−√λ t) · exp(λ(e^(t/√λ) − 1))

Aside: e^x = Σ_{i=0}^∞ x^i / i! = 1 + x + x²/2! + x³/3! + …

Therefore

  e^(t/√λ) = 1 + t/√λ + t²/(2! λ) + t³/(3! λ^(3/2)) + …

  λ(e^(t/√λ) − 1) = √λ t + t²/2! + t³/(3! √λ) + …

  M_Y(t) = exp(−√λ t + √λ t + t²/2! + t³/(3! √λ) + …) = exp(t²/2 + t³/(3! √λ) + …)

Now taking the limit as λ → ∞:

  lim λ→∞ M_Y(t) = exp(t²/2) ≡ MGF of N(0, 1)

⇒ Y = (X − λ)/√λ converges in distribution to N(0, 1) as λ → ∞.
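
The figure on the next page shows this comparison for λ = 25. The sketch below (my addition) makes the same comparison numerically for a few values of λ; the grid of y values and the use of scipy are my own choices.

```python
import numpy as np
from scipy.stats import norm, poisson

ys = np.linspace(-4, 4, 81)                          # grid of standardized values y

for lam in (5, 25, 400):
    xs = lam + ys * np.sqrt(lam)                     # P(Y <= y) = P(X <= lam + y*sqrt(lam))
    pois_cdf = poisson.cdf(np.floor(xs), lam)
    gap = np.max(np.abs(pois_cdf - norm.cdf(ys)))
    print(f"lambda = {lam:4d}   max |F_Y(y) - Phi(y)| on [-4, 4] = {gap:.4f}")
# The maximum gap shrinks as lambda grows, matching the N(0,1) limit derived
# above (and the lambda = 25 plot on the next page).
```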

Page 15: Probability Limit and Convergence in Probability

[Figure: Poisson vs. Normal CDF for Y = (X − λ)/√λ with λ = 25. The Poisson CDF and the standard normal (Z) CDF are plotted as F(y) (from 0 to 1) against y (from −6 to 8); the two curves nearly coincide.]

Page 16: Probability Limit and Convergence in Probability

Central Limit Theorem

Page 17: Probability Limit and Convergence in Probability

Central Limit Theorem

• Let X1, X2, …, Xn be a sequence of independent and identically distributed random variables with finite mean μ and finite variance σ². Then:

  √n (X̄ − μ)/σ →Dist N(0, 1),  where X̄ = (1/n) Σ_{i=1}^n Xi

• Thus the limiting distribution of the standardized sample mean is a normal distribution, regardless of the distribution of the individual measurements.
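
A minimal simulation sketch of the theorem (my addition): standardized means of heavily skewed Exponential(1) samples are compared with N(0,1). The population, sample sizes, replication count, and seed are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(seed=4)
mu, sigma, reps = 1.0, 1.0, 50_000           # Exponential(1): mean 1, sd 1, very skewed

for n in (2, 10, 100):
    x = rng.exponential(scale=1.0, size=(reps, n))
    z = np.sqrt(n) * (x.mean(axis=1) - mu) / sigma        # sqrt(n) * (Xbar - mu) / sigma
    print(f"n = {n:4d}   mean(Z) = {z.mean():+.3f}   var(Z) = {z.var():.3f}   "
          f"P(Z <= 1.96) = {np.mean(z <= 1.96):.3f}   (N(0,1) value: 0.975)")
# Even though the population is far from normal, the distribution of the
# standardized sample mean approaches N(0,1) as n increases.
```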

Page 18: Probability Limit and Convergence in Probability

Proof of Central Limit Theorem (I)

• Additional assumptions for this proof:
  – The moment-generating function of X, M_X(t), exists in a neighborhood of 0 (for all |t| < h, h > 0).
  – The third derivative of the MGF is bounded in a neighborhood of 0 (M^(3)(t) ≤ B < ∞ for all |t| < h, h > 0).

• Elements of the proof:
  – Work with Yi = (Xi − μ)/σ
  – Use Taylor's Theorem (Lagrange form)
  – Calculus result: lim n→∞ [1 + (a_n/n)]^n = e^a if lim n→∞ a_n = a  (a numeric check follows below)
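
A tiny numeric check of the calculus result in the last bullet (my addition); a = 2 and a_n = a + 1/√n are arbitrary illustrative choices.

```python
import math

a = 2.0
for n in (10, 1_000, 100_000, 10_000_000):
    a_n = a + 1 / math.sqrt(n)               # any sequence with a_n -> a works
    value = (1 + a_n / n) ** n
    print(f"n = {n:9d}   (1 + a_n/n)^n = {value:.6f}")
print(f"e^a             = {math.exp(a):.6f}")
```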

Page 19: Probability Limit and Convergence in Probability

Proof of CLT (II)

Define Yi = (Xi − μ)/σ (independent). Then E(Yi) = 0, V(Yi) = 1, and

  M_Y(t) = E(e^(tYi)) = E(e^(t(Xi − μ)/σ)) = e^(−tμ/σ) M_X(t/σ)

Also

  √n (X̄ − μ)/σ = (1/√n) Σ_{i=1}^n Yi

so

  M_{√n(X̄−μ)/σ}(t) = M_{(1/√n)ΣYi}(t)     This is our "target".

Page 20: Probability Limit and Convergence in Probability

Proof of CLT (III)

  M_{(1/√n)ΣYi}(t) = E(e^((t/√n)(Y1 + … + Yn))) = [E(e^((t/√n)Y))]^n = [M_Y(t/√n)]^n   (by independence)

Aside – Taylor's Theorem (Lagrange form):

  f(x) = f(a) + f′(a)(x − a) + … + f^(k)(a)(x − a)^k / k! + r_k(x)

  where r_k(x) = f^(k+1)(ξ)(x − a)^(k+1) / (k + 1)!  with ξ strictly between a and x.

Page 21: Probability Limit and Convergence in Probability

Proof of CLT (IV)

Taylor's Theorem (Lagrange form):
  f(x) = f(a) + f′(a)(x − a) + … + f^(k)(a)(x − a)^k/k! + f^(k+1)(ξ)(x − a)^(k+1)/(k + 1)!,  min(a, x) < ξ < max(a, x)

Current application: f(x) = M_Y(x), a = 0, k = 2, x = t/√n

  f(a) = M_Y(0) = E(e^(0·Y)) = E(1) = 1
  f′(a) = M_Y′(0) = E(Y) = 0
  f″(a) = M_Y″(0) = E(Y²) = V(Y) + [E(Y)]² = 1 + 0 = 1
  f^(3)(ξ) = M_Y^(3)(ξ) ≤ B   (previous assumption)

Therefore

  M_Y(t/√n) = 1 + 0·(t/√n) + (1/2)(t/√n)² + (M_Y^(3)(ξ)/3!)(t/√n)³
            = 1 + t²/(2n) + M_Y^(3)(ξ) t³/(6 n^(3/2))
            ≤ 1 + t²/(2n) + B t³/(6 n^(3/2))

Page 22: Probability Limit and Convergence in Probability

Proof of CLT (V)

  lim n→∞ M_{√n(X̄−μ)/σ}(t) = lim n→∞ [M_Y(t/√n)]^n
    = lim n→∞ [1 + t²/(2n) + B t³/(6 n^(3/2))]^n
    = lim n→∞ [1 + a_n/n]^n,   where a_n = t²/2 + B t³/(6 n^(1/2))

  lim n→∞ a_n = t²/2 ≡ a,  so

  lim n→∞ M_{√n(X̄−μ)/σ}(t) = e^a = e^(t²/2) ≡ MGF of N(0, 1)

⇒ √n (X̄ − μ)/σ →Dist N(0, 1)
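
For one concrete case the limit can be checked exactly (my addition): take Xi ~ Exponential(1), so Yi = Xi − 1 has MGF M_Y(t) = e^(−t)/(1 − t) for t < 1; the value t = 0.8 is arbitrary.

```python
import math

def m_y(t: float) -> float:
    """MGF of Y = X - 1 with X ~ Exponential(1): E[e^(tY)] = e^(-t) / (1 - t), t < 1."""
    return math.exp(-t) / (1 - t)

t = 0.8
for n in (10, 100, 10_000, 1_000_000):
    mgf_n = m_y(t / math.sqrt(n)) ** n       # [M_Y(t / sqrt(n))]^n, the exact MGF of the standardized mean
    print(f"n = {n:9d}   [M_Y(t/sqrt(n))]^n = {mgf_n:.6f}")
print(f"e^(t^2/2)        = {math.exp(t**2 / 2):.6f}")
# The exact MGF of sqrt(n)*(Xbar - mu)/sigma converges to e^(t^2/2), the MGF
# of N(0,1), just as the proof concludes.
```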

Page 23: Probability Limit and Convergence in Probability

Asymptotic Distribution

Obtaining an asymptotic distribution from a limiting distribution:
• Obtain the limiting distribution via a stabilizing transformation
• Assume the limiting distribution applies reasonably well in finite samples
• Invert the stabilizing transformation to obtain the asymptotic distribution

Asymptotic normality of a distribution:

  √n (x̄ − μ)/σ →d N[0, 1]

Assume this holds in finite samples. Then

  √n (x̄ − μ) ~a N[0, σ²]
  (x̄ − μ) ~a N[0, σ²/n]
  x̄ ~a N[μ, σ²/n]   ← the asymptotic distribution

σ²/n is the asymptotic variance.
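
A quick simulation check of the asymptotic approximation x̄ ~a N[μ, σ²/n] (my addition), using an Exponential(1) population; the sample sizes, replication count, and seed are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(seed=5)
mu, sigma, reps = 1.0, 1.0, 10_000           # Exponential(1): mean 1, sd 1

for n in (10, 100, 1_000):
    xbar = rng.exponential(scale=1.0, size=(reps, n)).mean(axis=1)
    cover = np.mean(xbar <= mu + 1.645 * sigma / np.sqrt(n))    # N[mu, sigma^2/n] predicts 0.95
    print(f"n = {n:5d}   Var(xbar) = {xbar.var():.5f}   sigma^2/n = {sigma**2 / n:.5f}   "
          f"P(xbar <= mu + 1.645*sigma/sqrt(n)) = {cover:.3f}")
# The variance of xbar tracks the asymptotic variance sigma^2/n, and the
# one-sided probability approaches 0.95 as the normal approximation improves.
```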

Page 24: Probability Limit and Convergence in Probability

Asymptotic Efficiency

Page 25: Probability Limit and Convergence in Probability

Asymptotic Efficiency

• Comparison of asymptotic variances
• How to compare consistent estimators? If both converge to constants, both variances go to zero.
  – Example: random sampling from the normal distribution
  – The sample mean is asymptotically normal [μ, σ²/n]
  – The median is asymptotically normal [μ, (π/2)σ²/n]
  – The mean is asymptotically more efficient (a simulation sketch follows below)
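
A simulation sketch of this comparison (my addition): sampling variances of the mean and median under normal sampling; n, the replication count, and the seed are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(seed=6)
n, reps = 200, 20_000
samples = rng.normal(loc=0.0, scale=1.0, size=(reps, n))       # mu = 0, sigma = 1

var_mean = samples.mean(axis=1).var()
var_median = np.median(samples, axis=1).var()

print(f"Var(sample mean)   ~= {var_mean:.5f}    theory: sigma^2/n        = {1 / n:.5f}")
print(f"Var(sample median) ~= {var_median:.5f}    theory: (pi/2)*sigma^2/n = {np.pi / (2 * n):.5f}")
print(f"variance ratio median/mean ~= {var_median / var_mean:.3f}    (pi/2 ~= {np.pi / 2:.3f})")
# The median's sampling variance is roughly pi/2 times larger, so the sample
# mean is the asymptotically more efficient estimator of mu here.
```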

Page 26: Probability Limit and Convergence in Probability

Properties of MLEs: Asymptotic Normality

One can derive that

  √n (θ̂ − θ0) →d N[0, {−E[(1/n) H(θ0)]}^(−1)]

which gives the asymptotic distribution of the MLE:

  θ̂ ~a N[θ0, {I(θ0)}^(−1)]

where H(θ0) is the Hessian of the log-likelihood and I(θ0) = −E[H(θ0)] is the information matrix.
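
A simulation sketch (my addition) for one model where everything is available in closed form: yi ~ Exponential with rate θ0, MLE θ̂ = 1/ȳ, and I(θ0) = n/θ0². The values θ0 = 2, n = 500, the replication count, and the seed are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(seed=7)
theta0, n, reps = 2.0, 500, 20_000

y = rng.exponential(scale=1 / theta0, size=(reps, n))   # Exponential with rate theta0
theta_hat = 1.0 / y.mean(axis=1)                        # MLE of the rate: 1 / ybar

inv_info = theta0**2 / n                                # inverse information theta0^2 / n
print(f"mean(theta_hat)   = {theta_hat.mean():.4f}   (theta0 = {theta0})")
print(f"Var(theta_hat)    = {theta_hat.var():.6f}")
print(f"1 / I(theta0)     = {inv_info:.6f}")
# The MLE is centered near theta0 with sampling variance close to the inverse
# information, consistent with theta_hat ~a N[theta0, {I(theta0)}^(-1)].
```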

Page 27: Probability Limit and Convergence in Probability

Properties of MLEs: Asymptotic Efficiency

Theorem 17.4 (Cramér–Rao Lower Bound): Assuming that the density of yi satisfies the regularity conditions R1–R3, the asymptotic variance of a consistent and asymptotically normally distributed estimator of the parameter vector θ0 will always be at least as large as

  [I(θ0)]^(−1) = {−E[∂² ln L(θ0) / ∂θ0 ∂θ0′]}^(−1) = {E[(∂ ln L(θ0)/∂θ0)(∂ ln L(θ0)/∂θ0′)]}^(−1)

For consistent estimators, the MLE achieves the CRLB and is therefore efficient.
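
As an illustrative worked bound (my addition, not from the slides), here is the CRLB for the exponential model used in the sketch on the previous page:

```latex
% Illustrative example: CRLB for an i.i.d. sample y_1, ..., y_n ~ Exponential(rate \theta)
\ln L(\theta) = n \ln \theta - \theta \sum_{i=1}^{n} y_i
\quad\Longrightarrow\quad
\frac{\partial^2 \ln L(\theta)}{\partial \theta^2} = -\frac{n}{\theta^2}

[I(\theta_0)]^{-1}
  = \left\{ -E\!\left[ \frac{\partial^2 \ln L(\theta_0)}{\partial \theta_0^2} \right] \right\}^{-1}
  = \frac{\theta_0^2}{n}
```

So no consistent, asymptotically normal estimator of θ0 can have asymptotic variance below θ0²/n, which is exactly the variance the MLE θ̂ = 1/ȳ attains in the sketch above.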

Page 28: Probability Limit and Convergence in Probability

Convergence in Mean Square

Page 29: Probability Limit and Convergence in Probability

Convergence in mean square

Page 30: Probability Limit and Convergence in Probability

Convergence in Probability

Almost Sure

Page 31: Probability Limit and Convergence in Probability

Almost Sure

Page 32: Probability Limit and Convergence in Probability

Not Almost Sure