
Chapter 9: Monte Carlo Integration

One of the most important application areas of the Monte Carlo method is the computation of integrals and multiple integrals. It is suited to problems in which:

1. the integrand or the integration boundary is complicated, so that analytic methods or ordinary numerical methods are hard to apply;

2. the explicit form of the integrand is unknown, and only function values returned by a simulation are available.

This chapter covers several Monte Carlo methods for evaluating definite integrals: the hit-or-miss (uniform point-casting) method, the sample-mean (expectation estimate) method, the importance sampling method, semi-analytic methods, …

Goal: evaluate an integral

$$I = \int_a^b g(x)\,dx$$

Why use random methods?

Computation by “deterministic quadrature” can become expensive and inaccurate:

- grid points add up quickly in high dimensions;
- bad choices of grid may misrepresent g(x).

The Monte Carlo method can be used to compute an integral of any dimension d (d-fold integrals).

Error comparison for d-fold integrals:

- Simpson's rule and similar quadratures: $E \propto 1/N^{4/d}$ (an order-4 rule applied in each of the d dimensions);
- Monte Carlo: $E \propto 1/N^{1/2}$, purely statistical: it does not rely on the dimension!

The Monte Carlo method WINS when d >> 3.

For comparison, Simpson's rule approximates the integral of a function f using quadratic polynomials:

$$\int_{x_0}^{x_2} f(x)\,dx \approx \frac{h}{3}\left[f(x_0) + 4f(x_1) + f(x_2)\right], \qquad x_1 - x_0 = x_2 - x_1 = h$$

Chapter 9: Monte Carlo Integration

Hit-or-Miss Method

Sample Mean Method

Variance Reduction Technique

Variance Reduction using Rejection Technique

Importance Sampling Method

Hit-or-Miss Method

Evaluation of a definite integral

$$I = \int_a^b g(x)\,dx, \qquad h \ge g(x)\ \text{for any } x \in [a,b]$$

[Figure: random points scattered in the rectangle $[a,b] \times [0,h]$; points under the curve g(x) are hits, points above it are misses]

Probability that a random point resides inside the area under the curve:

$$p = \frac{I}{(b-a)\,h} \approx \frac{M}{N}
\quad\Longrightarrow\quad
I \approx (b-a)\,h\,\frac{M}{N}$$

N : total number of points
M : points that reside inside the region (under the curve)

Hit-or-Miss Method

Sample uniformly from the rectangular region $[a,b] \times [0,h]$. The probability that a point falls below the curve is

$$p = \frac{I}{h(b-a)}$$

So, if we can estimate p, we can estimate I:

$$\hat{I} = \hat{p}\,h(b-a)$$

where $\hat{p}$ is our estimate of p. We can easily estimate p: throw N “uniform darts” at the rectangle, let M be the number of times you end up under the curve y = g(x), and let

$$\hat{p} = \frac{M}{N}$$

Hit-or-Miss Method

Start
  Set N : large integer
  M = 0
  Loop N times:
    Choose a point x in [a,b]:  x = (b - a) u_1 + a
    Choose a point y in [0,h]:  y = h u_2
    If (x, y) resides inside (y <= g(x)) then M = M + 1
  I = (b - a) h (M/N)
End
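The loop above translates directly into code. Here is a minimal Python sketch; the integrand g(x) = e^x on [0, 3] with h = e^3 is borrowed from the worked example later in this chapter:

```python
import math
import random

def hit_or_miss(g, a, b, h, n, rng):
    """Estimate I = integral of g over [a,b] by uniform sampling in
    the rectangle [a,b] x [0,h]; h must satisfy h >= g(x) on [a,b]."""
    m = 0
    for _ in range(n):
        x = a + (b - a) * rng.random()   # x = (b-a) u1 + a
        y = h * rng.random()             # y = h u2
        if y <= g(x):                    # "hit": point under the curve
            m += 1
    return (b - a) * h * m / n           # I ~ (b-a) h (M/N)

# Running example: integral of e^x over [0,3] = e^3 - 1 ~ 19.0855
rng = random.Random(0)
est = hit_or_miss(math.exp, 0.0, 3.0, math.exp(3.0), 100_000, rng)
```

With N = 100,000 the statistical error (derived on the next slides) is about 0.09, so the estimate should land well within 0.5 of the true value.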

Hit-or-Miss Method: Error Analysis

It is important to know how accurate the results of simulations are. Note that M is binomial(N, p):

$$E(M) = Np, \qquad \sigma^2(M) = Np(1-p)$$

The estimator is unbiased:

$$E(\hat{I}) = h(b-a)\,E(\hat{p}) = \frac{h(b-a)}{N}\,E(M) = h(b-a)\,p = I$$

Its variance, with $\hat{p} = M/N$:

$$\sigma^2(\hat{I}) = h^2(b-a)^2\,\sigma^2(\hat{p}) = \frac{h^2(b-a)^2}{N^2}\,\sigma^2(M) = \frac{h^2(b-a)^2\,p(1-p)}{N}$$

Hence the standard error is

$$\sigma(\hat{I}) = h(b-a)\,\sqrt{\frac{p(1-p)}{N}} \propto \frac{1}{\sqrt{N}}$$
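The derived formula $\sigma(\hat{I}) = h(b-a)\sqrt{p(1-p)/N}$ can be checked empirically: repeat the hit-or-miss estimate many times and compare the spread of the estimates with the prediction. A sketch, again using the e^x example:

```python
import math
import random
import statistics

def hit_or_miss(g, a, b, h, n, rng):
    # One hit-or-miss estimate: count points (x, y) under the curve.
    m = 0
    for _ in range(n):
        x = a + (b - a) * rng.random()
        y = h * rng.random()
        if y <= g(x):
            m += 1
    return (b - a) * h * m / n

# Repeat the estimate 200 times and compare the empirical standard
# deviation with the predicted sigma(I_hat) = (b-a) h sqrt(p(1-p)/N).
rng = random.Random(42)
a, b, h, n = 0.0, 3.0, math.exp(3.0), 2000
p = (math.exp(3.0) - 1.0) / ((b - a) * h)            # exact p = I / ((b-a) h)
theory = (b - a) * h * math.sqrt(p * (1.0 - p) / n)
estimates = [hit_or_miss(math.exp, a, b, h, n, rng) for _ in range(200)]
empirical = statistics.stdev(estimates)
```

With 200 replications the empirical standard deviation itself fluctuates by roughly 5%, so it should agree with the theory to well within 30%.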


Sample Mean Method

Start
  Set N : large integer
  s1 = 0, s2 = 0
  Loop N times:
    x_n = (b - a) u_n + a
    y_n = g(x_n)
    s1 = s1 + y_n ;  s2 = s2 + y_n^2
  Estimate mean:  <g>' = s1/N
  Estimate variance:  V' = s2/N - (<g>')^2
End

$$I_{SM} = (b-a)\,\langle g\rangle', \qquad \text{error}(I_{SM}) = 0.6745\,(b-a)\sqrt{\frac{V'}{N}}$$
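The same algorithm as a minimal Python sketch, again using the chapter's e^x example as the integrand, and returning the probable error 0.6745 (b-a) sqrt(V'/N) from the slide:

```python
import math
import random

def sample_mean(g, a, b, n, rng):
    """Sample-mean Monte Carlo: I ~ (b-a) * average of g over uniform
    draws, plus the 'probable error' (50% confidence half-width)."""
    s1 = s2 = 0.0
    for _ in range(n):
        x = a + (b - a) * rng.random()   # x_n = (b-a) u_n + a
        y = g(x)                         # y_n = g(x_n)
        s1 += y
        s2 += y * y
    mean = s1 / n                        # <g>' = s1/N
    var = s2 / n - mean * mean           # V' = s2/N - (<g>')^2
    est = (b - a) * mean
    err = 0.6745 * (b - a) * math.sqrt(var / n)
    return est, err

rng = random.Random(1)
est, err = sample_mean(math.exp, 0.0, 3.0, 100_000, rng)
```

For this integrand and N = 100,000 the standard error is about 0.05, so the estimate should fall within 0.3 of e^3 - 1.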

Sample Mean Method

$$I = \int_a^b g(x)\,dx$$

Write this as:

$$I = \int_a^b g(x)\,dx = (b-a)\int_a^b g(x)\,\frac{1}{b-a}\,dx = (b-a)\,E\!\left[g(X)\right]$$

where X ~ unif(a,b).

Sample Mean Method

$$I = (b-a)\,E[g(X)], \qquad X \sim \text{unif}(a,b)$$

So, we will estimate I by estimating E[g(X)] with

$$\hat{E}[g(X)] = \frac{1}{n}\sum_{i=1}^{n} g(X_i)$$

where X_1, X_2, …, X_n is a random sample from the uniform(a,b) distribution.

Sample Mean Method

Example:

$$I = \int_0^3 e^x\,dx$$

(we know that the answer is $e^3 - 1 \approx 19.08554$)

Write this as

$$I = 3\int_0^3 e^x\,\frac{1}{3}\,dx = 3\,E\!\left[e^X\right], \qquad X \sim \text{unif}(0,3)$$

and estimate this with

$$\hat{I} = 3\,\hat{E}\!\left[e^X\right] = 3\,\frac{1}{n}\sum_{i=1}^{n} e^{X_i}$$

where X_1, X_2, …, X_n are n independent unif(0,3)'s.

Sample Mean Method

Simulation results (true value = 19.08554, n = 100,000):

Simulation    I-hat
1             19.10724
2             19.08260
3             18.97227
4             19.06814
5             19.13261

Sample Mean Method

Don't ever give an estimate without a confidence interval!

This estimator is “unbiased”:

$$E[\hat{I}] = E\!\left[(b-a)\,\frac{1}{n}\sum_{i=1}^{n} g(X_i)\right]
= (b-a)\,\frac{1}{n}\sum_{i=1}^{n} E[g(X_i)]
= (b-a)\int_a^b g(x)\,\frac{1}{b-a}\,dx
= \int_a^b g(x)\,dx = I$$

Sample Mean Method

$$\sigma_{\hat{I}}^2 := \mathrm{Var}[\hat{I}]
= \mathrm{Var}\!\left[(b-a)\,\frac{1}{n}\sum_{i=1}^{n} g(X_i)\right]
= \frac{(b-a)^2}{n^2}\sum_{i=1}^{n}\mathrm{Var}[g(X_i)] \quad \text{(independence)}$$

$$= \frac{(b-a)^2}{n}\,\mathrm{Var}[g(X)]
= \frac{(b-a)^2}{n}\int_a^b \left(g(x) - E[g(X)]\right)^2 \frac{1}{b-a}\,dx
= \frac{(b-a)^2}{n}\int_a^b \left(g(x) - \frac{I}{b-a}\right)^2 \frac{1}{b-a}\,dx$$

Sample Mean Method

An approximation, using the sample variance:

$$s_{\hat{I}}^2 = \frac{(b-a)^2}{n}\cdot\frac{1}{n-1}\sum_{i=1}^{n}\left(g(X_i) - \frac{\hat{I}}{b-a}\right)^2$$

Sample Mean Method

$$\hat{I} = (b-a)\,\frac{1}{n}\sum_{i=1}^{n} g(X_i)$$

X_1, X_2, …, X_n iid -> g(X_1), g(X_2), …, g(X_n) iid.

Let Y_i = g(X_i) for i = 1, 2, …, n. Then

$$\hat{I} = (b-a)\,\bar{Y}$$

and we can once again invoke the CLT.

Sample Mean Method

For n “large enough” (n > 30),

$$\hat{I} \;\stackrel{\cdot}{\sim}\; N\!\left(I,\ \sigma_{\hat{I}}^2\right)$$

So, a confidence interval for I is roughly given by

$$\hat{I} \pm z_{\alpha/2}\,\sigma_{\hat{I}}$$

but since we don't know $\sigma_{\hat{I}}$, we'll have to be content with the further approximation:

$$\hat{I} \pm z_{\alpha/2}\,s_{\hat{I}}$$
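Putting the CLT interval into code: a sketch that returns an approximate z-based confidence interval for the e^x example, with the sample standard error $s_{\hat{I}}$ standing in for the unknown $\sigma_{\hat{I}}$:

```python
import math
import random

def mc_confidence_interval(g, a, b, n, z, rng):
    """Sample-mean estimate of I with an approximate confidence
    interval I_hat +/- z * s_Ihat (valid for large n by the CLT)."""
    ys = [g(a + (b - a) * rng.random()) for _ in range(n)]
    mean = sum(ys) / n
    var = sum((y - mean) ** 2 for y in ys) / (n - 1)   # sample variance of g(X)
    est = (b - a) * mean
    half = z * (b - a) * math.sqrt(var / n)            # z * s_Ihat
    return est - half, est + half

# Roughly 95% interval (z_{alpha/2} = 1.96) for the integral of e^x over [0,3]
rng = random.Random(7)
lo, hi = mc_confidence_interval(math.exp, 0.0, 3.0, 100_000, 1.96, rng)
```

For this problem the interval width should come out near 0.19, centered close to e^3 - 1.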

Sample Mean Method

By the way… no one ever said that you have to use the uniform distribution.

Example:

$$I := \int_a^b x^{1/2}\,e^{-2x}\,dx$$

$$I = \frac{1}{2}\int_0^\infty x^{1/2}\,\mathbb{1}_{[a,b]}(x)\;2e^{-2x}\,dx
= \frac{1}{2}\,E\!\left[X^{1/2}\,\mathbb{1}_{[a,b]}(X)\right]$$

where X ~ exp(rate = 2) (density $2e^{-2x}$ for $x \ge 0$) and $\mathbb{1}_{[a,b]}$ is the indicator of [a,b].
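A sketch of this non-uniform estimator in Python, with [a,b] = [0,1] chosen as an assumed concrete interval for illustration:

```python
import math
import random

def integral_via_exponential(a, b, n, rng):
    """Estimate I = integral of sqrt(x) e^{-2x} over [a,b] as
    (1/2) E[sqrt(X) 1_[a,b](X)], X ~ Exponential(rate=2),
    whose density is 2 e^{-2x} on [0, inf)."""
    total = 0.0
    for _ in range(n):
        x = rng.expovariate(2.0)        # draw X from the exponential density
        if a <= x <= b:                 # indicator 1_[a,b](X)
            total += math.sqrt(x)
    return 0.5 * total / n

rng = random.Random(3)
est = integral_via_exponential(0.0, 1.0, 200_000, rng)
```

The result can be checked against a fine midpoint-rule evaluation of the same integral.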

Sample Mean Method

Comparison of Hit-or-Miss and Sample Mean Monte Carlo

Let $\hat{I}_{HM}$ be the hit-or-miss estimator of I, and let $\hat{I}_{SM}$ be the sample mean estimator of I. Then

$$\mathrm{Var}(\hat{I}_{SM}) \le \mathrm{Var}(\hat{I}_{HM})$$

Sample mean Monte Carlo is generally preferred over hit-or-miss Monte Carlo because:

- the estimator from SMMC has lower variance;
- SMMC does not require a non-negative integrand (or adjustments for one);
- hit-or-miss MC requires that you be able to put g(x) in a “box”, so you need to know the maximum value of g(x) over [a,b], and you need to be integrating over a finite interval.


Variance Reduction Technique: Introduction

Monte Carlo method and the sampling distribution: the Monte Carlo method takes values from a random sample. From the central limit theorem,

$$\sigma_{\bar{X}}^2 = \frac{\sigma^2}{N}$$

3-sigma rule:

$$P\!\left(\mu - 3\sigma_{\bar{X}} \le \bar{X} \le \mu + 3\sigma_{\bar{X}}\right) = 0.9973$$

Most probable error:

$$\text{Error} = 0.6745\,\frac{\sigma}{\sqrt{N}}$$

Important characteristic: $\text{Error} \propto 1/\sqrt{N}$.

Variance Reduction Technique: Introduction

Reducing the error: taking 100 times more samples reduces the error by only one order of magnitude (Error ∝ 1/√N). Reducing the variance instead is the idea behind variance reduction techniques. The value of the variance is closely related to how samples are taken:

- unbiased sampling;
- biased sampling: more points are taken in the important parts of the population.

Variance Reduction Technique: Motivation

If we are using the sample-mean Monte Carlo method,

$$I' = \frac{b-a}{N}\sum_{n=1}^{N} g(x_n) = (b-a)\,\bar{Y}$$

the variance depends very much on the behavior of g(x):

- if g(x) varies little, the variance is small;
- if g(x) = const, the variance is 0.

In the evaluation of the integral, points near the minimum contribute less to the summation, and points near the maximum contribute more. So sample more points near the peak: the “importance sampling strategy”.


Variance Reduction Technique

Variance reduction for the hit-or-miss method: in the domain [a,b] choose a comparison function w(x) with

$$w(x) \ge g(x), \qquad A = \int_a^b w(x)\,dx, \qquad W(x) = \int_a^x w(x')\,dx'$$

Sampling: draw u uniformly and set

$$Au = W(x) \quad\Longleftrightarrow\quad x = W^{-1}(Au)$$

[Figure: points generated in the area under the w(x) curve; points above g(x) are rejected, points below are accepted]

Points are generated on the area under the w(x) function; x is a random variable that follows the distribution w(x)/A.

Variance Reduction Technique

Points lying above g(x) are rejected:

$$y_n = w(x_n)\,u_n, \qquad
q_n = \begin{cases} 1 & \text{if } y_n \le g(x_n) \\ 0 & \text{if } y_n > g(x_n) \end{cases}$$

q       1     0
P(q)    r    1 - r        with r = I/A

Hence, with N' the number of accepted points,

$$I = A\,\frac{N'}{N}$$

Variance Reduction Technique

Error Analysis

$$E(Q) = r, \qquad \sigma^2(Q) = r(1-r)$$

$$I = A\,r = A\,E(Q)$$

$$\text{error}(I_{RJ}) = 0.67\,\sqrt{\frac{I(A-I)}{N}}$$

For the plain hit-or-miss method, A = (b - a)h. Error reduction: as A approaches I, the error goes to 0.

Variance Reduction Technique

Start
  Set N : large integer
  N' = 0
  Loop N times:
    Generate u_1;  x = W^{-1}(A u_1)
    Generate u_2;  y = u_2 w(x)
    If y <= g(x): accept the value, N' = N' + 1; else: reject the value
  I = A (N'/N)
End

$$\text{error}(I_{RJ}) = 0.67\,\sqrt{\frac{I'(A-I')}{N}}$$
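The algorithm above can be exercised on the chapter's running example, the integral of e^x over [0, 3]. The linear comparison function w(x) = 1 + c x with c = (e^3 - 1)/3 (the chord of the convex e^x, an assumed choice, not from the slides) satisfies w(x) >= e^x on [0, 3], and W(x) = x + c x^2/2 inverts in closed form:

```python
import math
import random

def variance_reduced_hit_or_miss(n, rng):
    """Hit-or-miss under a comparison function w(x) = 1 + c x >= e^x
    on [0, 3]: sample x from w/A via x = W^{-1}(A u1), then accept
    with a uniform point under w."""
    b = 3.0
    c = (math.exp(b) - 1.0) / b          # chord slope of e^x on [0, b]
    A = b + c * b * b / 2.0              # A = integral of w over [0, b]
    n_acc = 0
    for _ in range(n):
        u1, u2 = rng.random(), rng.random()
        # x = W^{-1}(A u1), where W(x) = x + c x^2 / 2 (solve the quadratic)
        x = (-1.0 + math.sqrt(1.0 + 2.0 * c * A * u1)) / c
        y = u2 * (1.0 + c * x)           # uniform point under w at this x
        if y <= math.exp(x):             # accept if it falls under g(x) = e^x
            n_acc += 1
    return A * n_acc / n                 # I ~ A N'/N

rng = random.Random(11)
est = variance_reduced_hit_or_miss(100_000, rng)
```

Here A is about 31.6 versus (b-a)h = 3 e^3 ~ 60.3 for the plain rectangular box, so the statistical error is roughly halved at the same N.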


Importance Sampling Method

Basic idea: put more points near the maximum, put fewer points near the minimum.

f(x): importance (weight) function; F(x): transformation function:

$$F(x) = \int_a^x f(x')\,dx', \qquad y = F(x), \quad x = F^{-1}(y)$$

Since

$$\frac{dy}{dx} = f(x), \qquad dx = \frac{dy}{f(x)},$$

we can write

$$I = \int_a^b g(x)\,dx = \int_a^b \frac{g(x)}{f(x)}\,f(x)\,dx = \int \frac{g(x)}{f(x)}\,dy$$

Define

$$\eta(x) = \frac{g(x)}{f(x)}, \qquad I = \int_a^b \eta(x)\,f(x)\,dx = \langle\eta\rangle_f$$

If we choose f(x) = c g(x), then the variance will be small (zero in the ideal case). The magnitude of the error depends on the choice of f(x).

Importance Sampling Method

Estimate of error:

$$I_f = \frac{1}{N}\sum_{n=1}^{N}\eta(x_n)$$

$$V(f) = \langle\eta^2\rangle_f - \langle\eta\rangle_f^2$$

$$\text{error}(I_{IS}) = 0.67\,\sqrt{\frac{V(f)}{N}}
= 0.67\,\sqrt{\frac{\langle\eta^2\rangle - \langle\eta\rangle^2}{N}}$$

Importance Sampling Method

Start
  Set N : large integer
  s1 = 0, s2 = 0
  Loop N times:
    Generate x_n according to f(x)
    eta_n = g(x_n) / f(x_n)
    Add eta_n to s1
    Add eta_n^2 to s2
  I' = s1/N ;  V' = s2/N - I'^2
End

$$\text{error}(I_{IS}) = 0.67\,\sqrt{\frac{V'}{N}}$$
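A sketch of this loop for the same running example, with an assumed importance density f(x) = (1 + c x)/A proportional to the chord of e^x on [0, 3] (my choice for illustration, not from the slides). Because f roughly follows the shape of g, the ratio eta = g/f varies far less than g itself, so the variance is reduced:

```python
import math
import random

def importance_sampling(n, rng):
    """Estimate I = integral of e^x over [0, 3] with importance
    density f(x) = (1 + c x) / A, sampled by inverting its CDF."""
    b = 3.0
    c = (math.exp(b) - 1.0) / b          # f is the normalized chord of e^x
    A = b + c * b * b / 2.0              # normalization: integral of 1 + c x
    s1 = s2 = 0.0
    for _ in range(n):
        u = rng.random()
        x = (-1.0 + math.sqrt(1.0 + 2.0 * c * A * u)) / c   # x drawn from f
        eta = math.exp(x) * A / (1.0 + c * x)               # eta_n = g/f
        s1 += eta
        s2 += eta * eta
    mean = s1 / n                        # I' = s1/N
    var = s2 / n - mean * mean           # V' = s2/N - I'^2
    return mean, 0.67 * math.sqrt(var / n)

rng = random.Random(5)
est, err = importance_sampling(50_000, rng)
```

At N = 50,000 the reported probable error comes out near 0.02, noticeably smaller than the plain sample-mean error at twice the sample size.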