
Chapter 2: Quick Review about Probability and Introduction to Decision Theory (ML & MAP)


Page 1:

Chapter 2: Quick Review about Probability and Introduction to Decision Theory (ML & MAP)

Page 2:

Quick Review about Probability

P(A) = lim_{N→∞} N_A / N

Joint Probability

P(A,B) = lim_{N→∞} N_AB / N

P(A ∪ B) = P(A) + P(B) − P(A,B)

Conditional Probability

P(B|A) = P(A,B) / P(A)

Page 3:

Bayes Rule:

P(A,B) = P(B|A) P(A) = P(A|B) P(B)

Mutually Exclusive Events: A ∩ B = ∅

Statistically Independent:

P(A|B) = P(A)
P(A,B) = P(A) · P(B)
f(x,y) = f(x) · f(y)

Page 4:

"Probability": measures numerically the outcome of a random experiment.

"Random Variable": the rule or mapping by which a real number is assigned to each outcome.

"CDF": Cumulative Distribution Function

F(x) = P(X ≤ x)

"pdf": Probability Density Function

f(x) = dF(x)/dx

Page 5:
Page 6:

Marginal pdf:

f_x(x) = ∫ f(x,y) dy = ∫ f(x|y) f(y) dy
f_y(y) = ∫ f(x,y) dx = ∫ f(y|x) f(x) dx

with f(x,y) = f(x|y) · f(y).

Total probability of error:

P(error, S_i) = P(error|S_i) P(S_i)
P(error) = Σ_{i=1}^{N} P(error, S_i) = Σ_{i=1}^{N} P(error|S_i) P(S_i)

Page 7:
Page 8:
Page 9:

• Total number of days: 365×10 days (10 years) = N
• Number of rainy days: 1650 days = NR
• Number of days with an accident: 360 days = NA
• Number of days with both an accident and rain: 240 days = NAR
• ---------------------------------------------------------------------------
• Number of days with an accident and no rain: ? days = NAR'
• Number of days with no accident and rain: ? days = NA'R
• Number of days with no accident and no rain: ? = NA'R'
• Question: find the joint prob, marginal prob, and conditional prob

Page 10:

• Total number of days: 365×10 days (10 years) = N

• Number of rainy days: 1650 days = NR

• Number of days with an accident: 360 days = NA

• Number of days with both an accident and rain: 240 days = NAR

• -------------------------------------------------------------------------------------------------------

• Number of days with an accident and no rain: (360 − 240) days = NAR'

• Number of days with no accident and rain: (1650 − 240) days = NA'R

• Number of days with no accident and no rain: (3650 − 360) − (1650 − 240) = NA'R'

• Question: find the joint prob, marginal prob, and conditional prob
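The counting above can be turned into the requested probabilities directly; a minimal Python sketch (variable names are illustrative):

```python
# Counts from the slide: 10 years of daily observations.
N = 365 * 10          # total days = 3650
N_R = 1650            # rainy days
N_A = 360             # accident days
N_AR = 240            # rainy days with an accident

# Relative-frequency (joint) probabilities.
P_AR = N_AR / N                                 # accident and rain
P_AR_not = (N_A - N_AR) / N                     # accident, no rain
P_notA_R = (N_R - N_AR) / N                     # no accident, rain
P_notA_notR = ((N - N_A) - (N_R - N_AR)) / N    # neither

# Marginals recovered by summing the joint table.
P_A = P_AR + P_AR_not    # = N_A / N
P_R = P_AR + P_notA_R    # = N_R / N

# Conditional probability: P(accident | rain) = P(A, R) / P(R).
P_A_given_R = P_AR / P_R   # = 240/1650 ≈ 0.145

print(P_A, P_R, P_A_given_R)
```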

Page 11:

Example 1.

[Figure: discrete pmf of x over {−6, −3, 0, 3, 6} with probability masses 0.3, 0.2, 0.3, 0.1, 0.1]

A = (x ≤ 3.5): find P(A)
B = (x ≥ 0): find P(B)

Find P(A ∩ B)

Page 12:

Example 2.

[Figure: binary channel diagram S → R, with S ∈ {0, 1} and R ∈ {0, 1}; transition probability 0.9 on the correct branches, 0.1 on the crossed branches]

P(S=0) = 0.6
P(S=1) = 0.4

P(R=0|S=0) = 0.9
P(R=1|S=0) = 0.1

Find P(S=0|R=0)
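A sketch of the Bayes-rule computation for this example; the diagram's symmetric branches are assumed to give P(R=0|S=1) = 0.1 as well:

```python
# Priors and channel transition probabilities from the slide.
P_S0, P_S1 = 0.6, 0.4
P_R0_S0 = 0.9
P_R0_S1 = 0.1   # assumed from the symmetric channel diagram

# Total probability: P(R=0) = P(R=0|S=0) P(S=0) + P(R=0|S=1) P(S=1).
P_R0 = P_R0_S0 * P_S0 + P_R0_S1 * P_S1

# Bayes rule: P(S=0|R=0) = P(R=0|S=0) P(S=0) / P(R=0).
P_S0_R0 = P_R0_S0 * P_S0 / P_R0
print(P_S0_R0)   # ≈ 0.931
```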

Page 13:

Example 3.

[Figure: channel diagram S → R, with S ∈ {0, 1, 2}; transition probabilities 0.8, 0.1, 0.1 shown on the branches]

P(S=0) = 0.4
P(S=1) = 0.2
P(S=2) = 0.4

Find P(R=0)

Example 4.

f(x,y) = 6 exp(−2x − 3y) ;  x ≥ 0, y ≥ 0
       = 0 ;  otherwise

Find P(x ≤ 1, y ≤ 1)
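For Example 4 the pdf factorizes into two exponentials, so the probability (assuming the garbled inequalities are x ≤ 1, y ≤ 1) has a closed form; a quick check in Python:

```python
import math

# f(x, y) = 6 exp(-2x - 3y) = [2 exp(-2x)] * [3 exp(-3y)],
# so x and y are independent exponentials with rates 2 and 3.
p_closed = (1 - math.exp(-2)) * (1 - math.exp(-3))

# Cross-check with a midpoint-rule double integral over [0, 1]^2.
n = 500
h = 1.0 / n
p_numeric = sum(
    6 * math.exp(-2 * (i + 0.5) * h - 3 * (j + 0.5) * h) * h * h
    for i in range(n) for j in range(n)
)
print(p_closed, p_numeric)   # both ≈ 0.822
```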

Page 14:

Joint Moment

E(x^j y^k) = ∫∫ x^j y^k f(x,y) dx dy

Correlation: E(xy)
Covariance: E[(x − E(x))(y − E(y))]
Correlation coefficient: ρ = cov(x,y) / (σ_x σ_y)

Uncorrelated iff cov = 0; orthogonal iff E(xy) = 0.

Statistical independence implies uncorrelated.
The reverse is not true: uncorrelated does not imply statistical independence.
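The last point can be illustrated numerically: take x uniform on [−1, 1] and y = x². Then y is completely determined by x (so they are dependent), yet cov(x, y) = E[x³] − E[x]E[x²] = 0. A minimal sketch:

```python
import random

random.seed(0)

# x uniform on [-1, 1]; y = x^2 is a deterministic function of x,
# hence dependent on x, yet their covariance is zero by symmetry.
xs = [random.uniform(-1, 1) for _ in range(200_000)]
ys = [x * x for x in xs]

mx = sum(xs) / len(xs)
my = sum(ys) / len(ys)
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)

print(round(cov, 3))   # ≈ 0: uncorrelated, but clearly not independent
```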

Page 15:

Joint Gaussian Random Variables

Random variables X1, X2, X3, …, Xn are jointly Gaussian if their joint PDF is

p(x_1, x_2, …, x_n) = (1 / ((2π)^{n/2} |K|^{1/2})) exp[ −(1/(2|K|)) Σ_i Σ_j Δ_ij (x_i − m_i)(x_j − m_j) ]

Δ_ij = cofactor of the element k_ij
|K| = determinant of matrix K

K = [ k_11  k_12  …  k_1n
      k_21  k_22  …  k_2n
      …
      k_n1  k_n2  …  k_nn ]

k_ij = E[(x_i − m_i)(x_j − m_j)]

Page 16:

Gaussian Bivariate Distribution

Bivariate Gaussian PDF

f(x_1, x_2) = (1 / (2π σ1 σ2 √(1−ρ²))) exp{ −(1/(2(1−ρ²))) [ (x1−m1)²/σ1² − 2ρ(x1−m1)(x2−m2)/(σ1σ2) + (x2−m2)²/σ2² ] }

ρ = E[(x1−m1)(x2−m2)] / (σ1 σ2) : correlation coefficient
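A quick numerical sanity check of the bivariate formula; the parameter values (m1 = m2 = 0, σ1 = 1, σ2 = 2, ρ = 0.5) are arbitrary illustrations:

```python
import math

def bivariate_gauss(x1, x2, m1=0.0, m2=0.0, s1=1.0, s2=2.0, rho=0.5):
    """Bivariate Gaussian PDF as written on the slide."""
    z = ((x1 - m1) ** 2 / s1 ** 2
         - 2 * rho * (x1 - m1) * (x2 - m2) / (s1 * s2)
         + (x2 - m2) ** 2 / s2 ** 2)
    norm = 2 * math.pi * s1 * s2 * math.sqrt(1 - rho ** 2)
    return math.exp(-z / (2 * (1 - rho ** 2))) / norm

# Sanity check: the PDF should integrate to ~1 over a wide grid.
h = 0.05
total = sum(
    bivariate_gauss(-8 + i * h, -8 + j * h) * h * h
    for i in range(320) for j in range(320)
)
print(round(total, 3))   # ≈ 1.0
```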

Page 17:

Importance of the Gaussian PDF

1. Central Limit Theorem
2. Simplifies the math

Property I. Completely specified by its first and second moments.

Property II. Uncorrelated implies statistically independent.
(For other PDFs this implication does not hold.)

When the x_i are uncorrelated, K is diagonal and

p(x_1, x_2, …, x_n) = Π_{i=1}^{n} (1/(√(2π) σ_i)) exp{ −(x_i − m_i)² / (2σ_i²) } = p(x_1) p(x_2) p(x_3) … p(x_n)

Page 18:

Property III

If the joint PDF is Gaussian:
the marginal PDF p(x_i) is Gaussian, and the conditional PDF is Gaussian.

Property IV

Linear combinations of jointly Gaussian RVs are also Gaussian.

Property V

If the input signal to a linear system is Gaussian, the output signal will also be Gaussian.

Page 19:

Gaussian Distribution

f(x) = (1/(√(2π) σ)) exp{ −(x−m)² / (2σ²) }

P(X > m + a) = ∫_{m+a}^{∞} (1/(√(2π) σ)) exp{ −(x−m)² / (2σ²) } dx

Let ν = x − m, dν = dx; if x = m + a then ν = a:

P = ∫_{a}^{∞} (1/(√(2π) σ)) exp{ −ν² / (2σ²) } dν

Q(z) = ∫_{z}^{∞} (1/√(2π)) exp{ −λ²/2 } dλ

Page 20:

Let λ = ν/σ, dλ = dν/σ; if ν = a then λ = a/σ:

P = ∫_{a/σ}^{∞} (1/√(2π)) exp{ −λ²/2 } dλ = Q(a/σ)

[Figure: Gaussian pdf of x centered at m with the tail area beyond m + a shaded; equivalently, the standard Gaussian with the tail beyond a/σ shaded]

Page 21:

EXAMPLE 5.

Transmit a signal x of amplitude 5 V over an AWGN channel with σ = 3. Find P(x > 7), where x = 5 + n.

P(x > 7) = P(n > 2) = ∫_{2}^{∞} (1/(√(2π)·3)) exp{ −n² / (2·3²) } dn = ∫_{2/3}^{∞} (1/√(2π)) exp{ −λ²/2 } dλ = Q(2/3)
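Q(·) has no closed form, but it is easy to evaluate through the complementary error function, Q(z) = ½ erfc(z/√2); a sketch for Example 5:

```python
import math

def Q(z):
    """Gaussian tail probability Q(z) = P(N(0,1) > z), via erfc."""
    return 0.5 * math.erfc(z / math.sqrt(2))

# Example 5: x = 5 + n with sigma = 3, so P(x > 7) = P(n > 2) = Q(2/3).
p = Q(2 / 3)
print(round(p, 4))   # ≈ 0.2525
```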

Page 22:

Properties:

1. P(x > 3) = P(n > −2) = Q(−2/3)

2. Symmetry: Q(−z) = 1 − Q(z), so Q(−2/3) = 1 − Q(2/3)

Page 23:

Functional Transformation of Random Variables

If the PDF f_x(x) of random variable x is known, and y = h(x) is a transformation, find the PDF f_y(y).

Warning: this is a different problem from passing a signal x(t) through a linear system h(t) or H(f), where the output y(t) has power spectral density (PSD)

S_yy(f) = |H(f)|² S_xx(f)

For the transformation of a random variable:

f_y(y) = Σ_{i=1}^{M} f_x(x_i) / |dy/dx|_{x=x_i} ,  x_i = h^{-1}(y)

Page 24:

Example 6. y = x²

P_x(x) = (1/(√(2π) σ)) exp(−x² / (2σ²))

For y > 0 there are two roots, x = ±√y:

P_x(x) dx + P_x(−x) dx = P_y(y) dy
2 P_x(x) dx = P_y(y) dy

P_y(y) = 2 P_x(x) |dx/dy|

dy/dx = 2x  ⇒  |dx/dy| = 1/(2√y)

Page 25:

P_y(y) = (1/(2√y)) (1/(√(2π) σ)) exp(−y/(2σ²)) + (1/(2√y)) (1/(√(2π) σ)) exp(−y/(2σ²))
       = (1/(√(2π y) σ)) exp(−y/(2σ²)) ;  y ≥ 0

[Figure: P_x(x) and the resulting P_y(y)]
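The pdf of y = x² derived above can be checked against simulation: P(y ≤ 1) estimated from samples of x² should match the integral of P_y(y) over [0, 1]. A sketch assuming σ = 1 for illustration:

```python
import math
import random

random.seed(1)
sigma = 1.0

def f_y(y):
    """PDF of y = x^2 for x ~ N(0, sigma^2), as derived on the slide."""
    return math.exp(-y / (2 * sigma ** 2)) / (math.sqrt(2 * math.pi * y) * sigma)

# Monte Carlo estimate of P(y <= 1).
n = 200_000
samples = sum(random.gauss(0, sigma) ** 2 <= 1.0 for _ in range(n)) / n

# Midpoint-rule integral of f_y over [0, 1] (the 1/sqrt(y) singularity
# at 0 is integrable, so the midpoint sum behaves).
h = 1e-4
integral = sum(f_y((k + 0.5) * h) * h for k in range(10_000))

print(round(samples, 3), round(integral, 3))   # both ≈ 0.68
```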

Page 26:

Example 7.

y = h(x) = A cos(x) ; amplitude y ∈ [−A, A]

x is a RV with uniform distribution over [−π, π]:
f_x(x) = 1/(2π) ;  −π ≤ x ≤ π

dy/dx = −A sin(x)

f_y(y) = Σ_i f_x(x_i) / |A sin(x_i)| ,  x_i = h^{-1}(y) (two roots in [−π, π])

with |A sin(x)| = A √(1 − cos²x) = √(A² − y²):

Page 27:

f_y(y) = 1/(2π √(A² − y²)) + 1/(2π √(A² − y²)) = 1/(π √(A² − y²)) ;  |y| ≤ A
       = 0 ;  otherwise

Page 28:

Maximum Likelihood Detection

Ex 8. a ∈ {0,1}, P(a=0) = 1/2, P(a=1) = 1/2
n ∈ {0,1}, P(n=0) = 1/2, P(n=1) = 1/2

y = a + n, so y ∈ {0, 1, 2}:

Given a = 0:  P(y=0|a=0) = 1/2 = P(n=0)
              P(y=1|a=0) = 1/2 = P(n=1)
              P(y=2|a=0) = 0

Given a = 1:  P(y=0|a=1) = 0
              P(y=1|a=1) = 1/2 = P(n=0)
              P(y=2|a=1) = 1/2 = P(n=1)

Page 29:

Ex 9. a {0,1} P(a=0) =3/4 P(a=1)=1/4

n {0,1} P(n=0) =1/2 P(n=1) =1/2

Only y is observable

P(y=0,a=0) = P(y=0|a=0)P(a=0) = 3/8 = P(a=0|y=0)P(y=0)

P(y=0,a=1) = P(y=0|a=1)P(a=1) = 0 = P(a=1|y=0)P(y=0)

if observe y =0 what can you guess about a ?

P(y=1,a=0) = P(y=1|a=0)P(a=0) = 3/8 = P(a=0|y=1)P(y=1)

P(y=1,a=1) = P(y=1|a=1)P(a=1) = 1/8 = P(a=1|y=1)P(y=1)

if observe y =1 what can you guess about a ?

P(y=2,a=0) = P(y=2|a=0)P(a=0) = 0 = P(a=0|y=2)P(y=2)

P(y=2,a=1) = P(y=2|a=1)P(a=1) = 1/8= P(a=1|y=2)P(y=2)

Page 30:

In Example 8, if y = 1 is received, what do you think the transmitted a was?
In Example 9, if y = 1 is received, what do you think the transmitted a was?

P(y=1, a=0) = 3/8 = P(a=0|y=1) P(y=1)

P(y=1, a=1) = 1/8 = P(a=1|y=1) P(y=1)

Since P(y=1) is the same in both lines, comparing P(y,a) is the same as comparing P(a|y).

When y is received, compute P(y=1|a=0), P(y=1|a=1), …; whichever a has the highest probability, we decide the transmitter sent that a. → Maximum Likelihood (ML)

When y is received, compute P(a=0|y=1), P(a=1|y=1), …; whichever a has the highest probability, we decide the transmitter sent that a. → Maximum a Posteriori (MAP)
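The ML and MAP rules can be sketched with the numbers of Example 9. Note that at y = 1 the two likelihoods tie at 1/2, so ML cannot separate them, while MAP breaks the tie toward the more probable a = 0 (joint 3/8 vs 1/8):

```python
# Example 9: priors P(a) and likelihoods P(y|a), with y = a + n
# and P(n=0) = P(n=1) = 1/2.
priors = {0: 3 / 4, 1: 1 / 4}
likelihood = {
    (0, 0): 1 / 2, (1, 0): 1 / 2, (2, 0): 0.0,   # P(y|a=0)
    (0, 1): 0.0,   (1, 1): 1 / 2, (2, 1): 1 / 2, # P(y|a=1)
}

def ml_decide(y):
    """Maximum likelihood: pick a maximizing P(y|a), ignoring the prior."""
    return max(priors, key=lambda a: likelihood[(y, a)])

def map_decide(y):
    """Maximum a posteriori: pick a maximizing P(y|a) P(a) ∝ P(a|y)."""
    return max(priors, key=lambda a: likelihood[(y, a)] * priors[a])

print(ml_decide(1), map_decide(1))   # ML ties at y=1; MAP chooses a = 0
```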

Page 31:

Ex 10. a ∈ {0,1}, P(a=0) = 1/2, P(a=1) = 1/2

n ~ N(0, σ²), y = a + n

f(y|a=0) = ?  f(y|a=1) = ?

[Figure: the two likelihoods f(y|a=0) and f(y|a=1) plotted over y, with the decision threshold at y = 0.5]

Decision regions: y < 0.5 → a = 0 ;  y > 0.5 → a = 1

P(error) = P(error|a=0) P(a=0) + P(error|a=1) P(a=1)

P(error|a=0) = P(y > 0.5 | a=0)

P(error|a=1) = P(y < 0.5 | a=1)
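With the threshold at 0.5, each error term reduces to a Gaussian tail of the noise, so P(error) = Q(0.5/σ); a sketch assuming σ = 1 for illustration:

```python
import math

def Q(z):
    """Gaussian tail probability Q(z) = P(N(0,1) > z)."""
    return 0.5 * math.erfc(z / math.sqrt(2))

sigma = 1.0   # assumed noise std; the slide leaves sigma symbolic

# Threshold at 0.5: an error occurs when n pushes y across the boundary.
P_err_a0 = Q(0.5 / sigma)   # P(y > 0.5 | a=0) = P(n > 0.5)
P_err_a1 = Q(0.5 / sigma)   # P(y < 0.5 | a=1) = P(n < -0.5)
P_error = 0.5 * P_err_a0 + 0.5 * P_err_a1

print(round(P_error, 4))   # = Q(0.5/sigma) ≈ 0.3085 for sigma = 1
```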

Page 32:

Homework:

1. Prove that MAP = ML when the priors P(ai) are equally likely.

2. Prove that MAP is the optimal detector, i.e. it gives the minimum P(error).