Probability, Chapter 6: Jointly Distributed Random Variables
Independent Random Variables and Sums (6.2-6.3)
Prof. 이상준 (Duksung Women's University, Department of Mathematics), Fall 2015
Textbook: Sheldon Ross, A first course in probability (9th ed, Pearson)
<6.2> Independent Random Variables
❖ Definition (Recall): The random variables X and Y are said to be independent if
for any two sets of real numbers A and B, P{X∈A, Y∈B} = P{X∈A} P{Y∈B} ———————— (*)
❖ Property: (*) is equivalent to P{X≤a, Y≤b} = P{X≤a}P{Y≤b} for all a, b.
❖ Fact: F(a,b) = FX(a) FY(b) for all a, b, where F is the joint CDF and FX, FY are the marginal CDFs.
❖ Corollary:
1. If X and Y are discrete, (*) is equivalent to p(x,y) = pX(x) pY(y) for all x,y.
2. If X and Y are continuous, (*) is equivalent to f(x,y) = fX(x) fY(y) for all x,y.
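Corollary 1 gives a finite check for discrete variables. The following sketch (not from the text; the two-coin example is illustrative) verifies the factorization p(x,y) = pX(x) pY(y) for an independent pair and shows it fail for a dependent one:

```python
from fractions import Fraction

half = Fraction(1, 2)

# Joint pmf of two independent fair coin flips: p(x, y) = 1/4 everywhere.
joint = {(x, y): half * half for x in (0, 1) for y in (0, 1)}

def marginals(pmf):
    # Recover p_X and p_Y by summing the joint pmf over the other variable.
    p_x = {x: sum(p for (a, _), p in pmf.items() if a == x) for x in (0, 1)}
    p_y = {y: sum(p for (_, b), p in pmf.items() if b == y) for y in (0, 1)}
    return p_x, p_y

def factors(pmf):
    # Criterion (*) for discrete variables: p(x, y) = p_X(x) p_Y(y) for all x, y.
    p_x, p_y = marginals(pmf)
    return all(pmf[(x, y)] == p_x[x] * p_y[y] for (x, y) in pmf)

independent_ok = factors(joint)

# Dependent counterexample: Y is forced equal to X.
dep = {(0, 0): half, (1, 1): half, (0, 1): Fraction(0), (1, 0): Fraction(0)}
dependent_ok = factors(dep)

print(independent_ok, dependent_ok)  # True False
```

Exact rational arithmetic (`Fraction`) avoids false negatives from floating-point roundoff in the equality test.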
❖ Example 2a: Suppose that n+m independent trials having a common probability of success p are performed.
❖ X is the number of successes in the first n trials.
❖ Y is the number of successes in the final m trials.
❖ Z is the number of successes in the n+m trials.
❖ Then, the following holds:
❖ X and Y are independent.
❖ In contrast, X and Z will be dependent.
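A quick simulation makes the contrast concrete (a sketch, not from the text; n = m = 10, p = 0.5, the seed, and the sample size are illustrative choices). The sample covariance of X and Y should be near 0, while Cov(X, Z) = Cov(X, X + Y) = Var(X) = np(1 − p):

```python
import random

random.seed(0)
n, m, p, reps = 10, 10, 0.5, 20_000

xs, ys = [], []
for _ in range(reps):
    # n + m independent trials with common success probability p (Example 2a).
    trials = [random.random() < p for _ in range(n + m)]
    xs.append(sum(trials[:n]))   # X: successes in the first n trials
    ys.append(sum(trials[n:]))   # Y: successes in the final m trials
zs = [x + y for x, y in zip(xs, ys)]  # Z = X + Y: successes in all n + m trials

def cov(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return sum((u - ma) * (v - mb) for u, v in zip(a, b)) / len(a)

cov_xy = cov(xs, ys)  # near 0, consistent with independence of X and Y
cov_xz = cov(xs, zs)  # near Var(X) = n p (1 - p) = 2.5, so X and Z are dependent
print(round(cov_xy, 2), round(cov_xz, 2))
```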
(Reference: Ross, A first course in probability, 9th ed)
❖ Example 2b: Suppose that the number of people who enter a post office on a given day is a Poisson random variable with parameter λ.
❖ Each person who enters the post office is a male with probability p and a female with probability 1-p.
❖ Show that the numbers of males and of females entering the post office are independent Poisson random variables with respective parameters λp and λ(1−p).
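This "Poisson thinning" claim can be checked by simulation (a sketch, not the proof in the text; λ = 4, p = 0.3, the seed, and the sample size are arbitrary choices). The sample means of the male and female counts should land near λp and λ(1 − p), and their sample covariance near 0:

```python
import math
import random

rng = random.Random(0)
lam, p, reps = 4.0, 0.3, 20_000  # illustrative parameters, not from the text

def poisson(lam, rng):
    # Knuth's method: count uniforms until their running product drops below e^(-lam).
    threshold, k, prod = math.exp(-lam), 0, rng.random()
    while prod > threshold:
        k += 1
        prod *= rng.random()
    return k

males, females = [], []
for _ in range(reps):
    n_people = poisson(lam, rng)
    m = sum(rng.random() < p for _ in range(n_people))  # each person is male w.p. p
    males.append(m)
    females.append(n_people - m)

mean_m = sum(males) / reps    # near lam * p = 1.2
mean_f = sum(females) / reps  # near lam * (1 - p) = 2.8
cov_mf = sum((a - mean_m) * (b - mean_f)
             for a, b in zip(males, females)) / reps  # near 0 (independence)
print(round(mean_m, 1), round(mean_f, 1), round(cov_mf, 1))
```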
❖ Proposition 2.1: The continuous (discrete) random variables X and Y are independent if and only if their joint probability density (mass) function can be expressed as fX,Y(x,y) = h(x) g(y) for −∞ < x < ∞, −∞ < y < ∞.
❖ Example 2f (1): The joint density function of (X,Y) is f(x,y) = 6e^(−2x) e^(−3y) for 0 < x < ∞, 0 < y < ∞, and is equal to 0 outside this region.
❖ Are X and Y independent?
❖ Solution: f(x,y) = 6e^(−2x) e^(−3y) I(x) J(y) = 6e^(−2x) I(x) · e^(−3y) J(y), where
❖ I(x) = 1 if x > 0 and 0 otherwise,
❖ J(y) = 1 if y > 0 and 0 otherwise.
❖ The joint density factors into a function of x times a function of y, so by Proposition 2.1, X and Y are independent.
❖ Example 2f (2): Let the joint density function of (X,Y) be f(x,y) = 24xy for 0 < x < 1, 0 < y < 1, 0 < x + y < 1, and equal to 0 otherwise.
❖ Are X and Y independent?
❖ Solution: f(x,y) = 24xy I(x,y), where I(x,y) = 1 if 0 < x < 1, 0 < y < 1, 0 < x + y < 1, and 0 otherwise.
❖ Because of the constraint x + y < 1, f(x,y) does not factor into a part depending only on x and a part depending only on y, so X and Y are dependent.
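A numeric spot check (not in the text; the evaluation point (0.5, 0.4) is arbitrary): the marginals work out by hand to fX(x) = 12x(1 − x)² and fY(y) = 12y(1 − y)², and their product disagrees with the joint density, confirming dependence:

```python
def f(x, y):
    # Joint density of Example 2f (2).
    return 24 * x * y if 0 < x < 1 and 0 < y < 1 and x + y < 1 else 0.0

def f_X(x):
    # Marginal of X: integral of 24 x y over 0 < y < 1 - x, computed by hand.
    return 12 * x * (1 - x) ** 2

def f_Y(y):
    # Marginal of Y, by symmetry.
    return 12 * y * (1 - y) ** 2

joint_val = f(0.5, 0.4)          # 24 * 0.5 * 0.4 = 4.8
prod_val = f_X(0.5) * f_Y(0.4)   # 1.5 * 1.728 = 2.592
print(joint_val > prod_val)  # True: the joint density does not factor
```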
<6.3> Sums of Independent Random Variables
❖ X and Y are independent.
❖ fX and fY are the probability density functions of X and Y.
❖ Goal: Calculate the distribution and density function of X+Y.
❖ Solution: By independence, FX+Y(a) = P{X+Y ≤ a} = ∫ FX(a−y) fY(y) dy, and differentiating gives the convolution formula fX+Y(a) = ∫ fX(a−y) fY(y) dy (both integrals over −∞ < y < ∞).
<6.3.1> Identically Distributed Uniform Random Variables
❖ Example 3a: (Sum of independent uniform random variables) X and Y are independent with X~U(0,1) and Y~U(0,1).
❖ Calculate the probability density of X+Y.
❖ Solution:
❖ For 0 ≤ a ≤ 1, fX+Y(a) = ∫ from 0 to a of 1 dy = a.
❖ For 1 < a < 2, fX+Y(a) = ∫ from a−1 to 1 of 1 dy = 2 − a.
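The two cases can be cross-checked by evaluating the convolution integral fX+Y(a) = ∫ fX(a − y) fY(y) dy numerically (a sketch, not in the text; the step count and test points are arbitrary):

```python
def f_unif(t):
    # Density of U(0, 1).
    return 1.0 if 0 < t < 1 else 0.0

def conv(a, steps=100_000):
    # Midpoint-rule approximation of the convolution integral over y in (0, 1).
    h = 1.0 / steps
    return sum(f_unif(a - (k + 0.5) * h) * h for k in range(steps))

for a, exact in [(0.25, 0.25), (0.75, 0.75), (1.5, 0.5)]:
    print(round(conv(a), 3), exact)  # triangular density: a on [0, 1], 2 - a on (1, 2)
```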
<6.3.3> Normal Random Variables
❖ Proposition 3.2: (Sums of independent normal random variables)
X and Y are independent with X ~ N(μ1, σ1²) and Y ~ N(μ2, σ2²).
❖ Then X + Y ~ N(μ1 + μ2, σ1² + σ2²).
❖ Proof: Skip!
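Although the proof is skipped, the proposition is easy to check by simulation (a sketch, not from the text; the parameters μ1 = 1, σ1 = 2, μ2 = −3, σ2 = 1, the seed, and the sample size are illustrative):

```python
import random

rng = random.Random(42)
mu1, s1 = 1.0, 2.0    # X ~ N(1, 4)   (illustrative parameters)
mu2, s2 = -3.0, 1.0   # Y ~ N(-3, 1)
reps = 50_000

sums = [rng.gauss(mu1, s1) + rng.gauss(mu2, s2) for _ in range(reps)]
mean = sum(sums) / reps                          # near mu1 + mu2 = -2
var = sum((s - mean) ** 2 for s in sums) / reps  # near s1**2 + s2**2 = 5
print(round(mean, 1), round(var, 1))
```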
<6.3.4> Poisson and Binomial Random Variables
❖ Example 3e: (Sums of independent Poisson random variables)
X and Y are independent with X ~ Poi(λ1) and Y ~ Poi(λ2).
❖ Compute the distribution of X+Y.
❖ Solution: For n ≥ 0, P{X+Y = n} = Σ (k = 0 to n) P{X = k} P{Y = n−k} = Σ e^(−λ1) λ1^k/k! · e^(−λ2) λ2^(n−k)/(n−k)! = e^(−(λ1+λ2)) (λ1+λ2)^n / n!, where the last step uses the binomial theorem.
❖ Thus, X + Y ~ Poi(λ1 + λ2).
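The identity behind the solution can be verified numerically (a sketch, not in the text; λ1 = 1.5 and λ2 = 2.5 are arbitrary): the convolution Σk P{X = k} P{Y = n − k} matches the Poi(λ1 + λ2) mass function term by term:

```python
import math

def pois(k, lam):
    # Poisson mass function P{N = k} for N ~ Poi(lam).
    return math.exp(-lam) * lam ** k / math.factorial(k)

lam1, lam2 = 1.5, 2.5
max_err = max(
    abs(sum(pois(k, lam1) * pois(n - k, lam2) for k in range(n + 1))
        - pois(n, lam1 + lam2))
    for n in range(10)
)
print(max_err < 1e-12)  # True: the convolution matches Poi(lam1 + lam2)
```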
❖ Example 3f: (Sums of independent binomial random variables) Let X and Y be independent with X ~ B(n,p) and Y ~ B(m,p).
❖ Calculate the distribution of X+Y.
❖ Solution: X counts the successes in n independent trials with success probability p, and Y counts the successes in m further independent trials with the same p, so X + Y counts the successes in n + m independent trials with success probability p.
❖ By definition, X + Y ~ B(n + m, p).
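The same kind of numeric check works here (a sketch, not in the text; n = 5, m = 7, p = 0.3 are arbitrary): convolving the two binomial mass functions reproduces B(n + m, p), which is the Vandermonde identity in disguise:

```python
import math

def binom_pmf(k, n, p):
    # Binomial mass function P{X = k} for X ~ B(n, p).
    return math.comb(n, k) * p ** k * (1 - p) ** (n - k)

n, m, p = 5, 7, 0.3
max_err = max(
    abs(sum(binom_pmf(k, n, p) * binom_pmf(j - k, m, p)
            for k in range(max(0, j - m), min(n, j) + 1))
        - binom_pmf(j, n + m, p))
    for j in range(n + m + 1)
)
print(max_err < 1e-12)  # True: the convolution matches B(n + m, p)
```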