CEN 343 Chapter 7: Random Processes
Chapter 10
Random Processes
1. Introduction
In Chapter 2, we defined a random variable X as a mapping of the elements of the sample space S into points on the real axis.
For random processes, the sample space maps into a family of time functions: every outcome s in S is associated with a function of time. This leads to the concept of a random process.
Definition of a random process: A random process X(t) is a mapping of the elements of the sample space into functions of time. Each element of the sample space is associated with a time function, as shown below.
Associating a time function with each element of the sample space results in a family of time functions called the ensemble.
[Figure: outcomes s1, s2, …, sk of the sample space S mapped to sample functions x1(t), x2(t), …, xk(t), with time instants t1 and t2 marked.]
Example 1
Consider a random process X(t) = A cos(ωt + θ), where θ is a random variable uniformly distributed between 0 and 2π.
- Show some sample functions of this random process.
- What can we say about the function above if, for example, we fix the phase θ = π/4?
Solution:
- For each fixed outcome of θ, the sample function of this r.p. is a cosine with angular frequency ω and a particular phase shift; the ensemble is a family of such cosines with different phases.
- Fixing θ and varying t results in a deterministic time function.
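As an illustration (not part of the original notes), a few sample functions of this process can be generated numerically; the amplitude A = 1 and angular frequency ω = 2π are assumed values:

```python
import numpy as np

rng = np.random.default_rng(0)
A, w = 1.0, 2 * np.pi              # assumed amplitude and angular frequency
t = np.linspace(0.0, 2.0, 201)     # time grid

# Each draw of theta selects one outcome, hence one sample function x_k(t)
thetas = rng.uniform(0.0, 2 * np.pi, size=5)
samples = np.array([A * np.cos(w * t + th) for th in thetas])

# Fixing t = t0 instead turns X(t0) into a random variable:
# one realized value per outcome theta
t0 = 0.5
values_at_t0 = A * np.cos(w * t0 + thetas)
```

Each row of `samples` is one member of the ensemble; fixing θ = π/4 would collapse the family to a single deterministic cosine.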
2. Classification of random processes
2.1 Continuous random process
In this case, both X(t) and t take continuous values.
2.2 Discrete random process
X(t) assumes a discrete set of values while time t is continuous.
[Figures: sample functions of a continuous random process (top) and a discrete random process (bottom).]
2.3 Continuous random sequence
X ( t ) assumes a continuous set of values while time t is discrete
2.4 Discrete random sequence
Both X(t) and t assume discrete sets of values.
[Figures: sample functions of a continuous random sequence (top) and a discrete random sequence (bottom).]
Fixing the time t, the random process X(t) becomes a random variable. In this case, the techniques we use with random variables apply.
We can characterize a random process by the first-order distribution:
F_X(x; t) = P[X(t) ≤ x]
The first-order density function, for all possible values of t, is given by:
f_X(x; t) = ∂F_X(x; t)/∂x
The second-order distribution function is the joint distribution of the two random variables X(t1) and X(t2) for each t1 and t2. It is given by:
F_X(x1, x2; t1, t2) = P[X(t1) ≤ x1, X(t2) ≤ x2]
The second-order density function is given by:
f_X(x1, x2; t1, t2) = ∂²F_X(x1, x2; t1, t2) / (∂x1 ∂x2)
Deterministic and nondeterministic processes:
A r.p. (random process) is called deterministic if future values of any sample function can be predicted from past values. For example:
X(t) = A cos(ω0 t + θ)
Here, A, θ, ω0 (or all of them) may be r.v.s.
A r.p. is called nondeterministic if future values of any sample function cannot be predicted exactly from past values.
3. Independence and stationarity
In many problems of interest, the first- and second-order statistics are enough to characterize a random process.
The mean value of a random process X(t) is given by:
E[X(t)] = ∫_{−∞}^{+∞} x f_X(x; t) dx
The autocorrelation function is defined by:
R_XX(t1, t2) = E[X(t1) X(t2)] = ∫_{−∞}^{+∞} ∫_{−∞}^{+∞} x1 x2 f_X(x1, x2; t1, t2) dx1 dx2
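These ensemble averages can be estimated by averaging over many realizations. As a sketch, consider the hypothetical process X(t) = A·t with A ~ N(0, 1) (our own choice, not from the notes); its mean is 0 and R_XX(t1, t2) = E[A²] t1 t2 = t1·t2, which depends on absolute time:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200_000
A = rng.standard_normal(N)       # one random amplitude per realization

t1, t2 = 1.0, 3.0
x1, x2 = A * t1, A * t2          # X(t1) and X(t2) across the ensemble

mean_t1 = x1.mean()              # estimates E[X(t1)] = 0
R_12 = np.mean(x1 * x2)          # estimates R_XX(t1, t2) = t1 * t2 = 3
```

Because R_XX depends on t1 and t2 separately (not only on t2 − t1), this process is not stationary; compare with the definitions in the next subsections.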
3.1 Statistical independence
Two random processes X(t) and Y(t) are statistically independent if the random variable group X(t1), …, X(tN) is independent of the group Y(t'1), …, Y(t'M) for any choice of t1, …, tN, t'1, …, t'M:
f_{X,Y}(x1, …, xN, y1, …, yM; t1, …, tN, t'1, …, t'M) = f_X(x1, …, xN; t1, …, tN) · f_Y(y1, …, yM; t'1, …, t'M)
3.2 First-order stationary processes
A random process is called first-order stationary in the strict sense if the first-order density function does not change with a shift in time:
f_X(x1; t1) = f_X(x1; t1 + Δ), where Δ is any real number.
This means that f_X(x1; t1) is independent of t1, and the process mean value E[X(t)] is constant:
E[X(t)] = ∫_{−∞}^{+∞} x f_X(x; t) dx = constant
3.3 Second-order stationary processes
A random process is second-order stationary in the strict sense if the first- and second-order density functions satisfy:
f_X(x1; t1) = f_X(x1; t1 + Δ)    (*)
f_X(x1, x2; t1, t2) = f_X(x1, x2; t1 + Δ, t2 + Δ)    (**)
This means that f_X is a function of the time difference t2 − t1 and not of absolute time.
A second-order stationary process is also first-order stationary.
For a second-order stationary random process, the autocorrelation function is a function of the time difference and not of absolute time. With τ = t2 − t1:
R_XX(t, t + τ) = E[X(t) X(t + τ)] = R_XX(τ)
3.4 Wide-sense stationary random processes
The conditions for a strict-sense stationary process (equations (*) and (**)) are usually difficult to verify, so we often resort to a weaker definition of stationarity known as wide-sense stationarity.
When the autocorrelation function R_XX(t1, t2) of the random process X(t) varies only with the time difference t2 − t1 and the mean E[X(t)] is constant, then X(t) is said to be wide-sense stationary. That is:
(1) E[X(t)] = constant
(2) R_XX(t, t + τ) = R_XX(τ)
A strict-sense stationary process is wide-sense stationary, but the converse is not true.
Example 2
Consider a random process X(t) = A cos(ω0 t + θ), where θ is a random variable uniformly distributed between 0 and 2π, i.e., f_θ(θ) = 1/(2π) for 0 ≤ θ ≤ 2π. Is the process stationary in the wide sense?
Solution:
E[X(t)] = ∫_0^{2π} A cos(ω0 t + θ) · (1/(2π)) dθ = 0
R_XX(t, t + τ) = E[X(t) X(t + τ)]
= E[A cos(ω0 t + θ) · A cos(ω0 t + ω0 τ + θ)]
= (A²/2) E[cos(ω0 τ) + cos(2ω0 t + ω0 τ + 2θ)]
= (A²/2) cos(ω0 τ) + (A²/2) E[cos(2ω0 t + ω0 τ + 2θ)]
= (A²/2) cos(ω0 τ) = R_XX(τ)
The mean is constant and the autocorrelation depends only on τ, so X(t) is a wide-sense stationary r.p.
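This result can be checked by Monte Carlo simulation (a sketch; A = 1 and ω0 = 2π are assumed values): the estimated autocorrelation should match (A²/2) cos(ω0 τ) regardless of the choice of t.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 500_000
A, w0 = 1.0, 2 * np.pi
theta = rng.uniform(0.0, 2 * np.pi, N)   # uniform phase over [0, 2*pi)

def R_est(t, tau):
    """Monte Carlo estimate of E[X(t) X(t + tau)]."""
    return np.mean(A * np.cos(w0 * t + theta)
                   * A * np.cos(w0 * (t + tau) + theta))

tau = 0.1
r_a = R_est(t=0.3, tau=tau)              # same tau, two different t's
r_b = R_est(t=1.7, tau=tau)
target = (A**2 / 2) * np.cos(w0 * tau)   # theoretical R_XX(tau)
```

That `r_a` and `r_b` agree with each other and with `target` illustrates condition (2) of wide-sense stationarity: the autocorrelation depends on τ only.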
3.5 Jointly wide-sense stationary processes
Two random processes X(t) and Y(t) are jointly wide-sense stationary if each process is wide-sense stationary and:
R_XY(t, t + τ) = E[X(t) Y(t + τ)] = R_XY(τ), a function of the time difference only.
Here R_XY(t1, t2) = E[X(t1) Y(t2)] is the cross-correlation function of X(t) and Y(t).
We also define the covariance function by:
C_XX(t1, t2) = E[{X(t1) − m_X(t1)}{X(t2) − m_X(t2)}]
The cross-covariance function is defined by:
C_XY(t1, t2) = E[{X(t1) − m_X(t1)}{Y(t2) − m_Y(t2)}]
3.6 N-th-order and strict-sense stationarity
A random process is strictly stationary to order N if the N-th-order density function is invariant to a shift of the time origin:
f_X(x1, …, xN; t1, …, tN) = f_X(x1, …, xN; t1 + Δ, …, tN + Δ)
for all t1, …, tN and Δ.
3.7 Time averages and ergodicity
The time average of a quantity [·] is defined as:
A[·] = lim_{T→∞} (1/2T) ∫_{−T}^{T} [·] dt
where A denotes time averages, just as E denotes statistical (ensemble) averages.
Here, we are interested in:
Mean value of a sample function:
x̄ = A[x(t)] = lim_{T→∞} (1/2T) ∫_{−T}^{T} x(t) dt
Time autocorrelation function:
R̄_XX(τ) = A[x(t) x(t + τ)] = lim_{T→∞} (1/2T) ∫_{−T}^{T} x(t) x(t + τ) dt
Ergodic process: A random process X(t) is said to be ergodic if the time averages x̄ and R̄_XX(τ) equal the statistical averages X̄ and R_XX(τ). In other words, time averages equal the corresponding statistical averages.
Ergodicity in the mean: A random process that satisfies x̄ = X̄, i.e., the time average equals the ensemble average, is called a mean-ergodic random process.
Ergodicity in the autocorrelation: A random process that satisfies R̄_XX(τ) = R_XX(τ) is called autocorrelation-ergodic.
[Figure: nested classes of random processes — ergodic ⊂ strict-sense stationary ⊂ wide-sense stationary ⊂ the collection of all random processes.]

Example 3:
Let two random processes be given by:
I(t) = X cos ωt + Y sin ωt
Q(t) = Y cos ωt − X sin ωt
where X and Y are two zero-mean, uncorrelated random variables, each with variance σ². Find the cross-correlation function between I(t) and Q(t).
Solution:
We have E[X] = E[Y] = 0.
X and Y are uncorrelated ⇒ E[XY] = 0.
Variance: E[X²] = E[Y²] = σ².
R_IQ(t1, t2) = E[I(t1) Q(t2)]
= E[(X cos ωt1 + Y sin ωt1)(Y cos ωt2 − X sin ωt2)]
= E[XY](cos ωt1 cos ωt2 − sin ωt1 sin ωt2) + E[Y²] sin ωt1 cos ωt2 − E[X²] cos ωt1 sin ωt2
= σ²(sin ωt1 cos ωt2 − cos ωt1 sin ωt2) = −σ² sin ω(t2 − t1)
Let τ = t2 − t1 ⇒ t2 = t1 + τ:
R_IQ(t, t + τ) = −σ² sin ωτ
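Assuming the quadrature pairing Q(t) = Y cos ωt − X sin ωt, the closed-form result R_IQ(τ) = −σ² sin ωτ can be checked by simulation. This is a sketch: the Gaussian choice for X and Y and the numeric values of σ, ω, t, and τ are our own assumptions (the result only requires zero mean, uncorrelatedness, and equal variances).

```python
import numpy as np

rng = np.random.default_rng(3)
N = 500_000
sigma, w = 2.0, 1.5              # assumed variance parameter and frequency
X = rng.normal(0.0, sigma, N)    # zero mean, variance sigma^2
Y = rng.normal(0.0, sigma, N)    # independent of X, hence uncorrelated

t, tau = 0.4, 0.9
I1 = X * np.cos(w * t) + Y * np.sin(w * t)                  # I(t)
Q2 = Y * np.cos(w * (t + tau)) - X * np.sin(w * (t + tau))  # Q(t + tau)

R_IQ = np.mean(I1 * Q2)                  # Monte Carlo cross-correlation
target = -sigma**2 * np.sin(w * tau)     # closed form -sigma^2 sin(w*tau)
```

The estimate depends only on τ, not on the chosen t, consistent with I(t) and Q(t) being jointly wide-sense stationary.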
Example 4:
Let X(t) be a random process with X(t) = A cos(ω0 t + θ), where A and ω0 are constants and θ is a random variable uniformly distributed in the interval [0, 2π].
Is X(t) ergodic in the mean and in the autocorrelation function?
Solution:
We found (Example 2) that E[X(t)] = 0 = X̄ and
R_XX(τ) = (A²/2) cos(ω0 τ)
Now compute the time averages; note that θ is a constant for any given sample function:
x̄ = A[x(t)] = lim_{T→∞} (1/2T) ∫_{−T}^{T} A cos(ω0 t + θ) dt
= (A/ω0) lim_{T→∞} (1/2T) [sin(ω0 t + θ)]_{−T}^{T} = 0
X̄ = x̄ ⇒ X(t) is ergodic in the mean.
R̄_XX(τ) = lim_{T→∞} (1/2T) ∫_{−T}^{T} A cos(ω0 t + θ) · A cos(ω0(t + τ) + θ) dt
= (A²/2) lim_{T→∞} (1/2T) ∫_{−T}^{T} [cos(ω0 τ) + cos(2ω0 t + ω0 τ + 2θ)] dt
= (A²/2) cos(ω0 τ) lim_{T→∞} (1/2T) ∫_{−T}^{T} dt = (A²/2) cos(ω0 τ) = R_XX(τ)
R̄_XX(τ) = R_XX(τ) ⇒ X(t) is ergodic in the autocorrelation.
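Ergodicity can also be illustrated numerically (a sketch, with assumed values A = 1, ω0 = 2π, and a finite averaging window standing in for T → ∞): the time averages computed from a single sample function approach the ensemble values found above.

```python
import numpy as np

rng = np.random.default_rng(4)
A, w0 = 1.0, 2 * np.pi
theta = rng.uniform(0.0, 2 * np.pi)   # one outcome -> one sample function

T = 100.0                             # finite stand-in for T -> infinity
t = np.linspace(-T, T, 400_001)
x = A * np.cos(w0 * t + theta)

time_mean = np.mean(x)                # approximates A[x(t)]

tau = 0.1
x_shift = A * np.cos(w0 * (t + tau) + theta)
time_autocorr = np.mean(x * x_shift)  # approximates A[x(t) x(t + tau)]
target = (A**2 / 2) * np.cos(w0 * tau)    # ensemble R_XX(tau)
```

The same numbers come out for any fixed θ, which is the point: one sufficiently long sample function carries the process's mean and autocorrelation.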
4. Spectral characteristics
In the previous chapter, we introduced random processes and presented their temporal characteristics. In this chapter, we present another aspect: the representation of random processes in the frequency domain. Throughout, we assume that the random processes are wide-sense stationary.
5. Power spectral density
5.1 Deterministic signals
We know that the Fourier transform of a deterministic signal s(t) is given by:
S(ω) = ∫_{−∞}^{+∞} s(t) e^{−jωt} dt
S(ω) is sometimes called the spectrum of s(t).
In going from the time-domain description s(t) to the frequency-domain description S(ω), no information about the signal is lost: S(ω) forms a complete description of s(t) and vice versa. The signal s(t) can be recovered using the inverse Fourier transform:
s(t) = (1/2π) ∫_{−∞}^{+∞} S(ω) e^{jωt} dω
or
s(t) = ∫_{−∞}^{+∞} S(f) e^{j2πft} df
5.2 Random processes
If the random process X(t) is stationary in the wide sense, then the power spectral density S_XX(ω) can be expressed as the Fourier transform of the autocorrelation function R_XX(τ):
S_XX(ω) = ∫_{−∞}^{+∞} R_XX(τ) e^{−jωτ} dτ
or
S_XX(f) = ∫_{−∞}^{+∞} R_XX(τ) e^{−j2πfτ} dτ
As for deterministic signals, the autocorrelation function R_XX(τ) can be obtained from the power spectral density using the inverse Fourier transform:
R_XX(τ) = (1/2π) ∫_{−∞}^{+∞} S_XX(ω) e^{jωτ} dω
or
R_XX(τ) = ∫_{−∞}^{+∞} S_XX(f) e^{j2πfτ} df
The transform pair R_XX(τ) ↔ S_XX(ω) is sometimes called the Wiener-Khinchin relations.
The average power is given by:
P_XX = ∫_{−∞}^{+∞} S_XX(f) df = (1/2π) ∫_{−∞}^{+∞} S_XX(ω) dω
Note also that P_XX is given by:
P_XX = lim_{T→∞} (1/2T) ∫_{−T}^{T} E[X²(t)] dt
5.3 Properties of the power density spectrum
The power spectral density has several properties:
S_XX(ω) ≥ 0
S_XX(−ω) = S_XX(ω) for X(t) real
S_XX(ω) is real
P_XX = (1/2π) ∫_{−∞}^{+∞} S_XX(ω) dω = A[E[X²(t)]], the time average of the mean-square value
Example 5
Let X(t) be a wide-sense stationary process with autocorrelation function
R_XX(τ) = A(1 − |τ|/T) for −T ≤ τ ≤ T, and 0 otherwise,
where T > 0 and A are constants.
Determine the power spectral density.
Solution:
Using the Fourier transform of the triangular pulse (a standard transform pair):
S_XX(ω) = ∫_{−T}^{T} A(1 − |τ|/T) e^{−jωτ} dτ = A T [sin(ωT/2) / (ωT/2)]²
[Table: Fourier transforms of some common functions.]
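The triangular autocorrelation and its sinc²-shaped spectrum form a standard transform pair, which can be verified numerically; A = 2 and T = 1 are assumed values for the check:

```python
import numpy as np

A, T = 2.0, 1.0                          # assumed constants for the check
tau = np.linspace(-T, T, 20_001)
dtau = tau[1] - tau[0]
R = A * (1.0 - np.abs(tau) / T)          # triangular autocorrelation

def S(w):
    """Numerical S_XX(w) = integral of R(tau) exp(-j*w*tau) d tau."""
    return float(np.sum(R * np.exp(-1j * w * tau)).real * dtau)

def S_analytic(w):
    """Closed form A*T*(sin(wT/2)/(wT/2))^2, with the w = 0 limit A*T."""
    if w == 0.0:
        return A * T
    half = w * T / 2.0
    return A * T * (np.sin(half) / half) ** 2
```

S(0) equals the area under R_XX, i.e. A·T, and S(ω) stays nonnegative, consistent with the properties listed in Section 5.3.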