Data Communications (40-883) – Random Processes – Second Semester 92-93



1

Data Communications (40-883)

Random Processes, Second Semester 92-93

Afshin Hemmatyar

Department of Computer Engineering

2

Random Process

• Introduction
• Mathematical Definition
• Stationary Process
• Mean, Correlation, and Covariance
• Ergodic Process
• Random Process through LTI Filter
• Power Spectral Density
• Gaussian Process
• White Noise
• Narrowband Noise

3

Introduction

• Deterministic Model
– No uncertainty about the time-dependent behavior at any instant of time

• Stochastic (Random) Model
– Probability of a future value lying between two specified limits
– Example:

Received signal = Information-bearing signal + Interference + Channel noise

4

Mathematical Definition (1)

– Each outcome of the experiment is associated with a “sample point”.

– The set of all possible outcomes of the experiment is called the “sample space”.

– A function of time is assigned to each sample point: X(t, s), -T ≤ t ≤ T, where 2T is the total observation interval.

– “Sample function” of the random process: x_j(t) = X(t, s_j)

– Random variables obtained by observing the process at time t_k: {x_1(t_k), x_2(t_k), ..., x_n(t_k)} = {X(t_k, s_1), X(t_k, s_2), ..., X(t_k, s_n)}
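A minimal numpy sketch of these definitions, using a random-phase sinusoid as a hypothetical example process (the process, its parameters, and all variable names are assumptions for illustration, not taken from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example process: X(t, s) = A*cos(2*pi*f*t + Theta(s)),
# where the random phase Theta is drawn once per sample point s.
A, f = 1.0, 5.0                      # amplitude and frequency (assumed values)
T = 1.0                              # observation window is -T <= t <= T (length 2T)
t = np.linspace(-T, T, 1000)         # time axis shared by all sample functions
n = 8                                # number of sample points s_1, ..., s_n

theta = rng.uniform(0, 2 * np.pi, size=n)                   # one outcome per sample point
ensemble = A * np.cos(2 * np.pi * f * t + theta[:, None])   # row j is the sample function x_j(t)

# Observing the whole ensemble at a fixed time t_k gives the random variable X(t_k):
k = 300
samples_at_tk = ensemble[:, k]       # {x_1(t_k), ..., x_n(t_k)}
print(samples_at_tk)
```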

5

Mathematical Definition (2)

An ensemble of sample functions

6

Mathematical Definition (3)

– Random process X(t): “An ensemble of time functions together with a probability rule that assigns a probability to any meaningful event associated with an observation of one of the sample functions of the random process.”

• For a random variable, the outcome of a random experiment is mapped into a number.
• For a random process, the outcome of a random experiment is mapped into a waveform that is a function of time.

7

Stationary Process (1)

• Strictly Stationary:

F_{X(t_1+τ),...,X(t_k+τ)}(x_1, ..., x_k) = F_{X(t_1),...,X(t_k)}(x_1, ..., x_k)

(F is the joint distribution function.)

“A random process X(t), initiated at time t = -∞, is strictly stationary if the joint distribution of any set of random variables obtained by observing the random process X(t) is invariant with respect to the location of the origin t = 0.”

8

Stationary Process (2)

• Strictly Stationary:

F_{X(t_1+τ),...,X(t_k+τ)}(x_1, ..., x_k) = F_{X(t_1),...,X(t_k)}(x_1, ..., x_k)

(F is the joint distribution function.)

1) k = 1: F_{X(t+τ)}(x) = F_{X(t)}(x) = F_X(x) for all t and τ

The first-order distribution function of a stationary process is independent of time.

2) k = 2 and τ = -t_1: F_{X(t_1),X(t_2)}(x_1, x_2) = F_{X(0),X(t_2-t_1)}(x_1, x_2) for all t_1 and t_2

The second-order distribution function of a stationary process depends only on the time difference between the observation times.
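A quick numeric check of point 1), again using the hypothetical random-phase sinusoid (an assumed example, not from the slides): its first-order mean and variance do not depend on the observation time.

```python
import numpy as np

rng = np.random.default_rng(10)

# Random-phase sinusoid: strictly stationary, so first-order statistics are time-invariant.
A, f = 1.0, 5.0
N = 200000
theta = rng.uniform(0, 2 * np.pi, size=N)

for t in (0.00, 0.13, 0.70):                          # three arbitrary observation times
    samples = A * np.cos(2 * np.pi * f * t + theta)   # X(t) across the ensemble
    print(t, round(samples.mean(), 3), round(samples.var(), 3))
# The mean (~0) and variance (~A**2/2 = 0.5) come out the same at every t, consistent
# with a first-order distribution that is independent of time.
```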

9

Stationary Process (3)

Example:

10

Mean

• Mean: “Expectation of the random variable obtained by observing the process at some time t”

μ_X(t) = E[X(t)] = ∫ x f_{X(t)}(x) dx

f_{X(t)}(x) is the first-order probability density function of the process.

The mean of a strictly stationary process is a constant:

μ_X(t) = μ_X for all t

11

Correlation

• Correlation: “Expectation of the product of the two random variables X(t_1) and X(t_2), obtained by observing the process X(t) at times t_1 and t_2”

R_X(t_1, t_2) = E[X(t_1)X(t_2)] = ∫∫ x_1 x_2 f_{X(t_1),X(t_2)}(x_1, x_2) dx_1 dx_2

f_{X(t_1),X(t_2)}(x_1, x_2) is the second-order probability density function of the process.

• Autocorrelation of a strictly stationary process:

R_X(t_1, t_2) = R_X(t_2 - t_1) for all t_1 and t_2
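A sketch of how these ensemble averages can be estimated numerically (numpy; the random-phase sinusoid and its parameters are assumptions used only for illustration — for that process the theoretical values are μ_X = 0 and R_X(τ) = (A^2/2)cos(2πfτ)):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical random-phase sinusoid (same assumed example as before).
A, f = 1.0, 5.0
t = np.linspace(-1.0, 1.0, 400)
N = 20000                                            # ensemble size
theta = rng.uniform(0, 2 * np.pi, size=N)
X = A * np.cos(2 * np.pi * f * t + theta[:, None])   # N sample functions

# Ensemble mean mu_X(t) = E[X(t)]: average down the ensemble at each t.
mu_hat = X.mean(axis=0)                              # should be close to 0 for all t

# Autocorrelation R_X(t1, t2) = E[X(t1) X(t2)] for two chosen observation times.
i1, i2 = 100, 160
t1, t2 = t[i1], t[i2]
R_hat = np.mean(X[:, i1] * X[:, i2])

# For this process the theoretical value depends only on t2 - t1:
R_theory = (A**2 / 2) * np.cos(2 * np.pi * f * (t2 - t1))
print(np.abs(mu_hat).max(), R_hat, R_theory)         # small, then two matching values
```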

12

Covariance

• Autocovariance:

C_X(t_1, t_2) = E[(X(t_1) - μ_X)(X(t_2) - μ_X)] = R_X(t_2 - t_1) - μ_X^2

Points:
1) The mean and autocorrelation functions only provide a partial description of the distribution of a random process.
2) The conditions of the equations for the mean and autocorrelation are not sufficient to guarantee that the random process X(t) is strictly stationary.

13

Autocorrelation Properties

R_X(τ) = E[X(t+τ)X(t)] for all t

1) R_X(0) = E[X^2(t)] (mean-square value of the process)

2) R_X(τ) = R_X(-τ) (even function of τ)

3) |R_X(τ)| ≤ R_X(0) (maximum magnitude at τ = 0)

Proof of 3):
E[(X(t+τ) ± X(t))^2] ≥ 0
E[X^2(t+τ)] ± 2E[X(t+τ)X(t)] + E[X^2(t)] ≥ 0
2R_X(0) ± 2R_X(τ) ≥ 0
-R_X(0) ≤ R_X(τ) ≤ R_X(0)

14

Autocorrelation Example 1

15

Autocorrelation Example 2 (1)

16

Autocorrelation Example 2 (2)

17

Cross-Correlation (1)

Correlation matrix:

X(t) and Y(t) are stationary and jointly stationary.

The cross-correlation function is neither an even function of τ nor does it necessarily have its maximum at the origin, but it obeys the symmetry relation R_XY(τ) = R_YX(-τ).
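A numeric check of this symmetry, using a hypothetical pair of jointly stationary processes (a random-phase sinusoid X(t) and a delayed copy Y(t) = X(t - d); all parameters are assumed values):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical jointly stationary pair: X(t) random-phase sinusoid, Y(t) = X(t - d).
A, f, d = 1.0, 5.0, 0.03
t = np.linspace(0.0, 1.0, 500)
N = 50000
theta = rng.uniform(0, 2 * np.pi, size=N)
X = A * np.cos(2 * np.pi * f * t + theta[:, None])
Y = A * np.cos(2 * np.pi * f * (t - d) + theta[:, None])

# Estimate R_XY(tau) = E[X(t + tau) Y(t)] and R_YX(-tau) = E[Y(t - tau) X(t)].
i, lag = 200, 40                      # reference index and lag in samples
R_xy = np.mean(X[:, i + lag] * Y[:, i])
R_yx_neg = np.mean(Y[:, i - lag] * X[:, i])
print(R_xy, R_yx_neg)                 # the two estimates agree: R_XY(tau) = R_YX(-tau)
```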

18

Cross-Correlation (2)

19

Ergodic Process (1)

DC value of x(t) (time average of a single sample function):

μ_x(T) = (1/2T) ∫_{-T}^{T} x(t) dt

Mean of the process X(t): μ_X

The process is ergodic in the mean if μ_x(T) → μ_X (and the variance of μ_x(T) → 0) as T → ∞.

20

Ergodic Process (2)

Time-averaged autocorrelation of a single sample function:

R_x(τ, T) = (1/2T) ∫_{-T}^{T} x(t + τ) x(t) dt

The process is ergodic in the autocorrelation function if R_x(τ, T) → R_X(τ) as T → ∞.

Note: computing the time-averaged mean and autocorrelation requires that the process be stationary.
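A sketch contrasting the time averages of one sample function with the ensemble values, for a process known to be ergodic in the mean and the autocorrelation function (the random-phase sinusoid again; parameters and discretization are assumptions):

```python
import numpy as np

rng = np.random.default_rng(3)

# One sample function of the random-phase sinusoid, observed over -T <= t <= T.
A, f = 1.0, 5.0
T = 10.0
t = np.linspace(-T, T, 20001)
dt = t[1] - t[0]

theta = rng.uniform(0, 2 * np.pi)          # one outcome -> one sample function x(t)
x = A * np.cos(2 * np.pi * f * t + theta)

# Time-averaged mean (DC value), approximated by the sample mean on the uniform grid.
mu_time = x.mean()                         # ensemble mean is 0

# Time-averaged autocorrelation at lag tau = lag * dt.
lag = 30
R_time = np.mean(x[lag:] * x[:-lag])
R_theory = (A**2 / 2) * np.cos(2 * np.pi * f * lag * dt)   # ensemble autocorrelation
print(mu_time, R_time, R_theory)           # time averages match the ensemble values
```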

21

Random Process through LTI Filter
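This slide's derivations are carried by equations and a figure that are not reproduced in this preview. The standard results for a WSS input X(t) applied to an LTI filter with impulse response h(t) and frequency response H(f) are μ_Y = μ_X H(0) and S_Y(f) = |H(f)|^2 S_X(f). Below is a minimal numpy sketch of the PSD relation, assuming a hypothetical 5-tap FIR filter and discrete-time white Gaussian input:

```python
import numpy as np

rng = np.random.default_rng(4)

# S_Y(f) = |H(f)|^2 * S_X(f): check numerically with white Gaussian input (flat PSD).
sigma2 = 1.0                                  # input PSD level
h = np.array([0.2, 0.3, 0.3, 0.15, 0.05])     # assumed FIR impulse response (illustrative)
nfft, nblocks = 256, 4000
H = np.fft.fft(h, nfft)

psd_est = np.zeros(nfft)
for _ in range(nblocks):
    x = rng.normal(0.0, np.sqrt(sigma2), size=nfft)
    y = np.fft.ifft(np.fft.fft(x) * H).real    # circular convolution of the block with h
    psd_est += np.abs(np.fft.fft(y))**2 / nfft # periodogram of this output block
psd_est /= nblocks                             # averaged periodogram of the output

psd_theory = np.abs(H)**2 * sigma2             # |H(f)|^2 * S_X(f)
print(np.max(np.abs(psd_est - psd_theory)))    # small; shrinks as nblocks grows
```

Circular convolution is used so that the frequency-domain relation holds exactly per block; with an ordinary convolution the same result holds up to edge effects.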

22

Power Spectral Density (1)

(Power Spectral Density)

23

Power Spectral Density (2)

An example:

24

Power Spectral Density (3)

PROPERTIES (1)–(5) (see the sketch below)

(Probability Density Function)
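The equations for properties (1)–(5) are not preserved in this preview; assuming they are the usual textbook statements (these slides appear to track Haykin's treatment), a sketch in LaTeX:

```latex
% Sketch of the standard five PSD properties, assumed to correspond to (1)-(5) above.
\begin{align}
  \text{(1)}\quad & S_X(0) = \int_{-\infty}^{\infty} R_X(\tau)\,d\tau \\
  \text{(2)}\quad & \mathrm{E}[X^2(t)] = \int_{-\infty}^{\infty} S_X(f)\,df \\
  \text{(3)}\quad & S_X(f) \ge 0 \quad \text{for all } f \\
  \text{(4)}\quad & S_X(-f) = S_X(f) \\
  \text{(5)}\quad & p_X(f) = \frac{S_X(f)}{\int_{-\infty}^{\infty} S_X(f)\,df}
  \quad \text{(normalized PSD, with the properties of a probability density function)}
\end{align}
```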

25

Power Spectral Density (4)

26

Power Spectral Density (5)

27

Power Spectral Density (6)

28

Power Spectral Density (7)

Fourier transformable

(Periodogram)
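A single periodogram, |X_T(f)|^2 / 2T, is a noisy estimate of the PSD whose variance does not shrink with record length; averaging over independent records (or segments) is what makes it usable. A minimal numpy illustration with white Gaussian noise (record length and record count are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(5)

# Discrete-time periodogram: |FFT(x)|^2 / N for a record of N samples.
N, records = 512, 1000
single = np.abs(np.fft.fft(rng.normal(size=N)))**2 / N          # one record: very noisy

avg = np.zeros(N)
for _ in range(records):
    x = rng.normal(size=N)                                       # unit-variance white noise
    avg += np.abs(np.fft.fft(x))**2 / N
avg /= records                                                   # flat PSD ~ 1 expected

print(single.std(), avg.std())   # the averaged estimate has a much smaller spread around 1
```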

29

Power Spectral Density (8)

Cross-spectral densities: S_XY(f) and S_YX(f)

Cross-correlations: R_XY(τ) and R_YX(τ)

(Each cross-spectral density and the corresponding cross-correlation form a Fourier-transform pair.)

30

Gaussian Process (1)

Linear Functional of X(t):

Normalization

Probability Density Function

Gaussian Distribution:

31

Gaussian Process (2)

X_i, i = 1, 2, ..., N, is a set of random variables that satisfy:
1) The X_i are statistically independent.
2) The X_i have the same probability distribution with mean μ_X and variance σ_X^2.

(Independently and identically distributed (i.i.d.) set of random variables)

Normalized variable: Y_i = (X_i - μ_X) / σ_X

Defined variable: V_N = (1/√N) Σ_{i=1}^{N} Y_i

The central limit theorem states that the probability distribution of V_N approaches a normalized Gaussian distribution N(0, 1) in the limit as the number of random variables N approaches infinity.
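A numeric illustration of this statement (numpy; the uniform source distribution and the sizes N and trials are arbitrary choices, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(6)

# i.i.d. source: uniform on [0, 1), with mu_X = 0.5 and sigma_X**2 = 1/12.
mu_X, sigma_X = 0.5, np.sqrt(1.0 / 12.0)
N, trials = 400, 20000

X = rng.uniform(0.0, 1.0, size=(trials, N))
Y = (X - mu_X) / sigma_X            # normalized variables: zero mean, unit variance
V = Y.sum(axis=1) / np.sqrt(N)      # V_N, whose distribution tends to N(0, 1)

print(V.mean(), V.var())            # close to 0 and 1
print(np.mean(np.abs(V) < 1.96))    # close to 0.95, as for a standard Gaussian
```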

32

Gaussian Process (3)

Property 1: If a Gaussian process X(t) is applied to a stable linear filter, then the random process Y(t) produced at the filter output is also Gaussian.

33

Gaussian Process (4)

Property 2: The set of random variables obtained by observing a Gaussian process X(t) at times t_1, t_2, ..., t_n is jointly Gaussian; its joint probability density function is completely determined by the means and covariance functions of these variables.

34

Gaussian Process (5)

Property 3: If a Gaussian process is stationary in the wide sense, then it is also strictly stationary.

Property 4: If the random variables obtained by sampling a Gaussian process X(t) at times t_1, t_2, ..., t_n are uncorrelated, then they are statistically independent.

35

Noise (1)

Shot noise arises in electronic devices such as diodes and transistors because of the discrete nature of current flow in these devices.

h(t) is the waveform of a current pulse.
ν is the number of electrons emitted between t and t + t_0.

>> Poisson Distribution
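A sketch of the Poisson property (numpy; the emission rate lam and the window length t0 are assumed values): with exponential inter-emission times, the count ν of electrons emitted in a window of length t_0 has mean and variance both equal to λt_0.

```python
import numpy as np

rng = np.random.default_rng(7)

# Homogeneous Poisson process of emission times with rate lam (assumed).
lam, t_total, t0 = 50.0, 2000.0, 1.0

inter = rng.exponential(1.0 / lam, size=int(lam * t_total * 1.5))  # inter-emission times
times = np.cumsum(inter)
times = times[times < t_total]

# Count emissions in consecutive windows of length t0 and check the Poisson property.
counts = np.histogram(times, bins=np.arange(0.0, t_total + t0, t0))[0]
print(counts.mean(), counts.var())      # both close to lam * t0 = 50
```

The shot-noise current itself is the superposition of pulses h(t − τ_k) centered on these emission times.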

36

Noise (2)

Thermal noise is the electrical noise arising from the random motion of the electrons in a conductor.

37

Noise (3)

White noise is an idealized form of noise adopted for ease of analysis.

T_e, the equivalent noise temperature of a system, is defined as the temperature at which a noisy resistor has to be maintained such that, by connecting the resistor to the input of a noiseless version of the system, it produces the same available noise power at the output of the system as that produced by all the sources of noise in the actual system.

38

Noise (4)

• According to the autocorrelation function, any two different samples of white noise, no matter how close together in time they are taken, are uncorrelated.

• If the white noise is also Gaussian, then the two samples are statistically independent.

• White Gaussian noise represents the ultimate in randomness.

• White noise has infinite average power and, as such, is not physically realizable.

• The utility of the white noise process parallels that of an impulse function or delta function in the analysis of linear systems.
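A discrete-time sketch of the first two bullets (numpy; i.i.d. Gaussian samples stand in for white noise, with an assumed variance): the sample autocorrelation is the variance at lag 0 and, up to estimation error, zero at every other lag.

```python
import numpy as np

rng = np.random.default_rng(8)

# Discrete-time stand-in for white Gaussian noise: i.i.d. N(0, sigma2) samples.
sigma2, N = 2.0, 200000
w = rng.normal(0.0, np.sqrt(sigma2), size=N)

for lag in (0, 1, 5, 50):
    r = np.mean(w[lag:] * w[:N - lag])   # estimate of R_w(lag)
    print(lag, round(r, 3))              # ~2.0 at lag 0, ~0.0 elsewhere
```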

39

Noise (5)

40

Noise (6)

41

Narrowband Noise (1)

In-phase and quadrature components:

n(t) = n_I(t) cos(2πf_c t) - n_Q(t) sin(2πf_c t)

Properties (a numeric sketch follows this list):
1) Both components have zero mean.
2) If the narrowband noise n(t) is Gaussian, then both components are jointly Gaussian.
3) If the narrowband noise n(t) is stationary, then both components are jointly stationary.
4) Both components have the same power spectral density: S_{N_I}(f) = S_{N_Q}(f) = S_N(f - f_c) + S_N(f + f_c) for |f| ≤ B, and zero otherwise.
5) Both components have the same variance as the narrowband noise n(t).
6) The cross-spectral density of the components is purely imaginary: S_{N_I N_Q}(f) = -S_{N_Q N_I}(f).
7) If the narrowband noise n(t) is Gaussian and its power spectral density is symmetric about the mid-band frequency f_c, then the components are statistically independent.
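A sketch of the in-phase/quadrature representation and of coherent extraction of n_I(t) (numpy; the unit sampling rate, the carrier f_c = 0.05, and the crude moving-average low-pass filters are all assumptions made only for illustration):

```python
import numpy as np

rng = np.random.default_rng(9)

# Build n(t) = nI(t)*cos(2*pi*fc*t) - nQ(t)*sin(2*pi*fc*t) from two slowly varying
# Gaussian components, then recover nI by coherent demodulation + low-pass filtering.
N, fc = 200000, 0.05
t = np.arange(N)

def lowpass(x, L):
    """Crude moving-average low-pass filter of length L (illustration only)."""
    return np.convolve(x, np.ones(L) / L, mode="same")

nI = lowpass(rng.normal(size=N), 50)          # slowly varying in-phase component
nQ = lowpass(rng.normal(size=N), 50)          # slowly varying quadrature component
n = nI * np.cos(2 * np.pi * fc * t) - nQ * np.sin(2 * np.pi * fc * t)

# 2*n(t)*cos(2*pi*fc*t) = nI(t) + terms around 2*fc; a 10-sample window nulls a pure
# tone at 2*fc and strongly attenuates the modulated terms around it.
nI_rec = lowpass(2 * n * np.cos(2 * np.pi * fc * t), 10)

print(np.var(n), np.var(nI), np.var(nQ))                   # about the same variance (property 5)
print(np.corrcoef(nI[100:-100], nI_rec[100:-100])[0, 1])   # close to 1: nI recovered
```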

42

Narrowband Noise (2)

In-phase and quadrature components

43

Narrowband Noise (3)

Envelope and phase components:

r(t) = [n_I^2(t) + n_Q^2(t)]^(1/2),  ψ(t) = tan^(-1)[n_Q(t) / n_I(t)]

Phase ψ(t) >> Uniform Distribution

Envelope r(t) >> Rayleigh Distribution

44

Narrowband Noise (4)

Rayleigh Distribution

Normalized form >>
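The density itself and its normalized form are not reproduced in this preview; as a sketch, the standard expressions (with σ^2 the variance of the narrowband noise and v = r/σ) are:

```latex
% Standard Rayleigh density of the envelope R of narrowband Gaussian noise
% (variance sigma^2), and its normalized form with v = r/sigma:
\begin{align}
  f_R(r) &= \frac{r}{\sigma^2}\exp\!\left(-\frac{r^2}{2\sigma^2}\right), & r &\ge 0, \\
  f_V(v) &= v\exp\!\left(-\frac{v^2}{2}\right), & v &\ge 0.
\end{align}
```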

45

Sine-Wave plus Narrowband Noise (1)

>> Rician Distribution

46

Sine-Wave plus Narrowband Noise (2)

Rician Distribution

Normalized form >>
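As with the Rayleigh case, the equations are not reproduced in this preview; the standard Rician expressions (with σ^2 the noise variance, A the sine-wave amplitude, v = r/σ, and a = A/σ) are sketched below.

```latex
% Standard Rician density of the envelope R of a sine wave of amplitude A plus
% narrowband Gaussian noise of variance sigma^2; I_0 is the zeroth-order modified
% Bessel function of the first kind.
\begin{align}
  f_R(r) &= \frac{r}{\sigma^2}
            \exp\!\left(-\frac{r^2 + A^2}{2\sigma^2}\right)
            I_0\!\left(\frac{Ar}{\sigma^2}\right), & r &\ge 0, \\
  f_V(v) &= v\exp\!\left(-\frac{v^2 + a^2}{2}\right) I_0(av), & v &\ge 0.
\end{align}
```

For a = 0 (no sine wave), I_0(0) = 1 and the Rician form reduces to the Rayleigh density of the previous slides.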
