
《信息论与编码》 《Information Theory and Coding》

Dr. Li Chen Associate Professor, School of Information Science and Technology (SIST)

Office: 631C, SIST building

Email: [email protected]

Web: sist.sysu.edu.cn/~chenli

《Information Theory and Coding》

Textbooks:

1. 《Elements of Information Theory》, by T. Cover and J. Thomas, Wiley (introduced by Tsinghua University Press), 2003.

2. 《Non-binary Error Control Coding for Wireless Communication and Data Storage》, by R. Carrasco and M. Johnston, Wiley, 2008.

3. 《Error Control Coding》, by S. Lin and D. Costello, Prentice Hall, 2004.

4. 《信息论与编码理论》 (Information Theory and Coding Theory), by 王育民 (Wang Yumin) and 李晖 (Li Hui), Higher Education Press (高等教育出版社), 2013.

Outline

Chapter 1: Fundamentals of Information Theory (2 weeks)

Chapter 2: Data Compression: Source Coding (2 weeks)

Chapter 3: An Introduction to Channel Coding (2 weeks)

Chapter 4: Convolutional Codes and Trellis Coded Modulation (4 weeks)

Chapter 5: Turbo Codes (2 weeks)

Chapter 6: Low-Density Parity-Check Codes (3 weeks)

Evolution of Communications

[Timeline: analogue communications gave way to digital communications in the late 80s to early 90s, evolving through 1G, 2G, 2.5G, 3G and 4G. Information theory and coding techniques underpin this evolution, drawing on both EE and CS.]

Chapter 1 Fundamentals of Information Theory

• 1.1 An Introduction to Information

• 1.2 Measure of Information

• 1.3 Average Information (Entropy)

• 1.4 Channel Capacity

• What is information?

• How do we measure information?

Let us look at the following sentences:

1) "I will be one year older next year." A certainty, so no information. (Boring!)

2) "I was born in the 1990s." Some information. (Interesting, so which year?)

3) "I was born in 1993." More information. (Being frank!)

The number of possibilities should be linked to the information!

§1.1 An Introduction to Information

Let us do the following game:

Throw a die once: you have 6 possible outcomes, {1, 2, 3, 4, 5, 6}.

Throw three dice: you have 6^3 = 216 possible outcomes.

Information should be 'additive'.

§1.1 An Introduction to Information

We can use the 'logarithm' to scale down a huge number of possibilities.

Number (binary bit) permutations are used to represent all the possibilities.
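As a minimal sketch of this idea (values illustrative): a uniform choice among M possibilities carries log2(M) bits, and the bit counts of independent experiments add up.

```python
import math

# A uniform choice among M possibilities carries log2(M) bits.
one_die = math.log2(6)          # one die throw: ~2.585 bits
three_dice = math.log2(6 ** 3)  # three throws: 216 possible outcomes

# 'Additive': log2(6^3) = 3 * log2(6)
print(one_die, three_dice, 3 * one_die)  # 2.585..., 7.755..., 7.755...
```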

Let us look at the following problem.

§1.1 An Introduction to Information

Finally, let us look into the following game.

The measure of information should consider the probabilities of the various possible events.

Observations: …

§1.2 Measure of Information


Total amount of information = 14 bits. Is it right?
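A minimal sketch, assuming the standard self-information measure I(x) = log2(1/P(x)) bits; the source and message below are illustrative, not the slide's example:

```python
import math

def self_information(p: float) -> float:
    """Self-information, in bits, of an event with probability p."""
    return math.log2(1.0 / p)

# Illustrative four-symbol source; the total information of a message is
# the sum of the self-information of its symbols.
probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
message = "aabacd"
total = sum(self_information(probs[s]) for s in message)
print(total)  # 1 + 1 + 2 + 1 + 3 + 3 = 11 bits
```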

§1.3 Average Information (Entropy (熵))

Example 1.2:

§1.3 Average Information (Entropy)

Binary Entropy Function
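The binary entropy function is H(p) = -p log2(p) - (1-p) log2(1-p); a small sketch showing that it peaks at 1 bit when p = 0.5:

```python
import math

def binary_entropy(p: float) -> float:
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

for p in (0.0, 0.1, 0.2, 0.5, 0.9):
    print(p, round(binary_entropy(p), 3))  # maximum H(0.5) = 1.0 bit
```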

§1.3 Average Information (Entropy)

Mutual Information of a channel

Source → Channel → Sink

I(X, Y) = H(X) - H(X|Y): H(X) is how much we don't know BEFORE the channel observations; H(X|Y) is how much we still don't know AFTER the channel observations.

§1.3 Average Information (Entropy)

Properties of mutual information

§1.3 Average Information (Entropy)

Remark: mutual information I(X, Y) describes the amount of information one variable X contains about the other variable Y, or vice versa, as in I(Y, X).

Example 1.3: Given the binary symmetric channel shown as follows:

[BSC diagram: P(y=0|x=0) = P(y=1|x=1) = 0.8; P(y=1|x=0) = P(y=0|x=1) = 0.2]

Please determine the mutual information of such a channel.

Solution:

- Entropy of the binary source, with P(x=0) = 0.7 and P(x=1) = 0.3, is

H(X) = 0.7 log2(1/0.7) + 0.3 log2(1/0.3) = 0.881 bits/sym

- The joint probabilities follow from P(x, y) = P(x)P(y|x), e.g. P(x=0, y=0) = 0.7 × 0.8 = 0.56, and the posteriors from P(x|y) = P(x, y)/P(y).

- Hence, the conditional entropy is:

H(X|Y) = P(x=0, y=0) log2(1/P(x=0|y=0)) + P(x=1, y=0) log2(1/P(x=1|y=0)) + P(x=0, y=1) log2(1/P(x=0|y=1)) + P(x=1, y=1) log2(1/P(x=1|y=1))

= 0.56 log2(1/0.90) + 0.06 log2(1/0.10) + 0.14 log2(1/0.37) + 0.24 log2(1/0.63)

= 0.644 bits/sym

- The mutual information is:

I(X, Y) = H(X) - H(X|Y) = 0.881 - 0.644 = 0.237 bits/sym
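A quick numerical check of Example 1.3 (all probabilities taken from the solution above; exact arithmetic differs from the slide's rounded intermediate values only in the third decimal):

```python
import math

# BSC with crossover probability 0.2; source P(x=0) = 0.7, P(x=1) = 0.3.
P_x = {0: 0.7, 1: 0.3}
P_y_given_x = {(0, 0): 0.8, (0, 1): 0.2, (1, 0): 0.2, (1, 1): 0.8}  # (x, y)

P_xy = {(x, y): P_x[x] * P_y_given_x[(x, y)] for x in (0, 1) for y in (0, 1)}
P_y = {y: sum(P_xy[(x, y)] for x in (0, 1)) for y in (0, 1)}

H_X = sum(p * math.log2(1 / p) for p in P_x.values())
# H(X|Y) = sum over (x, y) of P(x, y) * log2(1 / P(x|y)), P(x|y) = P(x,y)/P(y)
H_X_given_Y = sum(p * math.log2(P_y[y] / p) for (x, y), p in P_xy.items())
print(round(H_X, 3), round(H_X_given_Y, 3), round(H_X - H_X_given_Y, 3))
# 0.881, 0.645, 0.236 bits/sym
```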

§1.4 Channel Capacity

Source → Channel → Sink

C defines the maximum amount of information that a channel can reliably convey; it is the mutual information maximized over all input distributions.

§1.4 Channel Capacity

Channel Capacity for Binary Symmetric Channel (BSC)

Assume equiprobable inputs: P(x=0) = P(x=1) = 1/2. Then

P(x=0, y=0) = P(x=0)P(y=0|x=0) = (1/2)(1-p)

P(x=0, y=1) = P(x=0)P(y=1|x=0) = (1/2)p

P(y=0) = (1/2)(1-p) + (1/2)p = 1/2, and likewise P(y=1) = 1/2

so that P(x=0|y=0) = P(x=0, y=0)/P(y=0) = 1-p and P(x=0|y=1) = p.

§1.4 Channel Capacity

Mutual information function of BSC

I(x, y) = 1 + p log2(p) + (1-p) log2(1-p)
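This is just C_BSC = 1 - H(p), with H(p) the binary entropy function; a small evaluation sketch:

```python
import math

def bsc_capacity(p: float) -> float:
    """C = 1 + p*log2(p) + (1-p)*log2(1-p) = 1 - H(p), bits per channel use."""
    if p in (0.0, 1.0):
        return 1.0
    return 1 + p * math.log2(p) + (1 - p) * math.log2(1 - p)

print(round(bsc_capacity(0.2), 3))  # ~0.278 (the channel of Example 1.3)
print(bsc_capacity(0.5))            # 0.0: pure noise conveys nothing
```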

§1.4 Channel Capacity

Channel Capacity for Binary Erasure Channel (BEC)

[BEC diagram: input 0 → output 0 and input 1 → output 1 with probability 1-p; either input → erasure 'e' with probability p]

P(X=0, Y=0) = P(X=0|Y=0)P(Y=0) = (1/2)(1-p)

P(X=1, Y=1) = P(X=1|Y=1)P(Y=1) = (1/2)(1-p)

P(X=0, Y=e) = P(X=0|Y=e)P(Y=e) = (1/2)p

P(X=1, Y=e) = P(X=1|Y=e)P(Y=e) = (1/2)p

§1.4 Channel Capacity

Mutual information function of BEC

I(x, y) = 1 - p
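A sketch verifying I(x, y) = 1 - p numerically from the joint probabilities above (the erasure probability is chosen arbitrarily):

```python
import math

p = 0.3  # illustrative erasure probability
P_x = {0: 0.5, 1: 0.5}
P_xy = {(0, 0): 0.5 * (1 - p), (1, 1): 0.5 * (1 - p),
        (0, "e"): 0.5 * p, (1, "e"): 0.5 * p}
P_y: dict = {}
for (x, y), pr in P_xy.items():
    P_y[y] = P_y.get(y, 0.0) + pr

# I(X; Y) = sum P(x, y) * log2( P(x, y) / (P(x) * P(y)) )
I = sum(pr * math.log2(pr / (P_x[x] * P_y[y])) for (x, y), pr in P_xy.items())
print(round(I, 6), 1 - p)  # both 0.7
```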

§1.4 Channel Capacity

Channel Capacity for AWGN Channel

§1.4 Channel Capacity

C = (1/2) B log2(1 + SNR) bits/sec (for a 1-dim. signal: yi, xi and ni are real numbers)

C = B log2(1 + SNR) bits/sec (for a 2-dim. signal: yi, xi and ni are complex numbers)
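A small sketch evaluating these formulas (the bandwidth and SNR values are illustrative):

```python
import math

def awgn_capacity(bandwidth_hz: float, snr_db: float, complex_signal: bool = True) -> float:
    """C = B*log2(1 + SNR) bits/sec; the 1-dim (real) case takes the 1/2 factor."""
    snr = 10 ** (snr_db / 10)  # dB to linear ratio
    c = bandwidth_hz * math.log2(1 + snr)
    return c if complex_signal else 0.5 * c

print(awgn_capacity(1e6, 10))  # 1 MHz at 10 dB SNR -> ~3.46e6 bits/sec
```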

§1.4 Channel Capacity

- How to realize the channel capacity in a practical communication system?

- With the existence of channel impairments, how can we secure the recovery of the transmitted data?

- The use of channel codes: map arbitrary k information bits into an n-bit codeword, with n > k. The introduced redundancy (n - k bits) can correct the errors brought by the channel. We call r = k/n the code rate.

§1.4 Channel Capacity

- In a general communication system that employs a channel code of rate r and a modulation scheme of order m, we need

r · m < C

- Es: average symbol energy; Ec: average coded bit energy; Eb: average information bit energy. Their relationships are

Es = m·Ec, Ec = Eb·r

- To secure reliable communication, we need

r·m < log2(1 + Es/N0) = log2(1 + (Eb/N0)·m·r)

- Hence, with Eb/N0 > (2^(m·r) - 1)/(m·r), reliable communication is possible.
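For example, with the rate-half code and QPSK used below (r = 0.5, m = 2, so m·r = 1), the bound gives Eb/N0 > (2^1 - 1)/1 = 1, i.e. 0 dB; a quick sketch:

```python
import math

def min_ebn0_db(r: float, m: int) -> float:
    """Smallest Eb/N0 (dB) permitting reliable communication at m*r bits/sym."""
    mr = m * r
    return 10 * math.log10((2 ** mr - 1) / mr)

print(min_ebn0_db(0.5, 2))  # 0.0 dB: the capacity limit for r = 0.5, QPSK
```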

§1.4 Channel Capacity

Now, we are ready to materialize a communication system:

Source data → Source coding → Channel coding → Modulation → Channel (CH) → Demodulation → Channel decoding → Source decoding → Estimation of source data

- Let us examine the performance of various coded communication systems using a rate-half (r = 0.5) channel code and QPSK (m = 2).

§1.4 Channel Capacity

BER Performance of Channel Codes

[BER vs. SNR (dB) curves, SNR from -4 to 10 dB, BER from 1.0E+00 down to 1.0E-06: uncoded; Convolutional (4 states), BCJR; RS (63, 31), BM; RS (63, 31), KV; Herm (64, 32), Sakata; Herm (64, 32), KV; Turbo (4 states), 20 iterations; and the capacity limit.]