Information Theory (資訊理論)
Instructor: 陳建源
Email: [email protected]
Office: 法 401
Website: http://www.csie.nuk.edu.tw/~cychen/
Ch4: Channel

4.1 Introduction
Communication system:

Source → Coder → Channel → Decoder → Recipient

The channel may be, for example, a radio link or an optical fibre.

The input alphabet of the channel is the output alphabet of the coder.
The output alphabet of the channel is the input alphabet of the decoder.
The output alphabet of the channel need not be the same as its input alphabet.
A noisy channel is characterized by the probability that a given output letter stems from an input letter.

Noiseless channel: a → Channel → b (each input letter yields a single output letter).
Noisy channel: a → Channel → b1, b2, … (an input letter may yield different output letters).
Memory: the output letter depends upon a sequence of input letters.
4.2 Capacity of a memoryless channel

A memoryless channel with input alphabet A = {a_1, …, a_n} and output alphabet B = {b_1, …, b_r} is completely specified by giving the transition probabilities P(b_s|a_k), s = 1, …, r, k = 1, …, n, where

$$\sum_{s=1}^{r} P(b_s \mid a_k) = 1 \quad \text{for each } k.$$

The probability of the output letter $b_s$ is

$$P(b_s) = \sum_{k=1}^{n} P(b_s \mid a_k)\, P(a_k).$$
If the transition probabilities are fixed, only the input probabilities can be manipulated.

Mutual information between the input and the output:

$$I(A,B) = \sum_{k=1}^{n} \sum_{s=1}^{r} P(a_k)\, P(b_s \mid a_k) \log \frac{P(b_s \mid a_k)}{P(b_s)}.$$
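As a numerical companion to this formula, a short Python sketch (the function name `mutual_information` is mine, not from the text) computes I(A,B) from an input distribution and a transition matrix:

```python
import math

def mutual_information(p_a, Q):
    """I(A,B) = sum_k sum_s P(a_k) P(b_s|a_k) log2( P(b_s|a_k) / P(b_s) ), in bits."""
    n, r = len(p_a), len(Q[0])
    # Output distribution: P(b_s) = sum_k P(b_s|a_k) P(a_k)
    p_b = [sum(p_a[k] * Q[k][s] for k in range(n)) for s in range(r)]
    return sum(p_a[k] * Q[k][s] * math.log2(Q[k][s] / p_b[s])
               for k in range(n) for s in range(r)
               if p_a[k] > 0 and Q[k][s] > 0)

# Sanity check: a noiseless binary channel with a uniform input carries 1 bit.
print(mutual_information([0.5, 0.5], [[1, 0], [0, 1]]))  # 1.0
```

Terms with zero probability are skipped, matching the convention 0·log 0 = 0.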
Def: The capacity C of a memoryless channel is defined by

$$C = \max I(A,B),$$

the maximum being taken over all possible input probabilities p_1, p_2, …, p_n while the transition probabilities P(b_s|a_k) are held fixed. Since the input probabilities satisfy the constraint $\sum_{k=1}^{n} P(a_k) = 1$, the maximum of I(A,B) can be found by the method of Lagrange multipliers.
Given $\sum_{k=1}^{n} P(a_k) = 1$, maximize

$$I(A,B) - \lambda \sum_{k=1}^{n} P(a_k).$$

Partial differentiation with respect to $P(a_k)$ yields

$$\sum_{s=1}^{r} P(b_s \mid a_k) \log \frac{P(b_s \mid a_k)}{P(b_s)} - \log e = \lambda,$$

so that

$$C = \lambda + \log e \quad \text{if } P(a_k) \neq 0 \text{ for all } k.$$
Example 4.2a (binary symmetric channel)

A = {a1 = 0, a2 = 1}, B = {b1 = 0, b2 = 1}, with transition probabilities

$$P(b_1 \mid a_1) = P(b_2 \mid a_2) = 1 - \varepsilon, \qquad P(b_2 \mid a_1) = P(b_1 \mid a_2) = \varepsilon.$$

Let $p_k = P(a_k)$, i.e. $p_1 = P(a_1)$, $p_2 = P(a_2)$. Taking the partial derivative of $I(A,B) - \lambda \sum_{k=1}^{2} P(a_k)$ with respect to $p_k$ gives

$$\sum_{s=1}^{2} P(b_s \mid a_k) \log \frac{P(b_s \mid a_k)}{P(b_s)} - \log e - \lambda = 0,$$

so that

$$C = \lambda + \log e = \sum_{s=1}^{2} P(b_s \mid a_k) \log \frac{P(b_s \mid a_k)}{P(b_s)}.$$

When $p_1 = p_2 = 1/2$ we get $P(b_1) = P(b_2) = 1/2$, and for k = 1

$$C = (1-\varepsilon)\log\frac{1-\varepsilon}{1/2} + \varepsilon\log\frac{\varepsilon}{1/2} = 1 + (1-\varepsilon)\log(1-\varepsilon) + \varepsilon\log\varepsilon,$$

and likewise for k = 2.
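A quick check of this result in Python (the helper names `h2`, `bsc_capacity`, `bsc_mi` are mine): a grid search over the input probability confirms that the uniform input attains C = 1 + ε log ε + (1-ε) log(1-ε):

```python
import math

def h2(x):
    """Binary entropy in bits."""
    return 0.0 if x in (0.0, 1.0) else -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def bsc_capacity(eps):
    """C = 1 + eps*log2(eps) + (1-eps)*log2(1-eps), i.e. 1 - H(eps)."""
    return 1.0 - h2(eps)

def bsc_mi(p1, eps):
    """I(A,B) for input P(a1) = p1: H(B) - H(B|A), with P(b1) = p1(1-eps) + (1-p1)eps."""
    return h2(p1 * (1 - eps) + (1 - p1) * eps) - h2(eps)

# Grid search over the input: the maximum sits at the uniform input p1 = 1/2.
eps = 0.1
best = max(bsc_mi(p1 / 100, eps) for p1 in range(101))
print(round(best, 6), round(bsc_capacity(eps), 6))  # both ≈ 0.531
```

The grid maximum and the closed-form value coincide, as the example predicts.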
Example 4.2b (binary erasure channel)

A = {a1 = 0, a2 = 1}, B = {b1 = 0, b2 = 1, b3 = 2}, with transition probabilities

$$P(b_1 \mid a_1) = P(b_2 \mid a_2) = 1 - \varepsilon, \qquad P(b_3 \mid a_1) = P(b_3 \mid a_2) = \varepsilon.$$

As before,

$$\sum_{s=1}^{3} P(b_s \mid a_k) \log \frac{P(b_s \mid a_k)}{P(b_s)} - \log e - \lambda = 0,$$

so that

$$C = \lambda + \log e = \sum_{s=1}^{3} P(b_s \mid a_k) \log \frac{P(b_s \mid a_k)}{P(b_s)}.$$

With $p_1 = p_2 = 1/2$ we get $P(b_1) = P(b_2) = \frac{1-\varepsilon}{2}$, $P(b_3) = \varepsilon$, and for k = 1

$$C = (1-\varepsilon)\log\frac{1-\varepsilon}{(1-\varepsilon)/2} + \varepsilon\log\frac{\varepsilon}{\varepsilon} = (1-\varepsilon)\log 2 = 1 - \varepsilon,$$

and likewise for k = 2.
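The value C = 1 - ε can be verified numerically; a sketch with my own helper `bec_mi`:

```python
import math

def bec_mi(p1, eps):
    """I(A,B) = H(B) - H(B|A) for the binary erasure channel, with input P(a1) = p1."""
    # Output distribution: b1 w.p. p1(1-eps), b2 w.p. (1-p1)(1-eps), erasure b3 w.p. eps.
    p_b = [p1 * (1 - eps), (1 - p1) * (1 - eps), eps]
    h_b = -sum(t * math.log2(t) for t in p_b if t > 0)
    # H(B|A) = H(eps) regardless of the input letter.
    h_b_given_a = 0.0 if eps in (0.0, 1.0) else -eps * math.log2(eps) - (1 - eps) * math.log2(1 - eps)
    return h_b - h_b_given_a

eps = 0.25
c = max(bec_mi(p1 / 100, eps) for p1 in range(101))
print(round(c, 6))  # 0.75 = 1 - eps
```

The maximum is again attained at the uniform input.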
Example 4.2c

A = {a1 = 0, a2 = 1, a3 = 2}, B = {b1 = 0, b2 = 1}, with transition probabilities

$$P(b_1 \mid a_1) = 1, \qquad P(b_1 \mid a_2) = P(b_2 \mid a_2) = \tfrac{1}{2}, \qquad P(b_2 \mid a_3) = 1.$$

As before,

$$\sum_{s=1}^{2} P(b_s \mid a_k) \log \frac{P(b_s \mid a_k)}{P(b_s)} - \log e - \lambda = 0,$$

so that $C = \lambda + \log e = \sum_{s=1}^{2} P(b_s \mid a_k) \log \frac{P(b_s \mid a_k)}{P(b_s)}$.

We obtain $C = 1$ with $P(b_1) = P(b_2) = 1/2$. Checking the sum for each input:

k = 1: $1 \cdot \log\frac{1}{1/2} = 1$;
k = 2: $\frac{1}{2}\log\frac{1/2}{1/2} + \frac{1}{2}\log\frac{1/2}{1/2} = 0$;
k = 3: $1 \cdot \log\frac{1}{1/2} = 1$.

The capacity is achieved without using the input $a_2$.
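The per-input check above can be scripted; in the sketch below (the matrix `Q` and helper `d_k` are my names), the three sums come out as 1, 0, 1, matching the hand computation:

```python
import math

# Channel of Example 4.2c: P(b1|a1)=1, P(b1|a2)=P(b2|a2)=1/2, P(b2|a3)=1.
Q = [[1.0, 0.0], [0.5, 0.5], [0.0, 1.0]]
p_a = [0.5, 0.0, 0.5]  # capacity-achieving input; a2 is unused
p_b = [sum(p_a[k] * Q[k][s] for k in range(3)) for s in range(2)]  # (1/2, 1/2)

def d_k(k):
    """sum_s P(b_s|a_k) log2( P(b_s|a_k) / P(b_s) )."""
    return sum(Q[k][s] * math.log2(Q[k][s] / p_b[s]) for s in range(2) if Q[k][s] > 0)

# Equals C = 1 for the inputs used, and is strictly below C for the unused a2.
print([round(d_k(k), 6) for k in range(3)])  # [1.0, 0.0, 1.0]
```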
4.3 Convexity

Theorem 4.3a: The mutual information is a concave function of the input probability, i.e.

$$\theta\, I(p^{(0)}) + (1-\theta)\, I(p^{(1)}) \le I\bigl(\theta p^{(0)} + (1-\theta) p^{(1)}\bigr) \quad \text{for } 0 \le \theta \le 1.$$
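The inequality can be spot-checked numerically; the sketch below (my own helper names, using a binary symmetric channel with ε = 0.1 as the example) evaluates both sides for one choice of mixing weight:

```python
import math

def h2(x):
    """Binary entropy in bits."""
    return 0.0 if x in (0.0, 1.0) else -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def I(p, eps=0.1):
    """Mutual information of a binary symmetric channel as a function of p = P(a1)."""
    return h2(p * (1 - eps) + (1 - p) * eps) - h2(eps)

# Concavity: theta*I(p0) + (1-theta)*I(p1) <= I(theta*p0 + (1-theta)*p1).
p0, p1, theta = 0.2, 0.9, 0.3
lhs = theta * I(p0) + (1 - theta) * I(p1)
rhs = I(theta * p0 + (1 - theta) * p1)
print(lhs <= rhs)  # True
```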
Theorem 4.3b: I(p) is a maximum (equal to the channel capacity) if, and only if, p is such that

(a) $\partial I(p)/\partial p_k = \lambda$ for every k for which $p_k > 0$;
(b) $\partial I(p)/\partial p_k \le \lambda$ for every k for which $p_k = 0$.
4.5 Uniqueness

Theorem 4.5a: The output probabilities which correspond to the capacity of the channel are unique.

Theorem 4.5b: In an input which achieves capacity with the largest number of zero probabilities, the non-zero probabilities are determined uniquely and their number does not exceed the number of output letters.
Exercises (練習)

Noiseless binary channel: A = {a1 = 0, a2 = 1}, B = {b1 = 0, b2 = 1}, with P(b1|a1) = P(b2|a2) = 1. C = 1 bit.

Noisy channel with nonoverlapping outputs: A = {a1 = 0, a2 = 1}, B = {b1 = 1, b2 = 2, b3 = 3, b4 = 4}, with

$$P(b_1 \mid a_1) = P(b_2 \mid a_1) = \tfrac{1}{2}, \qquad P(b_3 \mid a_2) = \tfrac{2}{3}, \qquad P(b_4 \mid a_2) = \tfrac{1}{3}.$$

Since the outputs of the two inputs do not overlap, the input is determined by the output, and C = 1 bit.
Noisy typewriter: A = {a1 = 0, a2 = 1, a3 = 2, a4 = 3}, B = {b1 = 0, b2 = 1, b3 = 2, b4 = 3}. Each input letter is received either unchanged or as the next letter (mod 4), each with probability 1/2. C = log(4/2) = 1 bit.

With 26 letters, C = log 13 bits.
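A sketch confirming C = 1 bit for the 4-letter noisy typewriter under the uniform input (the function name `mi` is mine):

```python
import math

def mi(p_a, Q):
    """Mutual information I(A,B) in bits for input distribution p_a and channel matrix Q."""
    n, r = len(p_a), len(Q[0])
    p_b = [sum(p_a[k] * Q[k][s] for k in range(n)) for s in range(r)]
    return sum(p_a[k] * Q[k][s] * math.log2(Q[k][s] / p_b[s])
               for k in range(n) for s in range(r) if p_a[k] > 0 and Q[k][s] > 0)

# 4-letter noisy typewriter: each letter is received as itself or the next letter (mod 4),
# each with probability 1/2; the uniform input achieves C = log2(4/2).
n = 4
Q = [[0.5 if s in (k, (k + 1) % n) else 0.0 for s in range(n)] for k in range(n)]
print(mi([1 / n] * n, Q))  # 1.0
```

Replacing `n = 4` with `n = 26` gives log2(13) bits, the 26-letter value quoted above.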
Binary symmetric channel: A = {a1 = 0, a2 = 1}, B = {b1 = 0, b2 = 1}, with crossover probability ε.

I(A,B) = H(B) - H(B|A) ≤ 1 - H(B|A), with equality for the uniform input, so

$$C = 1 + \varepsilon\log\varepsilon + (1-\varepsilon)\log(1-\varepsilon).$$
Binary erasure channel: A = {a1 = 0, a2 = 1}, B = {b1 = 0, b2 = 1, b3 = 2}, with erasure probability ε. With p = P(a1),

$$I(A,B) = H(B) - H(B \mid A) = -p(1-\varepsilon)\log\bigl(p(1-\varepsilon)\bigr) - \varepsilon\log\varepsilon - (1-p)(1-\varepsilon)\log\bigl((1-p)(1-\varepsilon)\bigr) - H(B \mid A).$$

At p = 1/2, I(A,B) = 1 - ε, so C = 1 - ε.
Z channel: A = {a1 = 0, a2 = 1}, B = {b1 = 0, b2 = 1}, with

$$P(b_1 \mid a_1) = 1, \qquad P(b_1 \mid a_2) = P(b_2 \mid a_2) = \tfrac{1}{2}.$$

The capacity condition requires the same value of $\sum_s P(b_s \mid a_k)\log\frac{P(b_s \mid a_k)}{P(b_s)}$ for k = 1 and k = 2:

$$\log\frac{1}{P(b_1)} = \frac{1}{2}\log\frac{1/2}{P(b_1)} + \frac{1}{2}\log\frac{1/2}{P(b_2)},$$

whence $P(b_1) = 4P(b_2)$. With $P(b_1) + P(b_2) = 1$ this gives $P(b_1) = 4/5$, $P(b_2) = 1/5$, and

$$C = \log\frac{1}{P(b_1)} = \log\frac{5}{4} = \log 5 - 2 \text{ bits}.$$
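The Z-channel result can be checked by maximizing I(A,B) over the input directly (the helper `z_mi` is my name):

```python
import math

def z_mi(p0):
    """I(A,B) for the Z channel: input 0 -> b1 surely; input 1 -> b1 or b2, each w.p. 1/2."""
    p1 = 1 - p0
    p_b = [p0 + p1 / 2, p1 / 2]                      # output distribution
    h_b = -sum(t * math.log2(t) for t in p_b if t > 0)
    return h_b - p1 * 1.0                            # H(B) - H(B|A); H(B|a2) = 1 bit

# Maximize over the input and compare with C = log2(5) - 2.
best = max(z_mi(p0 / 1000) for p0 in range(1001))
print(round(best, 4), round(math.log2(5) - 2, 4))  # both ≈ 0.3219
```

The grid maximum sits at P(a1) = 3/5, which indeed yields the output probabilities P(b1) = 4/5, P(b2) = 1/5.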
4.6 Transmission properties

I(A, B) = H(A) - H(A|B)

Shannon's theorem I: If H(A) ≤ C, there is a code such that transmission over the channel is possible with an arbitrarily small number of errors, i.e. the equivocation is arbitrarily small.

Shannon's theorem II: If H(A) > C, there is no code for which the equivocation is less than H(A) - C, but there is one for which the equivocation is less than H(A) - C + ε, where ε is an arbitrary positive quantity.

The equivocation H(A|B) is a measure of the uncertainty as to what was sent when observations are made on the output, and so assesses the effect of noise during transmission.
4.7 Channels in cascade

A → Channel 1 → B → Channel 2 → C

$$P(b) = \sum_{a} P(b \mid a)\,P(a), \qquad P(c) = \sum_{b} P(c \mid b)\,P(b) = \sum_{a}\sum_{b} P(c \mid b)\,P(b \mid a)\,P(a).$$
Example: two binary symmetric channels with ε = 1/4 in cascade.

A → BSC(1/4) → B → BSC(1/4) → C

$$\begin{pmatrix} 3/4 & 1/4 \\ 1/4 & 3/4 \end{pmatrix}\begin{pmatrix} 3/4 & 1/4 \\ 1/4 & 3/4 \end{pmatrix} = \begin{pmatrix} 10/16 & 6/16 \\ 6/16 & 10/16 \end{pmatrix} = \begin{pmatrix} 5/8 & 3/8 \\ 3/8 & 5/8 \end{pmatrix},$$

so the cascade A → C is a binary symmetric channel with crossover probability 3/8.
For the cascade A → C, a binary symmetric channel with crossover probability 3/8, the capacity condition

$$\lambda + \log e = \frac{5}{8}\log\frac{5/8}{P(b_1)} + \frac{3}{8}\log\frac{3/8}{P(b_2)} = \frac{3}{8}\log\frac{3/8}{P(b_1)} + \frac{5}{8}\log\frac{5/8}{P(b_2)}$$

is satisfied by $P(a_1) = P(a_2) = 1/2$, giving $P(b_1) = P(b_2) = 1/2$ and

$$C = \log 2 - H\!\left(\frac{5}{8}, \frac{3}{8}\right) = 1 + \frac{5}{8}\log\frac{5}{8} + \frac{3}{8}\log\frac{3}{8}.$$
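The cascade computation is just a matrix product; a sketch (the helper `compose` is my name):

```python
import math

def compose(P, Q):
    """Transition matrix of two channels in cascade: (PQ)[a][c] = sum_b P[a][b] * Q[b][c]."""
    return [[sum(P[a][b] * Q[b][c] for b in range(len(Q)))
             for c in range(len(Q[0]))] for a in range(len(P))]

bsc = [[0.75, 0.25], [0.25, 0.75]]
casc = compose(bsc, bsc)
print(casc)  # [[0.625, 0.375], [0.375, 0.625]] -- a BSC with crossover 3/8

# Capacity of the cascade: C = 1 + (5/8)log2(5/8) + (3/8)log2(3/8)
C = 1 + 0.625 * math.log2(0.625) + 0.375 * math.log2(0.375)
print(round(C, 6))
```

Note that the cascade of two BSC(1/4) channels has a much smaller capacity than either channel alone.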
Ch4: Channel4. 7 Channels in cascade
The transition probabilities pjk of an infinite cascade are given by
0 2
1
2
12
1 0
2
12
1
2
1 0
3
1
3
1
3
13
1
3
1
3
13
1
3
1
3
1
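Cascading many copies of this 3-letter channel numerically shows the convergence to the uniform matrix (the helper `compose` is my name):

```python
def compose(P, Q):
    """Transition matrix of two channels in cascade."""
    n = len(P)
    return [[sum(P[i][k] * Q[k][j] for k in range(n)) for j in range(n)] for i in range(n)]

# 3-letter channel with zero diagonal and 1/2 off the diagonal.
P = [[0.0, 0.5, 0.5], [0.5, 0.0, 0.5], [0.5, 0.5, 0.0]]

M = P
for _ in range(50):   # cascade 51 copies in total
    M = compose(M, P)
print([[round(x, 6) for x in row] for row in M])  # every entry tends to 1/3
```

The deviation from 1/3 shrinks like (1/2)^m with the number of stages m, so the limit carries no information.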