Distributed Source Coding
Advisor: 楊士萱
Student: 李桐照
Talk Outline
• Introduction of DSC
• Introduction of SWCQ
• Conclusion
Introduction of DSC
Distributed Source Coding
• Compression of two or more correlated sources
• The sources do not communicate with each other (hence distributed coding)
• Decoding is done jointly (say at the base station)

[Block diagram: Source X → Source Encoder X; Source Y → Source Encoder Y; both encoded streams → Joint Decoder → reconstructed X, Y]
Introduction of SWCQ
Review of Information Theory
Information
Definition (DMS): the self-information of an outcome x is I(P(x)) = log 1/P(x) = −log P(x). If we use base-2 logarithms, the resulting unit of information is called a bit.

Entropy
Definition (DMS): the entropy H(X) of a discrete random variable X is defined by
H(X) = H(p) = −Σ_{x∈X} p(x) log₂ p(x)
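As a concrete illustration (a minimal sketch of my own, not from the slides; the function name entropy and the example distributions are hypothetical):

import math

def entropy(p):
    # H(p) = -sum_x p(x) log2 p(x), in bits; terms with p(x) = 0 contribute 0
    return -sum(px * math.log2(px) for px in p if px > 0)

print(entropy([0.5, 0.5]))  # 1.0 bit: a fair coin flip
print(entropy([0.9, 0.1]))  # ~0.469 bits: a biased coin is more predictable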
Introduction of SWCQ
Review of Information Theory
Joint Entropy
Definition (DMS): the joint entropy of two RVs X and Y is the amount of information needed on average to specify both of their values:
H(X,Y) = −Σ_{x∈X} Σ_{y∈Y} p(x,y) log p(x,y)
Conditional Entropy
Definition (DMS): the conditional entropy of a RV Y given another X expresses how much extra information one still needs to supply on average to communicate Y, given that the other party knows X:
H(Y|X) = Σ_{x∈X} p(x) H(Y|X=x)
       = −Σ_{x∈X} Σ_{y∈Y} p(x) p(y|x) log p(y|x)
       = −Σ_{x∈X} Σ_{y∈Y} p(x,y) log p(y|x)
       = −E[log p(Y|X)]
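The same definitions in code (my own sketch, reusing entropy and math from the block above; the 2×2 joint pmf is a made-up example). It uses the chain rule H(Y|X) = H(X,Y) − H(X), which follows from the identities on the next slide:

def joint_entropy(pxy):
    # H(X,Y) = -sum_{x,y} p(x,y) log2 p(x,y)
    return -sum(p * math.log2(p) for row in pxy for p in row if p > 0)

def conditional_entropy(pxy):
    # H(Y|X) = H(X,Y) - H(X); p(x) is the row marginal of p(x,y)
    px = [sum(row) for row in pxy]
    return joint_entropy(pxy) - entropy(px)

pxy = [[0.4, 0.1],   # rows index x, columns index y
       [0.1, 0.4]]
print(joint_entropy(pxy))        # H(X,Y) ~ 1.722 bits
print(conditional_entropy(pxy))  # H(Y|X) ~ 0.722 bits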
Introduction of SWCQ
Review of Information Theory
Mutual Information
Definition (DMS): I(X;Y) is the mutual information between X and Y. It is the reduction in uncertainty of one RV due to knowing the other, or the amount of information one RV contains about the other:
I(X;Y) = H(X) − H(X|Y) = H(Y) − H(Y|X)
H(X,Y) = H(X) + H(Y|X) = H(Y) + H(X|Y)
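Combining the two identities above gives I(X;Y) = H(X) + H(Y) − H(X,Y), which is easy to check numerically (again a sketch of my own, reusing the helpers defined earlier):

def mutual_information(pxy):
    # I(X;Y) = H(X) + H(Y) - H(X,Y)
    px = [sum(row) for row in pxy]
    py = [sum(col) for col in zip(*pxy)]
    return entropy(px) + entropy(py) - joint_entropy(pxy)

# For the pmf above: H(X) = H(Y) = 1 and H(X,Y) ~ 1.722, so knowing
# one variable removes ~0.278 bits of uncertainty about the other.
print(mutual_information(pxy))  # ~0.278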
Introduction of SWCQ
Review of Information Theory

[Venn diagram: H(X) and H(Y) drawn as overlapping circles; the overlap is I(X;Y), and the non-overlapping parts are H(X|Y) and H(Y|X).]
Introduction of SWCQ
Review of Data Compression
Transform Coding:
Take a sequence of inputs and transform it into another sequence in which most of the information is contained in only a few elements. Then, by discarding the elements that do not contain much information, we obtain a large amount of compression (see the sketch below).

Nested quantization: quantization with side information
Slepian-Wolf coding: entropy coding with side information
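A minimal toy sketch of the transform-coding idea (my own example, not from the slides): a 2-point sum/difference transform packs most of the energy of two correlated samples into one coefficient, so the other can be coarsely quantized or discarded.

def transform(x0, x1):
    # "average" carries most of the information; "detail" is small
    return (x0 + x1) / 2, (x0 - x1) / 2

def inverse(avg, det):
    return avg + det, avg - det

x0, x1 = 100, 102            # two highly correlated samples
avg, det = transform(x0, x1)
print(avg, det)               # 101.0 -1.0
print(inverse(avg, 0))        # (101.0, 101.0): detail discarded, small error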
Introduction of SWCQ

Classic Source Coding: KLT → Quantization → Entropy Coding

DSC:
• KLT → Quantization → Syndrome-Based Entropy Coding
• KLT → Nested Quantization → Slepian-Wolf Coding (SWCQ)
Introduction of SWCQ
A Case of SWC
[Block diagram: Source X → Source Encoder X (rate Rx); Source Y → Source Encoder Y (rate Ry); the encoder outputs Z and W → Joint Source Decoder → reconstructed X, Y]
Introduction of SWCQ
A Case of SWC
Joint Encoding (Y is available when coding X)
• Code Y at Ry ≥ H(Y); use Y to predict X and then code the difference at Rx ≥ H(X|Y)
• Altogether, Rx + Ry ≥ H(X|Y) + H(Y) = H(X,Y)
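On the toy joint pmf from the earlier sketches (my own example values, reusing the helper functions defined above), these rates work out as follows; note the total equals H(X,Y):

py = [sum(col) for col in zip(*pxy)]
Ry = entropy(py)                  # code Y at H(Y) = 1.0 bit
Rx = joint_entropy(pxy) - Ry      # residual rate H(X|Y) ~ 0.722 bits
print(Rx, Ry, Rx + Ry)            # total ~ 1.722 = H(X,Y)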
Introduction of SWCQ
A Case of SWC
Distributed Encoding (Y is not available when coding X)
• What is the minimum rate needed to code X in this case?
• SW Theorem: still H(X|Y)
⇒ Separate encoding is as efficient as joint encoding (see the syndrome sketch below).
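A minimal sketch of how syndrome-based SW coding achieves this (my own toy example, assuming the classic binary setup: X is a uniform 3-bit word and the side information Y differs from X in at most one position, so H(X|Y) = 2 bits). X is compressed to its 2-bit syndrome with respect to the (3,1) repetition code, and the decoder uses Y to pick the right member of the coset:

H = [(1, 1, 0), (0, 1, 1)]   # parity-check matrix of the code {000, 111}

def syndrome(x):
    return tuple(sum(h * b for h, b in zip(row, x)) % 2 for row in H)

def sw_decode(s, y):
    # pick the word with syndrome s that is closest to the side information y
    words = [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]
    coset = [w for w in words if syndrome(w) == s]
    return min(coset, key=lambda w: sum(wi != yi for wi, yi in zip(w, y)))

x, y = (1, 0, 1), (1, 1, 1)  # y differs from x in exactly one position
s = syndrome(x)               # only 2 bits are transmitted instead of 3
print(s, sw_decode(s, y))     # (1, 1) (1, 0, 1): x recovered exactly

Each coset contains two words that differ in all three positions, so at most one of them can be within Hamming distance 1 of Y; that is why the decoder never errs under the assumed correlation.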
Introduction of SWCQ
A Case of SWC
[Block diagrams: classic source coding, where Source X and Source Y are each encoded (at rates Rx and Ry) and decoded independently, versus distributed source coding, where the two encoders' outputs go to a Joint Source Decoder that reconstructs X, Y]
Our Focus
R_min(CSC) = H(X) + H(Y)
R_min(DSC) = H(X,Y)
R_min(CSC) ≥ R_min(DSC)
Introduction of SWCQ
The SW Rate Region (for two sources)
[Figure: achievable rates for distributed compression — the Slepian-Wolf rate region in the (Rx, Ry) plane, bounded by Rx ≥ H(X|Y), Ry ≥ H(Y|X), and Rx + Ry ≥ H(X,Y); the corner point (H(X|Y), H(Y)) corresponds to compression of X with side information Y at the joint decoder.]
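For the toy pmf used in the earlier sketches, the boundaries of this region can be computed directly (reusing my earlier helper functions; the numbers hold only for that example):

HX = entropy([sum(row) for row in pxy])
HY = entropy([sum(col) for col in zip(*pxy)])
HXY = joint_entropy(pxy)
print("Rx >=", HXY - HY)    # H(X|Y) ~ 0.722
print("Ry >=", HXY - HX)    # H(Y|X) ~ 0.722
print("Rx + Ry >=", HXY)    # H(X,Y) ~ 1.722
# corner points of the region: (H(X|Y), H(Y)) and (H(X), H(Y|X))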
Conclusion:
For the compression of two or more correlated sources, DSC outperforms CSC: separate encoders with a joint decoder can reach the joint-encoding rate H(X,Y), whereas CSC requires H(X) + H(Y).