Computing and Communications
2. Information Theory - Entropy

Ying Cui
Department of Electronic Engineering
Shanghai Jiao Tong University, China
2018, Autumn
Outline
• Entropy
• Joint entropy and conditional entropy
• Relative entropy and mutual information
• Relationship between entropy and mutual information
• Chain rules for entropy, relative entropy and mutual information
• Jensen’s inequality and its consequences
Reference
• Elements of Information Theory, T. M. Cover and J. A. Thomas, Wiley
OVERVIEW
Information Theory
• Information theory answers two fundamental questions in communication theory
– what is the ultimate data compression? -- entropy H
– what is the ultimate transmission rate of communication? -- channel capacity C
• Information theory is considered a subset of communication theory
Information Theory
• Information theory has made fundamental contributions to other fields
A Mathematical Theory of Commun.
• In 1948, Shannon published “A Mathematical Theory of Communication”, founding information theory
• Shannon made two major modifications that have had a huge impact on communication design
– the source and channel are modeled probabilistically
– bits became the common currency of communication
A Mathematical Theory of Commun.
• Shannon proved the following three theorems
– Theorem 1. The minimum compression rate of the source is its entropy rate H
– Theorem 2. The maximum reliable rate over the channel is its mutual information I
– Theorem 3. End-to-end reliable communication happens if and only if H < I, i.e., there is no loss in performance by using a digital interface between source and channel coding
• Impacts of Shannon’s results
– after almost 70 years, all communication systems are designed based on the principles of information theory
– the limits not only serve as benchmarks for evaluating communication schemes, but also provide insights on designing good ones
– basic information-theoretic limits in Shannon’s theorems have now been successfully achieved using efficient algorithms and codes
ENTROPY
Definition
• Entropy is a measure of the uncertainty of a r.v.
• Consider a discrete r.v. X with alphabet 𝒳 and p.m.f. p(x) = Pr[X = x], x ∈ 𝒳; its entropy is H(X) = -Σ_{x∈𝒳} p(x) log p(x)
– log is to the base 2, and entropy is expressed in bits
• e.g., the entropy of a fair coin toss is 1 bit
– define 0 log 0 = 0, since x log x → 0 as x → 0
• adding terms of zero probability does not change the entropy
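As a quick numerical illustration of this definition (a minimal sketch; the function name is my own):

```python
import math

def entropy(pmf):
    """H(X) = -sum p(x) log2 p(x), in bits; terms with p(x) = 0 are dropped (0 log 0 = 0)."""
    return -sum(p * math.log2(p) for p in pmf if p > 0)

print(entropy([0.5, 0.5]))                # fair coin toss: 1.0 bit
print(entropy([0.25, 0.25, 0.25, 0.25]))  # uniform over 4 symbols: 2.0 bits
```

Dropping zero-probability terms inside the generator implements the 0 log 0 = 0 convention directly.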
Properties
– entropy is nonnegative: H(X) ≥ 0
– the base of the log can be changed: H_b(X) = (log_b a) H_a(X)
Example
• Consider a binary r.v. X with Pr[X = 1] = p; then H(X) = H(p) = -p log p - (1-p) log(1-p)
– H(X) = 1 bit when p = 0.5
• maximum uncertainty
– H(X) = 0 bits when p = 0 or 1
• minimum uncertainty
– H(p) is a concave function of p
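The binary entropy function described above can be checked numerically (a minimal sketch; names are my own):

```python
import math

def binary_entropy(p):
    """H(p) = -p log2 p - (1-p) log2 (1-p), with the convention 0 log 0 = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(0.5))  # maximum uncertainty: 1.0 bit
print(binary_entropy(0.0))  # minimum uncertainty: 0.0 bits
# concavity: the value at a midpoint exceeds the average of the endpoint values
print(binary_entropy(0.3) > 0.5 * (binary_entropy(0.1) + binary_entropy(0.5)))  # True
```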
Example
JOINT ENTROPY AND CONDITIONAL ENTROPY
Joint Entropy
• Joint entropy is a measure of the uncertainty of a pair of r.v.s
• Consider a pair of discrete r.v.s (X, Y) with alphabet 𝒳 × 𝒴 and p.m.f.s p(x) = Pr[X = x], x ∈ 𝒳 and p(y) = Pr[Y = y], y ∈ 𝒴; the joint entropy is H(X, Y) = -Σ_{x,y} p(x, y) log p(x, y)
Conditional Entropy
• Conditional entropy of a r.v. (Y) given another r.v. (X)
– expected value of the entropies of the conditional distributions, averaged over the conditioning r.v.: H(Y|X) = Σ_x p(x) H(Y|X = x) = -Σ_{x,y} p(x, y) log p(y|x)
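The conditional entropy, together with the chain rule H(X, Y) = H(X) + H(Y|X) stated on the following slides, can be checked numerically (a minimal sketch; the joint p.m.f. is my own toy example, not from the slides):

```python
import math

# toy joint p.m.f. on {0,1} x {0,1} (my own example)
p_xy = {(0, 0): 0.5, (0, 1): 0.25, (1, 0): 0.125, (1, 1): 0.125}

def H(pmf):
    """Entropy in bits of a p.m.f. stored as a dict of probabilities."""
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

# marginal p(x)
p_x = {}
for (x, y), p in p_xy.items():
    p_x[x] = p_x.get(x, 0.0) + p

# H(Y|X) = -sum_{x,y} p(x,y) log2 p(y|x)
H_Y_given_X = -sum(p * math.log2(p / p_x[x]) for (x, y), p in p_xy.items() if p > 0)

print(H(p_xy))                                        # H(X,Y) = 1.75 bits
print(abs(H(p_xy) - (H(p_x) + H_Y_given_X)) < 1e-9)   # chain rule holds: True
```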
Chain Rule
Chain Rule
Example
Example
RELATIVE ENTROPY AND MUTUAL INFORMATION
Relative Entropy
• Relative entropy is a measure of the “distance” between two distributions
• The relative entropy between two p.m.f.s p(x) and q(x) is D(p||q) = Σ_{x∈𝒳} p(x) log (p(x)/q(x))
– convention: 0 log (0/0) = 0, 0 log (0/q) = 0 and p log (p/0) = ∞
– if there is any x ∈ 𝒳 such that p(x) > 0 and q(x) = 0, then D(p||q) = ∞
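A minimal sketch of this definition, including the conventions above (function name and example distributions are my own):

```python
import math

def kl_divergence(p, q):
    """D(p||q) = sum p(x) log2 (p(x)/q(x)), with the slide's conventions."""
    d = 0.0
    for px, qx in zip(p, q):
        if px == 0:
            continue            # convention: 0 log (0/q) = 0
        if qx == 0:
            return math.inf     # p(x) > 0 and q(x) = 0 makes D(p||q) infinite
        d += px * math.log2(px / qx)
    return d

p = [0.5, 0.5]
q = [0.75, 0.25]
print(kl_divergence(p, q) > 0)                        # True: "distance" from q
print(kl_divergence(p, q) == kl_divergence(q, p))     # False: not symmetric
```

The asymmetry is why relative entropy is only a “distance” in quotes: it is not a true metric.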
Example
Mutual Information
• Mutual information is a measure of the amount of information that one r.v. contains about another r.v.
• For r.v.s X and Y with joint p.m.f. p(x, y), I(X;Y) = Σ_{x,y} p(x, y) log (p(x, y)/(p(x)p(y))) = D(p(x, y) || p(x)p(y))
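Mutual information as the relative entropy between the joint p.m.f. and the product of marginals can be checked numerically; this sketch (my own toy joint, not from the slides) also verifies the identity I(X;Y) = H(X) + H(Y) - H(X,Y) covered in the next section:

```python
import math

p_xy = {(0, 0): 0.5, (0, 1): 0.25, (1, 0): 0.125, (1, 1): 0.125}  # toy joint p.m.f.

def H(pmf):
    """Entropy in bits of a p.m.f. stored as a dict of probabilities."""
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

# marginals p(x) and p(y)
p_x, p_y = {}, {}
for (x, y), p in p_xy.items():
    p_x[x] = p_x.get(x, 0.0) + p
    p_y[y] = p_y.get(y, 0.0) + p

# I(X;Y) = D( p(x,y) || p(x)p(y) )
I = sum(p * math.log2(p / (p_x[x] * p_y[y])) for (x, y), p in p_xy.items() if p > 0)

print(I >= 0)                                             # True
print(abs(I - (H(p_x) + H(p_y) - H(p_xy))) < 1e-9)        # identity holds: True
```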
RELATIONSHIP BETWEEN ENTROPY AND MUTUAL INFORMATION
Relation
Proof
chain rule for entropy
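The relations this slide establishes, reconstructed in the standard form from the reference text:

```latex
I(X;Y) = H(X) - H(X \mid Y) = H(Y) - H(Y \mid X) = H(X) + H(Y) - H(X,Y),
\qquad I(X;X) = H(X)
```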
Illustration
CHAIN RULES FOR ENTROPY, RELATIVE ENTROPY, AND MUTUAL INFORMATION
Chain Rule for Entropy
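The chain rule for entropy, reconstructed in its standard form:

```latex
H(X_1, X_2, \ldots, X_n) = \sum_{i=1}^{n} H(X_i \mid X_{i-1}, \ldots, X_1)
```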
Proof
Alternative Proof
Chain Rule for Mutual Information
Proof
chain rule for entropy; chain rule for conditional entropy
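The chain rule for mutual information proved here, reconstructed in standard form:

```latex
I(X_1, X_2, \ldots, X_n; Y) = \sum_{i=1}^{n} I(X_i; Y \mid X_{i-1}, \ldots, X_1)
```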
Chain Rule for Relative Entropy
relative entropy between conditional p.m.f.s
Proof
36
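The chain rule for relative entropy proved here, reconstructed in standard form, where the second term is the conditional relative entropy between the conditional p.m.f.s:

```latex
D(p(x,y) \,\|\, q(x,y)) = D(p(x) \,\|\, q(x)) + D(p(y \mid x) \,\|\, q(y \mid x))
```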
JENSEN'S INEQUALITY AND ITS CONSEQUENCES
Convex & Concave Functions
• Examples:
– convex functions: x², |x|, eˣ, x log x (for x ≥ 0)
– concave functions: log x and √x (for x ≥ 0)
– linear functions ax + b are both convex and concave
Convex & Concave Functions
Jensen’s Inequality
Information Inequality
Proof
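The information inequality stated and proved on these slides, reconstructed in standard form:

```latex
D(p \,\|\, q) \ge 0, \quad \text{with equality if and only if } p(x) = q(x) \text{ for all } x
```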
Nonnegativity of Mutual Information
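This consequence follows by applying the information inequality to I(X;Y) = D(p(x,y) || p(x)p(y)) (reconstructed):

```latex
I(X;Y) = D(p(x,y) \,\|\, p(x)p(y)) \ge 0,
\quad \text{with equality iff } X \text{ and } Y \text{ are independent}
```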
Max. Entropy Dist. - Uniform Dist.
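The bound this slide establishes, reconstructed in standard form, where u denotes the uniform p.m.f. on the alphabet 𝒳:

```latex
H(X) \le \log |\mathcal{X}|, \quad \text{since } D(p \,\|\, u) = \log|\mathcal{X}| - H(X) \ge 0,
\quad \text{with equality iff } X \text{ is uniform}
```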
Conditioning Reduces Entropy
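This consequence follows from the nonnegativity of mutual information (reconstructed); note it holds on average over Y, while a particular H(X | Y = y) can exceed H(X):

```latex
H(X \mid Y) \le H(X), \quad \text{since } I(X;Y) = H(X) - H(X \mid Y) \ge 0
```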
Independence Bound on Entropy
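The independence bound, reconstructed in standard form; it follows by combining the chain rule for entropy with the fact that conditioning reduces entropy:

```latex
H(X_1, X_2, \ldots, X_n) \le \sum_{i=1}^{n} H(X_i),
\quad \text{with equality iff the } X_i \text{ are independent}
```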
Summary
Summary
Summary
[email protected]/Personal/yingcui