Shannon Entropy Basics

A Mathematical Theory of Communication by Claude Shannon

    Gowshigan.S

    Asian Institute of Technology Thailand

    Information Theory and Coding 20000471

    07.04.2014


    Overview

    1 Introduction

    2 Communication system

3 Part 1: Discrete Noiseless Systems
  - The Discrete Noiseless Channel
  - The Discrete Source of Information
  - Graphical Representation of Markov Process
  - Ergodic and Mixed Sources
  - Choice, Uncertainty and Entropy
  - The Entropy of an Information Source
  - The Discrete Channel with Noise

    4 Second Section


    Introduction

Claude Elwood Shannon was an American mathematician, electronic engineer, and cryptographer, known as the father of information theory. Shannon is famous for having founded information theory with a landmark paper that he published in 1948.


    Communication System


Communication System - Continued

1 Information source: produces messages or sequences of messages to be communicated to the receiving terminal. Messages may take several forms:

  - A sequence of letters, as in a telegraph or Teletype system
  - A single function of time f(t), as in radio or telephony
  - A function of time and other variables, as in black-and-white television
  - Two or more functions of time, say f(t), g(t), h(t)
  - Several functions of several variables
  - Various combinations also occur, for example in television with an associated audio channel

2 A transmitter: operates on the message in some way to produce a signal suitable for transmission over the channel, e.g. vocoder systems, television, frequency modulation.


Communication System - Continued

1 Channel: the medium used to transmit the signal from transmitter to receiver.

2 Receiver: ordinarily performs the inverse of the operation done by the transmitter, reconstructing the message from the signal.

3 Destination: the person or entity for whom (or which) the message is intended.


The Discrete Source of Information

The logarithm of the number of possible signals in a discrete channel increases linearly with time.
The capacity to transmit information can be specified by giving this rate of increase.
How is an information source to be described mathematically?
How much information, in bits per second, is produced by a given source?

Frequent letters: shortest sequences

Infrequent letters: longest sequences


The Discrete Source of Information - Continued


The Discrete Noiseless Channel - Continued

Let us define stochastic process and discrete source:

1 A physical system, or a mathematical model of a system, which produces symbols according to certain probabilities is known as a stochastic process.

2 Conversely, any stochastic process which produces a discrete sequence of symbols chosen from a finite set may be considered a discrete source.


    Markov Process and Graphical Representation

Consider Example (B):

Using the same five letters, let the probabilities be .4, .1, .2, .2, .1, respectively, with successive choices independent. A typical message from this source is then:
A A A C D C B D C E A A D A D A C E D A E A D C A B E D A D D C E C A A A A A D.

    Corresponding Markov Process
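As a rough illustration (not part of the original slides), the independent-choice source of Example (B) can be simulated directly; the letters and probabilities below are the ones given above, everything else is an assumed sketch.

    import random

    # Example (B): five letters, successive choices independent.
    letters = ["A", "B", "C", "D", "E"]
    weights = [0.4, 0.1, 0.2, 0.2, 0.1]

    def sample_message(length, seed=None):
        """Draw a typical message from the independent-choice source."""
        rng = random.Random(seed)
        return " ".join(rng.choices(letters, weights=weights, k=length))

    print(sample_message(40))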


Markov Process and Graphical Representation - Continued


    Ergodic And Mixed Sources

A Simple Introduction to Ergodic Processes

Among the possible discrete Markov processes there is a group with special properties of significance in communication theory; this special class is called ergodic processes, and the corresponding sources are called ergodic sources.
In an ergodic process, every sequence produced by the process is the same in statistical properties, and this is what distinguishes it from other processes.


    Choice, Uncertainty And Entropy

Let us define a quantity which will measure how much information is produced by a process. Suppose we have a set of possible events whose probabilities of occurrence are p_1, p_2, ..., p_n.

If there is such a measure, say H(p_1, p_2, ..., p_n), it is reasonable to require the following properties of it:

1 H should be continuous in the p_i.

2 If all the p_i are equal, p_i = 1/n, then H should be a monotonic increasing function of n. With equally likely events there is more choice, or uncertainty, when there are more possible events.


    Choice, Uncertainty And Entropy

3 If a choice is broken down into two successive choices, the original H should be the weighted sum of the individual values of H. The meaning of this is illustrated in Fig. 6 of Shannon's paper: at the left we have three possibilities.
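A concrete instance of this decomposition property (the example behind Shannon's Fig. 6, with probabilities 1/2, 1/3, 1/6): choosing among the three outcomes directly, or first choosing between two halves and then, with probability 1/2, between the remaining two outcomes, must give the same H:

H\left(\tfrac{1}{2},\tfrac{1}{3},\tfrac{1}{6}\right) = H\left(\tfrac{1}{2},\tfrac{1}{2}\right) + \tfrac{1}{2}\,H\left(\tfrac{2}{3},\tfrac{1}{3}\right)

Both sides evaluate to roughly 1.46 bits.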


    Choice, Uncertainty And Entropy

The only H satisfying the three above assumptions is of the form
H = -K \sum_{i=1}^{n} p_i \log p_i
where K is a positive constant; K merely amounts to a choice of a unit of measure.

The entropy in the case of two probabilities p and q = 1 - p, namely
H = -(p \log p + q \log q),
is plotted as a function of p.
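A minimal sketch (not from the slides) evaluating this measure numerically; entropy() implements H = -K \sum p_i \log p_i with K = 1 and base-2 logarithms, and binary_entropy(p) is the two-outcome curve described above.

    import math

    def entropy(probs, base=2.0):
        """H = -sum(p_i * log p_i), with the convention 0 * log 0 = 0."""
        return -sum(p * math.log(p, base) for p in probs if p > 0)

    def binary_entropy(p):
        """Entropy of the two probabilities p and q = 1 - p."""
        return entropy([p, 1.0 - p])

    # The curve vanishes at p = 0 and p = 1 and peaks at 1 bit when p = 1/2.
    for p in (0.0, 0.1, 0.5, 0.9, 1.0):
        print(p, round(binary_entropy(p), 4))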


Choice, Uncertainty And Entropy - Continued

The quantity H plotted in the previous graph has some interesting properties:

1 H = 0 if and only if all the p_i but one are zero, this one having the value unity. Thus only when we are certain of the outcome does H vanish; otherwise H is positive.

2 For a given n, H is a maximum and equal to log n when all the p_i are equal (i.e., p_i = 1/n). This is also intuitively the most uncertain situation.

3 Suppose there are two events, x and y, with m possibilities for the first and n for the second. Let p(i, j) be the probability of the joint occurrence of i for the first and j for the second. The entropy of the joint event is
H(x, y) = -\sum_{i,j} p(i, j) \log p(i, j)
and it is easily shown that H(x, y) \le H(x) + H(y).

4 Any change toward equalization of the probabilities p_1, p_2, ..., p_n increases H.
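A quick numerical check of properties 2-4 (a sketch, not part of the slides; the distributions below are arbitrary examples):

    import math
    from itertools import product

    def entropy(probs):
        """H = -sum(p * log2 p), skipping zero probabilities."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # Property 2: for n = 4, H peaks at log2(4) = 2 bits for the uniform distribution.
    print(entropy([0.25] * 4), entropy([0.7, 0.1, 0.1, 0.1]))

    # Property 3: H(x, y) <= H(x) + H(y); equality holds for independent events.
    px, py = [0.5, 0.5], [0.8, 0.2]
    joint = [a * b for a, b in product(px, py)]
    print(entropy(joint), entropy(px) + entropy(py))

    # Property 4: equalizing the probabilities increases H.
    print(entropy([0.9, 0.1]) < entropy([0.6, 0.4]))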


The Entropy of an Information Source

Consider a discrete source of the finite-state type considered above. For each possible state i there will be a set of probabilities p_i(j) of producing the various possible symbols j.

Thus there is an entropy H_i for each state. The entropy of the source is defined as the average of these H_i, weighted in accordance with the probability of occurrence of the states in question:
H = \sum_i P_i H_i = -\sum_{i,j} P_i p_i(j) \log p_i(j)

    This is the entropy of the source per symbol of text.
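A sketch of this weighted average for a hypothetical two-state source (the state probabilities P_i and per-state symbol probabilities p_i(j) below are made up for illustration, not taken from the slides):

    import math

    def entropy(probs):
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # Hypothetical source: in state i the source emits symbol j with probability p_i(j).
    p = {0: [0.5, 0.5],     # symbol probabilities in state 0
         1: [0.9, 0.1]}     # symbol probabilities in state 1
    P = {0: 0.6, 1: 0.4}    # probabilities of occurrence of the states

    # H = sum_i P_i H_i = -sum_{i,j} P_i p_i(j) log p_i(j)
    H = sum(P[i] * entropy(p[i]) for i in p)
    print(H, "bits per symbol")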


    Discussion

As a simple example of some of these results, consider a source which produces a sequence of letters chosen from among A, B, C, D with probabilities 1/2, 1/4, 1/8, 1/8, successive symbols being chosen independently. We have
H = -\left( \tfrac{1}{2}\log_2\tfrac{1}{2} + \tfrac{1}{4}\log_2\tfrac{1}{4} + \tfrac{2}{8}\log_2\tfrac{1}{8} \right) = \tfrac{7}{4} \text{ bits per symbol}
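A one-line check of this arithmetic (a sketch, assuming base-2 logarithms):

    import math

    probs = {"A": 1/2, "B": 1/4, "C": 1/8, "D": 1/8}
    H = -sum(p * math.log2(p) for p in probs.values())
    print(H)  # 1.75 = 7/4 bits per symbol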


The Discrete Channel with Noise

1 The received signal is not the same as the transmitted signal.

2 E = f(S, N): the received signal E can be regarded as a function of the transmitted signal S and the noise N.

3 The noise can also be considered a stochastic process, specified by a set of probabilities p_{\alpha,i}(\beta, j).

4 The capacity of a noisy channel can be defined as
C = \max\left( H(x) - H_y(x) \right)
where the maximum is taken over all possible information sources used as input to the channel.
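As an illustration of this definition (a sketch not in the slides), for a binary symmetric channel with crossover probability p the maximization works out to the well-known C = 1 - H(p), attained with a uniform input:

    import math

    def binary_entropy(p):
        """H(p) = -(p log2 p + q log2 q), with q = 1 - p."""
        return -sum(x * math.log2(x) for x in (p, 1.0 - p) if x > 0)

    def bsc_capacity(p):
        """C = max(H(x) - H_y(x)) for a binary symmetric channel = 1 - H(p)."""
        return 1.0 - binary_entropy(p)

    for p in (0.0, 0.1, 0.5):
        print(p, bsc_capacity(p))  # 1.0, ~0.531, 0.0 bits per channel use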


    The End
