16.548 Coding, Information Theory (and Advanced Modulation), Prof. Jay Weitzen, Ball 411, Jay_weitzen@uml.edu

Page 1:

16.548 Coding, Information Theory (and Advanced Modulation)

Prof. Jay Weitzen

Ball 411

Jay_weitzen@uml.edu

Page 2:

Class Coverage

• Fundamentals of Information Theory (4 weeks)
• Block Coding (3 weeks)
• Advanced coding and modulation as a way of achieving the Shannon capacity bound: convolutional coding, trellis modulation, turbo modulation, and space-time coding (7 weeks)

Page 3:

Course Web Site

• http://faculty.uml.edu/jweitzen/16.548
  – Class notes, assignments, and other materials on the web site
  – Please check at least twice per week
  – Lectures will be streamed; see the course website

Page 4:

Prerequisites (What you need to know to thrive in this class)

• 16.363 or 16.584 (a probability class)
• Some programming (C, VB, MATLAB)
• Digital communication theory

Page 5:

Grading Policy

• 4 mini-projects (25% each):
  – Lempel-Ziv compressor
  – Cyclic redundancy check (CRC)
  – Convolutional coder/decoder with soft-decision decoding
  – Trellis modulator/demodulator

Page 6:

Course Information and Text Books

• Coding and Information Theory by Wells, plus his notes from the University of Idaho
• Digital Communications by Sklar, or the Proakis book
• Shannon's original paper (1948)
• Other material on the web site

Page 7:

Claude Shannon Founds the Science of Information Theory in 1948

In his 1948 paper, "A Mathematical Theory of Communication," Claude E. Shannon formulated the theory of data compression. Shannon established that there is a fundamental limit to lossless data compression. This limit, called the entropy rate, is denoted by H. The exact value of H depends on the information source, more specifically, on the statistical nature of the source. It is possible to compress the source, in a lossless manner, with a compression rate close to H; it is mathematically impossible to do better than H.
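As a concrete example (not on the slide): a memoryless source emitting three symbols with probabilities 0.5, 0.3, and 0.2 has H = -(0.5 log2 0.5 + 0.3 log2 0.3 + 0.2 log2 0.2) ≈ 1.49 bits per symbol, so no lossless code can average fewer than about 1.49 bits per symbol on this source.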


Page 9:

This is Important

Page 10:

Source Modeling

Page 11:

Zero Order Models

It has been said that if you sit enough monkeys down at enough typewriters, eventually they will produce the complete works of Shakespeare.
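A minimal sketch of what these source models look like in code (illustrative only; the letter weights below are rough assumptions, not the course's figures). The zero-order generator is the "monkey"; the first-order generator previews the next slide:

```python
import random
import string

ALPHABET = string.ascii_uppercase + " "

def zero_order(n):
    """Monkey at a typewriter: every character uniform and independent."""
    return "".join(random.choice(ALPHABET) for _ in range(n))

# Rough single-letter weights (space, E, T, ...) for a first-order model;
# the numbers are illustrative assumptions, not measured frequencies.
FREQ = {" ": 0.19, "E": 0.10, "T": 0.07, "A": 0.06, "O": 0.06, "I": 0.06,
        "N": 0.06, "S": 0.05, "H": 0.05, "R": 0.05}
OTHER = [c for c in ALPHABET if c not in FREQ]
rest = (1.0 - sum(FREQ.values())) / len(OTHER)
PROBS = {**FREQ, **{c: rest for c in OTHER}}

def first_order(n):
    """Characters still independent, but weighted like English text."""
    return "".join(random.choices(list(PROBS), weights=list(PROBS.values()), k=n))

print(zero_order(60))   # uniform gibberish
print(first_order(60))  # gibberish with an English-like letter balance
```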

Page 12:

First Order Model

Page 13:

Higher Order Models


Page 20:

Zeroth Order Model


Page 22:

Definition of Entropy

Shannon used the ideas of randomness and entropy from the study of thermodynamics to estimate the randomness (i.e., the information content, or entropy) of a process.
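A minimal sketch of the resulting formula, assuming the standard definition H(X) = -Σ p_i log2 p_i, measured in bits:

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution (0 log 0 taken as 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))       # 1.0 bit: a fair coin
print(entropy([0.5, 0.3, 0.2]))  # ~1.485 bits: the distribution used in examples below
print(entropy([1.0]))            # 0.0 bits: a certain outcome carries no information
```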

Page 23:

Quick Review: Working with Logarithms
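The slide body is a figure; the identity presumably reviewed here, and used constantly in this course, is the change of base: log2 x = ln x / ln 2 = log10 x / log10 2. For example, log2 26 = ln 26 / ln 2 ≈ 4.70.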


Page 28:

Entropy of English Alphabet
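For context (figures not from the slide): if all 26 letters were equally likely, H = log2 26 ≈ 4.70 bits per letter. Weighting by typical single-letter English frequencies drops this to roughly 4.1 bits, and Shannon's own experiments with longer context suggest English carries only about 1 to 1.5 bits per letter, which is why text compresses so well.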


Page 34:

Kind of intuitive, but hard to prove.


Page 36:

Bounds on Entropy
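The slide body is a figure; the standard bounds are 0 ≤ H(X) ≤ log2 M for a source with M symbols, with H(X) = 0 when one symbol is certain and H(X) = log2 M exactly when all M symbols are equally likely.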

Page 37:

Math 495 Micro-Teaching

Quick Review: JOINT DENSITY OF RANDOM VARIABLES

David Sherman

Bedrock, USA

Page 38:

In this presentation, we’ll discuss the joint density of two random variables. This is a mathematical tool for representing the interdependence of two events.

First, we need some random variables.

Lots of those in Bedrock.

Page 39:

Let X be the number of days Fred Flintstone is late to work in a given week. Then X is a random variable; here is its density function:

N      1    2    3
f(N)   .5   .3   .2

Amazingly, another resident of Bedrock is late with exactly the same distribution. It's...

Fred's boss, Mr. Slate!

Page 40:

N      1    2    3
f(N)   .5   .3   .2

Remember this means that P(X=3) = .2.

Let Y be the number of days when Slate is late. Suppose we want to record BOTH X and Y for a given week. How likely are different pairs?

We’re talking about the joint density of X and Y, and we record this information as a function of two variables, like this:

        X=1   X=2   X=3
Y=1     .35   .10   .05
Y=2     .15   .10   .05
Y=3     .00   .10   .10

This means that P(X=3 and Y=2) = .05. We label it f(3,2).

Page 41:

N      1    2    3
f(N)   .5   .3   .2

        X=1   X=2   X=3
Y=1     .35   .10   .05
Y=2     .15   .10   .05
Y=3     .00   .10   .10

The first observation to make is that this joint probability function contains all the information from the density functions for X and Y (which are the same here).

For example, to recover P(X=3), we can add f(3,1) + f(3,2) + f(3,3) = .2.

The individual probability functions recovered in this way are called marginal.

Another observation here is that Slate is never late three days in a week when Fred is only late once.
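A quick sketch of that marginalization in code (numpy assumed; rows index Y, Slate, and columns index X, Fred):

```python
import numpy as np

# The Fred/Slate joint table from the slide (rows: Y = 1..3, cols: X = 1..3).
joint = np.array([[0.35, 0.10, 0.05],
                  [0.15, 0.10, 0.05],
                  [0.00, 0.10, 0.10]])

p_x = joint.sum(axis=0)   # sum over Y: marginal of X -> [0.5, 0.3, 0.2]
p_y = joint.sum(axis=1)   # sum over X: marginal of Y -> [0.5, 0.3, 0.2]

print(p_x, p_y)
print(joint[1, 2])        # f(3,2) = P(X=3 and Y=2) = 0.05
```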

Page 42:

N      1    2    3
f(N)   .5   .3   .2

Since he rides to work with Fred (at least until the directing career works out), Barney Rubble is late to work with the same probability function too. What do you think the joint probability function for Fred and Barney looks like?

        X=1   X=2   X=3
Y=1     .5    0     0
Y=2     0     .3    0
Y=3     0     0     .2

It’s diagonal!

This should make sense, since in any week Fred and Barney are late the same number of days.

This is, in some sense, a maximum amount of interaction: if you know one, you know the other.

Page 43:

N      1    2    3
f(N)   .5   .3   .2

A little-known fact: there is actually another famous person who is late to work like this.

SPOCK!

Before you try to guess what the joint density function for Fred and Spock is, remember that Spock lives millions of miles (and years) from Fred, so we wouldn’t expect these variables to influence each other at all.

(Pretty embarrassing for a Vulcan.)

In fact, they’re independent….

Page 44:

N      1    2    3
f(N)   .5   .3   .2

        X=1   X=2   X=3
Z=1     .25   .15   .10
Z=2     .15   .09   .06
Z=3     .10   .06   .04

Since we know the variables X and Z (for Spock) are independent, we can calculate each of the joint probabilities by multiplying.

For example, f(2,3) = P(X=2 and Z=3) = P(X=2) P(Z=3) = (.3)(.2) = .06.

This represents a minimal amount of interaction.
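A one-line check of that rule (numpy assumed): for independent variables, the whole joint table is just the outer product of the marginals.

```python
import numpy as np

p = np.array([0.5, 0.3, 0.2])   # the shared marginal for X (Fred) and Z (Spock)
joint_indep = np.outer(p, p)    # joint_indep[i, j] = p[i] * p[j]

print(joint_indep[2, 1])        # f(2,3) = P(X=2 and Z=3) = 0.3 * 0.2 = 0.06
print(joint_indep.sum())        # 1.0: the table is still a valid distribution
```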

Page 45:

Dependence of two events means that knowledge of one gives information about the other.

Now we've seen that the joint density of two variables is able to reveal that two events are independent (Fred and Spock), completely dependent (Fred and Barney), or somewhere in the middle (Fred and Slate).

Later in the course we will learn ways to quantify dependence. Stay tuned….

YABBA DABBA DOO!


Page 48:

Marginal Density Functions
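The slide body is a figure; the general recipe is the standard one: for a discrete joint density f(x, y), the marginals are obtained by summing out the other variable, fX(x) = Σy f(x, y) and fY(y) = Σx f(x, y), exactly as in the Fred/Slate example above.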

Page 49:

Conditional Probability

The probability of one event given another event:

P(A|B) = P(A ∩ B) / P(B)

Page 50:

Conditional Probability (cont’d)

By symmetry, P(B|A) = P(A ∩ B) / P(A).

Page 51:

Definition of conditional probability

• If P(B) is not equal to zero, then the conditional probability of A relative to B, namely, the probability of A given B, is

P(A|B) = P(A ∩ B) / P(B)

from which

P(A ∩ B) = P(B) P(A|B)

or

P(A ∩ B) = P(A) P(B|A)

Page 52:

Conditional Probability

Venn diagram regions: P(A only) = 0.25, P(A ∩ B) = 0.25, P(B only) = 0.45.

P(A) = 0.25 + 0.25 = 0.50
P(B) = 0.45 + 0.25 = 0.70
P(A') = 1 - 0.50 = 0.50
P(B') = 1 - 0.70 = 0.30

P(A|B) = P(A ∩ B) / P(B) = 0.25 / 0.70 = 0.357
P(B|A) = P(A ∩ B) / P(A) = 0.25 / 0.50 = 0.5

Page 53:

Law of Total Probability

If B1, B2, ..., Bk are mutually exclusive events of which one must occur, then for any event A

P(A) = P(B1) P(A|B1) + P(B2) P(A|B2) + ... + P(Bk) P(A|Bk)

Special case of the rule of total probability (k = 2, with events B and B'):

P(A) = P(B) P(A|B) + P(B') P(A|B')

Page 54:

Bayes' Theorem


Page 56:

Generalized Bayes’ theorem

If B1, B2, ..., Bk are mutually exclusive events of which one must occur, then

P(Bi|A) = P(Bi) P(A|Bi) / [ P(B1) P(A|B1) + P(B2) P(A|B2) + ... + P(Bk) P(A|Bk) ]

for i = 1, 2, ..., k.

Page 57:

Urn Problems

• Applications of Bayes' theorem
• Begin to think about the concepts of maximum-likelihood (ML) and MAP detection, which we will use throughout coding theory; a sketch follows below
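A hypothetical worked example in that spirit (the urn contents, prior, and numbers below are invented for illustration, not taken from the slides):

```python
# Hypothetical urns: urn 1 holds 3 red / 1 black, urn 2 holds 1 red / 3 black.
# An urn is chosen with prior P(urn1) = 0.9, and one ball is drawn: it is black.
prior = {"urn1": 0.9, "urn2": 0.1}
like_black = {"urn1": 0.25, "urn2": 0.75}   # likelihoods P(black | urn)

# Maximum likelihood (ML) ignores the prior and picks the urn that best
# explains the observed black ball:
ml = max(like_black, key=like_black.get)    # -> 'urn2'

# MAP weighs the likelihood by the prior, using Bayes' theorem with the law
# of total probability in the denominator:
p_black = sum(prior[u] * like_black[u] for u in prior)          # = 0.30
post = {u: prior[u] * like_black[u] / p_black for u in prior}   # urn1: 0.75
map_est = max(post, key=post.get)           # -> 'urn1'

print(ml, map_est, post)  # the strong prior overrides the likelihood
```

ML and MAP disagree here precisely because the prior is so lopsided; when the priors are equal, the two decision rules coincide.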


End of Notes 1