
Broadcast Channels

Vlad Gheorghiu

Department of Physics, Carnegie Mellon University

Pittsburgh, PA 15213, U.S.A.

1 October 2009


Outline

1 References

2 Brief introduction to classical information theory
    Classical Channels
    Codes and The Fundamental Channel Coding Theorem

3 Broadcast channels
    The problem
    Minimax schemes
    Time-sharing schemes
    Capacity region diagrams
    Superimposing of information: Switch-and-talk channels
    Superimposing of information: BSC channels

4 Extension to the quantum regime
    Mocking a classical channel in the quantum regime
    Quantum incompatibility

This talk is available online at http://quantum.phys.cmu.edu/QIP


References

1 Thomas M. Cover and Joy A. Thomas, Elements of Information Theory, Wiley-Interscience (2005)

2 Thomas M. Cover, Broadcast channels, IEEE Trans. Inf. Theory, vol. 18, no. 1, Jan. (1972)


Classical Channels

What is information?

Information is physical.

Correlations between two physical systems.

Letters in a newspaper get correlated to your visual receptors.

Can one easily describe information from a mathematical point of view?

If you are Shannon, then yes.


Introduce channels: correlations between two random variables.

Source symbols from some finite alphabet are mapped into some sequence of channel symbols, which then produces the output sequence of the channel.

Each of the possible input sequences induces a probability distribution on the output sequence.

Information theory is a probabilistic theory!

The goal is to reconstruct the input from the output with a negligible probability of error.

The maximum rate at which this can be done is called the capacity of the channel.


Discrete channels

A discrete channel is a system consisting of an input alphabet X, an output alphabet Y, and a probability transition matrix p(y|x) that expresses the probability of observing the output symbol y given that we send the symbol x.

The channel is said to be memoryless if the probability distribution of the output depends only on the input at that time and is conditionally independent of previous channel inputs or outputs.

Usual notation: (X, p(y|x), Y).

Channel capacity of a discrete memoryless channel (Shannon)

The channel capacity is given by

C = max_{p(x)} I(X;Y),

where the maximum is taken over all possible input distributions p(x) and I(X;Y) is the mutual information between the random variables X and Y.
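A minimal Python sketch of this maximization (illustrative only; the crossover probability p = 0.11 is an arbitrary choice): it brute-forces I(X;Y) over binary input distributions for a given transition matrix and compares the result with the closed form 1 − H(p) for a BSC.

    import numpy as np

    def mutual_information(p_x, p_y_given_x):
        # I(X;Y) in bits for input distribution p_x and transition matrix p(y|x).
        p_xy = p_x[:, None] * p_y_given_x               # joint distribution p(x, y)
        p_y = p_xy.sum(axis=0)                          # output marginal p(y)
        mask = p_xy > 0
        return float(np.sum(p_xy[mask] * np.log2(p_xy[mask] / (p_x[:, None] * p_y)[mask])))

    def capacity_binary_input(p_y_given_x, grid=10001):
        # Brute-force C = max over p(x) of I(X;Y), for a channel with two input symbols.
        return max(mutual_information(np.array([q, 1.0 - q]), p_y_given_x)
                   for q in np.linspace(0.0, 1.0, grid))

    p = 0.11                                            # BSC crossover probability (arbitrary)
    bsc = np.array([[1 - p, p], [p, 1 - p]])
    h = -p * np.log2(p) - (1 - p) * np.log2(1 - p)      # binary entropy H(p)
    print(capacity_binary_input(bsc), 1 - h)            # both are close to 0.5 bit for p = 0.11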


Examples of channel capacities

Example 1: Noiseless Binary Channel. C = 1 bit.

Example 2: Noisy Channel with Nonoverlapping Outputs. C = 1 bit.

Binary Symmetric Channel (BSC). C = 1 − H(p) bits.


Codes

Can one saturate (at least in theory) the channel capacity?

Yes, always, provided one is smart enough.

First, one must introduce codes (a fancy name for redundancy).

Codes

An (M, n) code for a discrete memoryless channel (X, p(y|x), Y) consists of the following:

1 An index set {1, 2, . . . , M}.

2 An encoding function X^n : {1, 2, . . . , M} → X^n, yielding codewords x^n(1), x^n(2), . . . , x^n(M). The set of codewords is called the codebook.

3 A decoding function g : Y^n → {1, 2, . . . , M}, which is a deterministic rule that assigns a guess to each possible received vector.

The repetition code: 0 → 000, 1 → 111. An (M = 2, n = 3) code.
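The repetition code lends itself to a tiny simulation. A short Python sketch (illustrative; p = 0.1 is an arbitrary crossover probability): each source bit is repeated three times, sent through a BSC, and decoded by majority vote, so the residual bit error rate should be close to 3p²(1 − p) + p³.

    import numpy as np

    rng = np.random.default_rng(0)
    p = 0.1                                              # BSC crossover probability (arbitrary)
    n_trials = 100_000

    bits = rng.integers(0, 2, n_trials)                  # source bits
    codewords = np.repeat(bits[:, None], 3, axis=1)      # encode: 0 -> 000, 1 -> 111
    flips = (rng.random(codewords.shape) < p).astype(int)
    received = codewords ^ flips                         # BSC flips each symbol with prob. p
    decoded = (received.sum(axis=1) >= 2).astype(int)    # majority-vote decoding

    print((decoded != bits).mean())                      # close to 3*p**2*(1-p) + p**3 = 0.028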


The Fundamental Channel Coding Theorem

Channel Coding Theorem (Shannon)

For a discrete memoryless channel, all rates below capacity C are achievable. Specifically, for every rate R < C, there exists a sequence of (2^{nR}, n) codes with negligible maximum probability of error. Conversely, any sequence of (2^{nR}, n) codes with negligible probability of error must have R ≤ C.

Although the existence of good codes is proved, there is (yet) no method for constructing them for arbitrary channels, mainly because the existence proof is based on the idea of random coding!

This is one of the main research topics in Classical Information Theory.


Broadcast channels: Basic problem

One source, k > 1 receivers.

The goal is to send information with negligible probability of error to all receivers.

More formally, one wants to find the set of simultaneously achievable rates (R1, R2, . . . , Rk), or the capacity region.

Why is it interesting? One can easily guess...

Broadcasting TV information, giving a lecture to a group of disparate backgrounds and aptitudes, etc.

The broadcast problem belongs to the still open research area of Network Information Theory.


Naive approach: Minimax schemes

Suppose the transmission channels to the receivers have respective channel capacities C1, C2, . . . , Ck bits per second.

Send at rate Cmin = min(C1, C2, . . . , Ck).

Even this is only possible when the channels are “compatible” (uncorrelated, or orthogonal).

The transmission rate is limited by the worst channel.

At the other extreme, one may try to send at rate R = Cmax, with resulting rates Ri = 0 for all but the best channel (say the k-th one), and Rk = Cmax for the best channel.

Definitely these schemes do not look optimal.


Less naive approach: Time-sharing schemes

Next idea: time sharing.

Allocate portions of time λ1, λ2, . . . , λk, with λi ≥ 0 and Σ_i λi = 1, to sending at rates C1, C2, . . . , Ck.

Assuming compatibility of the channels and assuming C1 ≤ C2 ≤ · · · ≤ Ck, one finds that the rate of transmission through the i-th channel is

Ri = Σ_{j ≤ i} λj Cj ,   i = 1, 2, . . . , k.
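A small numerical sketch of these time-sharing rates (illustrative only; the capacities and time fractions below are made-up values):

    import numpy as np

    C = np.array([0.5, 1.0, 2.0])      # channel capacities C1 <= C2 <= C3 (made-up values)
    lam = np.array([0.2, 0.3, 0.5])    # time fractions lambda_i >= 0 summing to 1

    # Receiver i can decode every block sent at a rate it supports, i.e. all time
    # slots j with C_j <= C_i, so R_i = sum over j <= i of lambda_j * C_j.
    R = np.cumsum(lam * C)
    print(R)                           # [0.1, 0.4, 1.4]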


Examples of capacity region (or rate trade-off) diagrams

Rate trade-off examples for 1 → 2 broadcast channels, with X = {1, 2, 3, 4}, Y1 = {1, 2}, Y2 = {1, 2}.

Orthogonal channels (rows of each P indexed by the input symbol, columns by the output symbol; semicolons separate rows): P1 = [1, 0; 1, 0; 0, 1; 0, 1], P2 = [1, 0; 0, 1; 1, 0; 0, 1].

Completely incompatible channels: P1 = [1, 0; 0, 1; 1/2, 1/2; 1/2, 1/2], P2 = [1/2, 1/2; 1/2, 1/2; 1, 0; 0, 1].

The Switch-and-talk channel. Analogy: a speaker fluent in Spanish and English must speak simultaneously to two listeners, one of whom understands only Spanish and the other only English.

Conjecture: the time-sharing scheme is optimal. This conjecture is FALSE!


Smart approach: superimposing of information

Can one do better than time-sharing for the Switch-and-talk channel? YES!

“Super-impose” the information. Make use of the fact that the English receiver does not understand Spanish and vice-versa, but both realize when the sender does not broadcast in their language, to send extra information, common to both parties!

If channel 1 is used in proportion α of the time, then αC1 bits/transmission are received by Y1 and (1 − α)C2 by Y2.

However, H(α) additional common bits/transmission may be transmitted, by choosing the ordering in which the channels are used (constrained by α).


In other words, modulation of the switch-to-talk button, subject to the time-proportion constraint α, allows the perfect transmission of 2^{nH(α)} additional messages to both Y1 and Y2!

Thus all rates (R1, R2) of the form

(R1, R2) = (αC1 + H(α), (1 − α)C2 + H(α))

can be achieved by choosing the subset of n transmissions devoted to the use of channel 1 in one of the (n choose αn) ≈ 2^{nH(α)} possible ways.
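A short Python sketch of how this superposition rate pair compares with plain time-sharing (illustrative only; C1 = C2 = 1 are placeholder capacities):

    import numpy as np

    def Hb(a):
        # Binary entropy in bits, with the convention H(0) = H(1) = 0.
        a = np.clip(a, 1e-12, 1 - 1e-12)
        return -a * np.log2(a) - (1 - a) * np.log2(1 - a)

    C1, C2 = 1.0, 1.0                                    # per-channel capacities (placeholders)
    alpha = np.linspace(0.0, 1.0, 11)

    R1_ts, R2_ts = alpha * C1, (1 - alpha) * C2          # plain time-sharing
    R1_sp, R2_sp = R1_ts + Hb(alpha), R2_ts + Hb(alpha)  # with the superimposed common bits

    for a, r1, r2 in zip(alpha, R1_sp, R2_sp):
        print(f"alpha={a:.1f}  (R1, R2) = ({r1:.3f}, {r2:.3f})")
    # At alpha = 0.5 the pair is (1.5, 1.5), well outside the time-sharing
    # line R1 + R2 = 1 that holds for C1 = C2 = 1.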


Another example: Broadcast through BSC

Consider a broadcast scheme with two BSCs, X = {1, 2}, Y1 = {1, 2}, Y2 = {1, 2}, and P1 = [1, 0; 0, 1], P2 = [1 − p, p; p, 1 − p] (channel 1 is noiseless and channel 2 is a BSC with error probability p).

Naive rate trade-off diagram (on the board).

How to improve?

Use a code designed for a noisier channel: a cascade of two BSCs with parameters p and α.

Fewer codewords, 2^{nC(α(1−p) + (1−α)p)}, but larger tolerated noise.

Take advantage of this tolerance by packing in some extra message information intended solely for the perfect receiver Y1.


With each codeword x ∈ {0, 1}^n we associate the set of all codewords at Hamming distance equal to [αn] (see the picture on the board).

This code structure allows the transmission of an arbitrary integer r ∈ {1, 2, . . . , 2^{nC(α(1−p) + (1−α)p)}} to both receivers 1 and 2.

In addition, an arbitrary integer s ∈ {1, 2, . . . , (n choose αn)} can be sent only to receiver 1.

Comments on the “cloud structure” and the decoding strategy.

Since there are (n choose αn) ≈ 2^{nH(α)} points per cloud, the transmission rate for channel 1 is

R1 = C(α(1−p) + (1−α)p) + H(α).

Channel 2 has the rate of the cascaded BSC, i.e.

R2 = C(α(1−p) + (1−α)p).
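To make the comparison concrete, a short Python sketch (illustrative; p = 0.1 and the α grid are arbitrary) evaluates R1 and R2 for this superposition code and, for the same R2, the R1 achievable by time-sharing between the individual capacities C1 = 1 and C2 = 1 − H(p):

    import numpy as np

    def Hb(a):
        # Binary entropy in bits, with the convention H(0) = H(1) = 0.
        a = np.clip(a, 1e-12, 1 - 1e-12)
        return -a * np.log2(a) - (1 - a) * np.log2(1 - a)

    p = 0.1                                       # crossover probability of channel 2 (arbitrary)
    cap = lambda q: 1 - Hb(q)                     # capacity of a BSC with crossover probability q

    alpha = np.linspace(0.0, 0.5, 6)
    cascade = alpha * (1 - p) + (1 - alpha) * p   # crossover of BSC(p) cascaded with BSC(alpha)

    R1 = cap(cascade) + Hb(alpha)                 # superposition rate to the perfect receiver
    R2 = cap(cascade)                             # rate to the noisy receiver
    R1_ts = 1.0 * (1 - R2 / cap(p))               # time-sharing R1 achievable at the same R2

    for a, r1, r2, ts in zip(alpha, R1, R2, R1_ts):
        print(f"alpha={a:.1f}  superposition (R1, R2) = ({r1:.3f}, {r2:.3f})  time-sharing R1 = {ts:.3f}")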


This is better than the time-sharing scheme!

Conclusion: superimpose high-rate information on low-rate information!


Classical channels in the quantum regime

Introduce an isometry from an input Hilbert space H_in to a multi-partite Hilbert space H_A ⊗ H_B ⊗ · · · ⊗ H_Z. For our purposes, let's restrict to a bipartite splitting.

Let T = {P_j} be a decomposition of the identity on the input Hilbert space in terms of a set of orthogonal projectors, I = Σ_j P_j.

We call T a type of information.

This scenario “simulates” a classical broadcast channel, and the projectors play the role of the input alphabet (one can regard them as symbols).

Distinguishability of traced-down encoded projectors implies information transfer.

Given the isometry and the type of information, one can easily find the transition matrix.

In principle, the capacity diagram can be calculated.
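A rough numerical sketch of these last two points (a toy example, not the actual construction used here): take a hypothetical isometry V from a qubit input space to two qubits A ⊗ B, use the type T = {|0〉〈0|, |1〉〈1|}, trace the encoded projectors down to A, and read off a transition matrix; for concreteness, receiver A is assumed to measure in the computational basis.

    import numpy as np

    # Hypothetical isometry V : H_in -> H_A ⊗ H_B (2 -> 4 dimensions); its columns are the
    # images of the input basis: V|0> = |00>,  V|1> = (|01> + |10>)/sqrt(2).
    V = np.zeros((4, 2), dtype=complex)
    V[0, 0] = 1.0
    V[1, 1] = V[2, 1] = 1.0 / np.sqrt(2)

    # Type of information T = {|0><0|, |1><1|} on the input space (the "symbols").
    kets = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]

    def partial_trace_B(rho_AB, dA=2, dB=2):
        # Trace out subsystem B of a (dA*dB) x (dA*dB) density matrix.
        return rho_AB.reshape(dA, dB, dA, dB).trace(axis1=1, axis2=3)

    # Induced transition matrix p(a|j) for receiver A, assuming a computational-basis
    # measurement on A (a simplification made only for this illustration).
    p = np.zeros((2, 2))
    for j, psi in enumerate(kets):
        out = V @ psi                                     # encoded pure state on A ⊗ B
        rho_A = partial_trace_B(np.outer(out, out.conj()))
        p[j] = np.real(np.diag(rho_A))

    print(np.round(p, 3))    # rows: input symbol, columns: outcome; here [[1, 0], [0.5, 0.5]]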


Incompatible types induce different classical channels on the same isometry

Two types of information are compatible if and only if all projectors in one commute with all projectors of the other one.

Otherwise, they are called incompatible.

Example: the σX and σZ Pauli spin operators define two incompatible types through their spectral decompositions, i.e. {|+〉〈+|, |−〉〈−|} is incompatible with {|0〉〈0|, |1〉〈1|}.
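A quick numerical check of this example (illustration only): the computational-basis projectors fail to commute with the |±〉 projectors, so the two types do not satisfy the compatibility criterion above.

    import numpy as np

    P0 = np.array([[1.0, 0.0], [0.0, 0.0]])               # |0><0|
    P1 = np.eye(2) - P0                                    # |1><1|
    plus = np.array([1.0, 1.0]) / np.sqrt(2)
    Qp = np.outer(plus, plus)                              # |+><+|
    Qm = np.eye(2) - Qp                                     # |-><-|

    def commute(A, B):
        return np.allclose(A @ B, B @ A)

    # Compatibility requires every projector of one type to commute with every
    # projector of the other type; here that fails.
    print(all(commute(P, Q) for P in (P0, P1) for Q in (Qp, Qm)))   # False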

Main question: given some isometry and the capacity diagram for some type of information T, what are the restrictions (if any) imposed on the capacity diagram for some other incompatible type Q?

The notion of incompatibility is quantum-mechanical. There is no classical analog!

This is a hard question, for which we do not have (yet) an answer.

An answer may be of use in the study of quantum channel capacities.
