Information Complexity: an Overview
Rotem Oshman, Princeton CCI
Based on work by Braverman, Barak, Chen, Rao, and others
Charles River Science of Information Day 2014


Page 1: Information Complexity: an Overview

Information Complexity: an Overview

Rotem Oshman, Princeton CCI
Based on work by Braverman, Barak, Chen, Rao, and others
Charles River Science of Information Day 2014

Page 2: Information Complexity: an Overview

Classical Information Theory

• Shannon ‘48, A Mathematical Theory of Communication:

Page 3: Information Complexity: an Overview

Motivation: Communication Complexity

[Figure: Alice holds X, Bob holds Y; they want to determine whether X = Y.]

Yao ‘79, “Some complexity questions related to distributive computing”

Page 4: Information Complexity: an Overview

Motivation: Communication Complexity

[Figure: Alice holds X, Bob holds Y.]

More generally: solve some task

Yao ‘79, “Some complexity questions related to distributive computing”

Page 5: Information Complexity: an Overview

• Applications:
  – Circuit complexity
  – Streaming algorithms
  – Data structures
  – Distributed computing
  – Property testing
  – …

Motivation: Communication Complexity

Page 6: Information Complexity: an Overview

Example: Streaming Lower Bounds

• Streaming algorithm: the data streams past the algorithm in one pass.
  How much space is required to approximate f(data)?

• Reduction from communication complexity [Alon, Matias, Szegedy ’99]

Page 7: Information Complexity: an Overview

Example: Streaming Lower Bounds

• Streaming algorithm: the data streams past the algorithm in one pass.

• Reduction from communication complexity [Alon, Matias, Szegedy ’99]:
  split the stream between Alice and Bob; Alice runs the algorithm on her half and sends the state of the algorithm to Bob.
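The reduction sketched on this slide can be written out in code. A minimal sketch, with an illustrative (not from the slides) exact distinct-elements algorithm standing in for a real small-space sketch: any one-pass algorithm with s bits of state yields a one-way protocol with s bits of communication, because Alice can run the algorithm on her half of the stream and ship its state to Bob.

```python
class CountDistinct:
    # Toy one-pass algorithm (exact, so its state is large; a real
    # streaming sketch would keep the state small).
    def __init__(self):
        self.state = set()          # the algorithm's entire memory
    def process(self, item):
        self.state.add(item)
    def output(self):
        return len(self.state)

def one_way_protocol(alice_stream, bob_stream):
    # Generic reduction: Alice runs the streaming algorithm on her half of
    # the stream and sends its state; Bob resumes from that state. The
    # communication is exactly the size of the algorithm's state.
    alg = CountDistinct()
    for x in alice_stream:
        alg.process(x)
    message = alg.state             # <-- the only thing Alice sends
    alg2 = CountDistinct()
    alg2.state = set(message)       # Bob restores Alice's state
    for y in bob_stream:
        alg2.process(y)
    return alg2.output()
```

So a communication lower bound for the two-party problem translates directly into a space lower bound for the streaming algorithm.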

Page 8: Information Complexity: an Overview

Advances in Communication Complexity

• Very successful in proving unconditional lower bounds, e.g.,
  – for set disjointness [KS ’92, Razborov ’92]
  – for gap Hamming distance [Chakrabarti, Regev ’10]

• But stuck on some hard questions
  – Multi-party communication complexity
  – Karchmer–Wigderson games

• [Chakrabarti, Shi, Wirth, Yao ’01], [Bar-Yossef, Jayram, Kumar, Sivakumar ’04]: use tools from information theory

Page 9: Information Complexity: an Overview

Extending Information Theory to Interactive Computation

• One-way communication:
  – Task: send X across the channel
  – Cost: H(X) bits
    • Shannon: H(X) bits per instance, in the limit over many instances
    • Huffman: at most H(X) + 1 bits for one instance

• Interactive computation:
  – Task: e.g., compute f(X, Y)
  – Cost?
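The Shannon/Huffman bullets can be checked numerically. A minimal sketch (the distribution is a made-up example): for a dyadic source the expected Huffman code length equals H(X) exactly, and in general it lies in [H(X), H(X) + 1).

```python
import heapq
from math import log2

def entropy(p):
    # H(X) = -sum_x p(x) log2 p(x)
    return -sum(q * log2(q) for q in p.values() if q > 0)

def huffman_lengths(p):
    # Build a Huffman tree bottom-up; return the code length of each symbol.
    heap = [(q, [s]) for s, q in p.items()]
    heapq.heapify(heap)
    lengths = {s: 0 for s in p}
    while len(heap) > 1:
        q1, syms1 = heapq.heappop(heap)
        q2, syms2 = heapq.heappop(heap)
        for s in syms1 + syms2:       # every symbol under the merged node
            lengths[s] += 1           # sinks one level deeper in the tree
        heapq.heappush(heap, (q1 + q2, syms1 + syms2))
    return lengths

# A made-up dyadic source distribution: H(X) = 1.75 bits.
p = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
L = huffman_lengths(p)
expected_len = sum(p[s] * L[s] for s in p)
assert entropy(p) <= expected_len < entropy(p) + 1
```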

Page 10: Information Complexity: an Overview

Information Cost

• Reminder: mutual information
  I(X; Y) = H(X) − H(X | Y) = H(Y) − H(Y | X)

• Conditional mutual information:
  I(X; Y | Z) = H(X | Z) − H(X | Y, Z)

• Basic properties:
  – I(X; Y) ≥ 0, and I(X; Y) = 0 iff X, Y are independent
  – Chain rule: I(X, Y; Z) = I(X; Z) + I(Y; Z | X)
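These definitions can be computed directly from a joint distribution. A minimal sketch (the joint pmf is a made-up example; the helper functions are generic entropy computations, not from the slides) verifying the chain rule numerically:

```python
from math import log2

def H(dist):
    # Shannon entropy of a pmf given as {outcome: probability}.
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def marginal(joint, idxs):
    # Marginalize a joint pmf on tuples down to the coordinates in idxs.
    m = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in idxs)
        m[key] = m.get(key, 0.0) + p
    return m

def mi(joint, a, b, cond=()):
    # I(A; B | C) = H(A,C) + H(B,C) - H(A,B,C) - H(C)
    return (H(marginal(joint, a + cond)) + H(marginal(joint, b + cond))
            - H(marginal(joint, a + b + cond)) - H(marginal(joint, cond)))

# A made-up joint pmf of (X, Y, Z); coordinates are indexed 0, 1, 2.
joint = {
    (0, 0, 0): 0.125, (0, 0, 1): 0.0625, (0, 1, 0): 0.0625, (0, 1, 1): 0.25,
    (1, 0, 0): 0.25,  (1, 0, 1): 0.125,  (1, 1, 0): 0.0625, (1, 1, 1): 0.0625,
}

# Chain rule: I(X,Y; Z) = I(X; Z) + I(Y; Z | X)
lhs = mi(joint, (0, 1), (2,))
rhs = mi(joint, (0,), (2,)) + mi(joint, (1,), (2,), cond=(0,))
assert abs(lhs - rhs) < 1e-9
```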

Page 11: Information Complexity: an Overview

Information Cost

• Fix a protocol Π
• Notation abuse: let Π also denote the transcript of the protocol
• Two ways to measure information cost:
  – External information cost: I(X, Y; Π) — what an outside observer learns about the inputs
  – Internal information cost: I(X; Π | Y) + I(Y; Π | X) — what the players learn about each other’s inputs
  – Cost of a task: infimum over all protocols for it
  – Which cost is “the right one”?

Page 12: Information Complexity: an Overview

Information Cost: Basic Properties

External information: I(X, Y; Π)   Internal information: I(X; Π | Y) + I(Y; Π | X)

• Internal ≤ external
• Internal can be much smaller, e.g.:
  – X = Y, uniform over {0, 1}ⁿ
  – Alice sends X to Bob: external cost is n, internal cost is 0
• But the two are equal if X and Y are independent
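The X = Y example can be verified numerically. A minimal sketch (the helpers are generic entropy computations, not from the slides): for the one-message protocol in which Alice sends X, the transcript carries n bits externally but zero bits internally.

```python
from itertools import product
from math import log2

def H(dist):
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def marginal(joint, idxs):
    m = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in idxs)
        m[key] = m.get(key, 0.0) + p
    return m

def mi(joint, a, b, cond=()):
    # I(A; B | C) = H(A,C) + H(B,C) - H(A,B,C) - H(C)
    return (H(marginal(joint, a + cond)) + H(marginal(joint, b + cond))
            - H(marginal(joint, a + b + cond)) - H(marginal(joint, cond)))

n = 3
# X = Y uniform over {0,1}^n; the transcript T of "Alice sends X" is X.
# Coordinates of each outcome tuple: (X, Y, T).
joint = {(x, x, x): 1 / 2 ** n for x in product((0, 1), repeat=n)}

external = mi(joint, (0, 1), (2,))                       # I(XY; T) = n
internal = (mi(joint, (0,), (2,), cond=(1,))             # I(X; T | Y)
            + mi(joint, (1,), (2,), cond=(0,)))          # + I(Y; T | X) = 0
```

Bob already knows X (it equals his Y), so he learns nothing; an external observer learns all n bits.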

Page 13: Information Complexity: an Overview

Information Cost: Basic Properties

External information: I(X, Y; Π)   Internal information: I(X; Π | Y) + I(Y; Π | X)

• External information ≤ communication:
  I(X, Y; Π) ≤ H(Π) ≤ |Π|

Page 14: Information Complexity: an Overview

Information Cost: Basic Properties

• Internal information ≤ communication cost:
  I(X; Π | Y) + I(Y; Π | X) ≤ |Π|

• By induction: let Π_r denote the first r messages of the transcript.
• I(X; Π_r | Y): what we know after r rounds
  = I(X; Π_{r−1} | Y)        ← what we knew after r−1 rounds (≤ |Π_{r−1}| by I.H.)
  + I(X; M_r | Y, Π_{r−1})   ← what we learn in round r, given what we already know

Page 15: Information Complexity: an Overview

Information vs. Communication

• Want: I(X; M_r | Y, Π_{r−1}) + I(Y; M_r | X, Π_{r−1}) ≤ |M_r|
• Suppose M_r is sent by Alice.
• What does Alice learn?
  – M_r is a function of X and Π_{r−1}, so I(Y; M_r | X, Π_{r−1}) = 0

• What does Bob learn?
  – I(X; M_r | Y, Π_{r−1}) ≤ H(M_r) ≤ |M_r|

Page 16: Information Complexity: an Overview

Information vs. Communication

• We have:
  – Internal information ≤ communication
  – External information ≤ communication
  – Internal information ≤ external information

Page 17: Information Complexity: an Overview

Information vs. Communication

• “Information cost = communication cost”?
  – In the limit: internal information! [Braverman, Rao ’10]
  – For one instance: external information! [Barak, Braverman, Chen, Rao ’10]

Big question: can protocols be compressed down to their internal information cost?
  – [Ganor, Kol, Raz ’14]: no!
  – There is a task with internal IC = O(k) but CC = 2^Ω(k).
  – … but: remains open for functions with small output.

Page 18: Information Complexity: an Overview

Information vs. Amortized Communication

• Theorem [Braverman, Rao ‘10]:
  lim_{n→∞} CC(fⁿ) / n = IC(f)   (internal information cost)

• The “≤” direction: compression
• The “≥” direction: direct sum
• We know: CC(fⁿ) ≤ n · CC(f)
• We can show: CC(fⁿ) ≳ n · IC(f)

Page 19: Information Complexity: an Overview

Direct Sum Theorem [BR‘10]

• Let Π be a protocol for fⁿ on n-copy inputs
• Construct Π′ for f as follows:
  – Alice and Bob get inputs X, Y
  – Choose a random coordinate i, set X_i = X, Y_i = Y
  – Bad idea: publicly sample the remaining coordinates

[Figure: the single-instance inputs X, Y embedded into the n-coordinate inputs.]

Page 20: Information Complexity: an Overview

Direct Sum Theorem [BR‘10]

• Let Π be a protocol for fⁿ on n-copy inputs
• Construct Π′ for f as follows:
  – Alice and Bob get inputs X, Y
  – Choose a random coordinate i, set X_i = X, Y_i = Y
  – Bad idea: publicly sample the remaining coordinates

Suppose in Π, Alice sends X_1 ⊕ … ⊕ X_n.
In Π, Bob learns one bit about the whole input, so in Π′ he should learn only ~1/n bits.
But if the other coordinates are public, Bob learns a full bit about X!

Page 21: Information Complexity: an Overview

Direct Sum Theorem [BR‘10]

• Let Π be a protocol for fⁿ on n-copy inputs
• Construct Π′ for f as follows:
  – Alice and Bob get inputs X, Y
  – Choose a random coordinate i, set X_i = X, Y_i = Y
  – Publicly sample the coordinates on one side of i; each player privately samples the missing coordinates on the other side

[Figure: the remaining X- and Y-coordinates around the embedded coordinate i, partly sampled publicly and partly privately by Alice and Bob.]
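The embedding can be sketched in code. A hedged sketch under simplifying assumptions: `pi_n` is an illustrative stand-in for an arbitrary protocol for fⁿ (here, coordinate-wise AND), the padding coordinates are drawn as independent uniform bits, and the careful public/private split from the slide is elided.

```python
import random

def pi_n(xs, ys):
    # Stand-in for a protocol computing f^n: coordinate-wise AND.
    return [a & b for a, b in zip(xs, ys)]

def pi_one(x, y, n=5):
    # Embedding sketch: plant (x, y) at a random coordinate i and fill the
    # remaining coordinates with sampled bits (here uniform; the slide's
    # public/private sampling split is elided in this sketch).
    i = random.randrange(n)
    xs = [random.randint(0, 1) for _ in range(n)]
    ys = [random.randint(0, 1) for _ in range(n)]
    xs[i], ys[i] = x, y
    return pi_n(xs, ys)[i]   # read off the answer at the planted coordinate
```

The point of the careful split is that the transcript of Π′ then reveals only a ~1/n fraction of Π’s internal information about the planted coordinate.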

Page 22: Information Complexity: an Overview

Compression

• What we know: a protocol with communication C, internal info I, and external info I_ext can be compressed to
  – Õ(√(I · C)) bits [BBCR ’10]
  – O(I_ext · log C) bits [BBCR ’10]
  – 2^{O(I)} bits [Braverman ’10]

• Major open question: can we compress to O(I)?
  – [GKR ’14, partial answer: no]

Page 23: Information Complexity: an Overview

Using Information Complexity to Prove Communication Lower Bounds

• Internal/external info ≤ communication
• Essentially the most powerful technique known
  – [Kerenidis, Laplante, Lerays, Roland, Xiao ’12]: most lower bound techniques imply IC lower bounds

• Disadvantage: hard to show incompressibility!
  – Must exhibit a problem with low IC, high CC
  – But proving high CC usually proves high IC…

Page 24: Information Complexity: an Overview

Extending IC to Multiple Players

• Recent interest in multi-player number-in-hand communication complexity

• Motivated by “big data”:
  – Streaming and sketching, e.g., [Woodruff, Zhang ’11, ’12, ’13]
  – Distributed learning, e.g., [Awasthi, Balcan, Long ’14]

Page 25: Information Complexity: an Overview

Extending IC to Multiple Players

• Multi-player computation traditionally hard to analyze

• [Braverman, Ellen, O., Pitassi, Vaikuntanathan]: an Ω(nk) lower bound for Set Disjointness with n elements, k players, private channels, and number-in-hand (NIH) input

Page 26: Information Complexity: an Overview

Information Complexity on Private Channels

• First obstacle: secure multi-party computation
• [Goldreich, Micali, Wigderson ’87]: any function can be computed with perfect information-theoretic security when sufficiently many players are honest
  – Solution: redefine information cost, measure both
    • the information a player learns, and
    • the information a player leaks to all the others.

Page 27: Information Complexity: an Overview

Extending IC to Multiple Players

• Set disjointness:
  – Input: sets X_1, …, X_k ⊆ {1, …, n}, one per player
  – Output: whether X_1 ∩ … ∩ X_k = ∅

• Open problem: can we extend to gap set disjointness?
  – First step: a “purely info-theoretic” 2-party analysis
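A minimal sketch of the problem itself, with the naive protocol for comparison (function names are illustrative): each player broadcasts the n-bit characteristic vector of their set, for ~nk bits of communication in total, which is what the Ω(nk) lower bound shows cannot be substantially beaten.

```python
def disj(sets, n):
    # k-party set disjointness: output 1 iff the sets share no element.
    common = set(range(1, n + 1))
    for s in sets:
        common &= s
    return int(not common)

def naive_protocol(sets, n):
    # Each player broadcasts the n-bit characteristic vector of their set:
    # n bits per player, so n*k bits of communication in total.
    transcript = [[int(e in s) for e in range(1, n + 1)] for s in sets]
    answer = int(not any(all(v[j] for v in transcript) for j in range(n)))
    bits_sent = sum(len(v) for v in transcript)
    return answer, bits_sent
```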

Page 28: Information Complexity: an Overview

Extending IC to Multiple Players

• In [Braverman, Ellen, O., Pitassi, Vaikuntanathan] we show a direct sum for multi-party
  – Solving n instances = n × solving one instance

• Does direct sum hold “across players”?
  – Solving with k players = k × solving with 2 players?
  – Not always

• Does compression work for multi-party?

Page 29: Information Complexity: an Overview

Conclusion

• Information complexity extends classical information theory to the interactive setting

• The picture is much less well-understood than in the classical setting
• Powerful tool for lower bounds
• Fascinating open problems:
  – Compression
  – Information complexity for multi-player computation, quantum communication, …