Transcript
Page 1

Agenda: Information, Probability, Statistical Entropy

• Information and probability: simple combinatorics, random variables; probability distributions, joint probabilities, correlations; statistical entropy; Monte Carlo simulations

• Partition of probability

• Phase space evolution (Eta Theorem)

• Partition functions for different degrees of freedom

• Gibbs stability criteria, equilibrium

Reading Assignments (Weeks 3 & 4): LN III; Kondepudi Ch. 20

Additional Material: McQuarrie & Simon Ch. 3 & 4, Math Chapters (MC) B, C, D

W. Udo Schröder, 2021, Information & Probability

Page 2


Statistics and Probability

Particles in a multi-particle system often move in random directions and with quasi-random velocities, as a result of multiple scattering (see the example of molecular scattering off a crystal lattice). Therefore, one can use statistical-probabilistic arguments in discussing the average properties of such systems and, perhaps, the fluctuations about that average.

Assumption of molecular chaos: all microscopic states that are energetically available to a system are occupied by the system with equal probability (ergodic system) → same macro-state.

Page 3

Aspects of Information Theory

In the absence of information, probability replaces certainty. Information theory provides probability as an objective link between randomness and certainty.

Important theorist: Claude Shannon, the “Father of Information Theory”.


Illustration: two possible 2-dimensional micro-states for a system of 100 particles, distributed differently over the available phase space.

Random positions are the first guess if nothing is known about the particles.

“If a situation (event) is very likely to occur (high probability), the information provided by its occurrence is low.” And vice versa.

Particles clustered in a corner → deduce mutual attraction → significant information.

Claude Shannon 1948

Page 4


Simple Probability Concepts

Probability is understood as being relative to a set of many uncertain events, experiments, measurements, outcomes, … If instances of an event type $x_k$ occur in random selection, without hidden preference (bias), one can estimate:

Probability → combinatorics. Conduct an experiment/measurement → probability for event type $x_k$:

$N_k(x)$ = frequency of events $x_k$

$P(x_k) = \lim_{N(x)\to\infty} \dfrac{N_k(x)}{N(x)}$   (probability a posteriori)

Example: an (unbiased) experiment of rolling one die 1000 times →

Expect (a priori) the face “6” with $P(6) = 1/6 = 0.1667$ → $N(6) = 166$ (167)

Observe (a posteriori) the face “6”: $N(6) = 173$ → $P(6) = 0.173 \approx 1/6$
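A minimal numerical sketch of this frequency estimate (not part of the original slides; roll count and seed are arbitrary choices):

```python
import random

random.seed(1)                    # arbitrary seed, for reproducibility
N = 1000                          # number of rolls, as in the slide's example
rolls = [random.randint(1, 6) for _ in range(N)]

N6 = rolls.count(6)               # frequency of event "face 6"
P6 = N6 / N                       # a posteriori probability estimate

print(f"N(6) = {N6}, P(6) = {P6:.3f}, a priori 1/6 = {1/6:.3f}")
```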

Page 5


Probability Distributions

Conduct many (M = 5) measurements of 1000 rolls of one die → M = 5 different probabilities for the event type x = “6” → a posteriori probability.

$N_k(x)$ = frequency of events $x_k$

$P_i(k) = \lim_{N\to\infty} \dfrac{N_i(k)}{N}$,  $k = 1, \ldots, 6$;  $i = 1, \ldots, M$

Mean/average probability

{N(6)}= {175, 160, 155, 167, 182}

[Bar chart: $P_i(6)$ vs. measurement # $i$, values between 0.1 and 0.2]

$\bar P(6) := \sum_{i=1}^{M} w_i\, P_i(6) = \dfrac{1}{M} \sum_{i=1}^{M} P_i(6)$

Measurements of equal quality → equal weights $w_i = 1/M$

→ $\bar P(6) = \dfrac{1}{M} \sum_i P_i(6) = 0.168$

Page 6


Probability Distributions

Conduct many (M = 5) measurements of 1000 rolls of one die → M = 5 different probabilities for the event type x = “6” → a posteriori probability.

$N_k(x)$ = frequency of events $x_k$

$P_i(k) = \lim_{N\to\infty} \dfrac{N_i(k)}{N}$,  $k = 1, \ldots, 6$;  $i = 1, \ldots, M$

Variance/Standard Deviation

{N(6)}= {175, 160, 155, 167, 182}

[Bar chart: $P_i(6)$ vs. measurement # $i$]

$\sigma_P^2(6) := \dfrac{1}{M(M-1)} \sum_{i=1}^{M} \left(P_i(6) - \bar P(6)\right)^2$

$\sigma_P^2(6) = 2.4 \times 10^{-5}$ → $\sigma_P(6) \approx 0.005$

$\bar P(6) = 0.168 \pm 0.005$

A priori: $P(6) = 1/6 = 0.167$
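A small sketch reproducing these numbers from the listed counts (the variable names are mine, not from the slides):

```python
# Frequencies N_i(6) from the M = 5 measurements of 1000 rolls each
counts = [175, 160, 155, 167, 182]
M, N = len(counts), 1000

P = [n / N for n in counts]                      # a posteriori probabilities P_i(6)
P_mean = sum(P) / M                              # equal-weight mean, w_i = 1/M

# Variance of the mean: sum of squared deviations over M*(M-1)
var_mean = sum((p - P_mean) ** 2 for p in P) / (M * (M - 1))
sigma = var_mean ** 0.5

print(f"P(6) = {P_mean:.3f} +/- {sigma:.3f}")    # expected: 0.168 +/- 0.005
```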

Page 7


Probability Combinations

Additive, inclusive or exclusive, cumulative, multiplicative, conditional probabilities. Sum and product rules for disjoint and for independent probabilities.

The outcome of one trial (→ event E1) has no effect on the result of the next trial (→ event E2). The corresponding probabilities are independent of one another and add.

The a priori probability to get a face “6” in either of two trials, the first or the second throw of a die, equals the sum of both (OR, inclusive):

$P_{1 \lor 2}(6) = P_1(6) + P_2(6) = \dfrac{1}{6} + \dfrac{1}{6} = \dfrac{1}{3}$

The simultaneous (joint) a priori probability to get a face “6” in both trials, the first and the second throw of a die, equals the product of both (AND):

$P_{1 \land 2}(6, 6) = P_1(6) \cdot P_2(6) = \dfrac{1}{6} \cdot \dfrac{1}{6} = \dfrac{1}{36}$
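As an illustrative check of the product (AND) rule, one might simulate pairs of throws (a sketch, not from the original slides; trial count and seed are arbitrary):

```python
import random

random.seed(2)                           # arbitrary seed
trials = 100_000                         # number of two-throw experiments

both_six = sum(
    1 for _ in range(trials)
    if random.randint(1, 6) == 6 and random.randint(1, 6) == 6
)

print(f"P(6 and 6) ~ {both_six / trials:.4f}, exact 1/36 = {1/36:.4f}")
```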

Page 8

Properties of a priori Probabilities

• The probability for an event is $0 \le P \le 1$.

• The probability for any of the possible outcomes to occur is $\sum_i P_i = 1$.

• The probability of an impossible event is zero, $P = 0$.

• If two events ($E_1$ and $E_2$) are mutually exclusive (disjoint), the probability of the sum (“or”) event is the sum of the probabilities, $P_{1 \lor 2} = P_1 + P_2$.

• If two events ($E_1$ and $E_2$) are independent, the probability for the simultaneous (“and”) event is the product of the probabilities, $P_{1 \land 2} = P_1 \cdot P_2$.

• If two events are not mutually exclusive, $P_{1 \lor 2} = P_1 + P_2 - P_{1 \land 2}$.

• Additional considerations apply for conditional (marginal) probabilities, e.g. $P\{E_1 | E_2\}$: the probability of $E_1$, given $E_2$.
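A brief sketch checking the last two rules on a single die, with E1 = “even face” and E2 = “face ≥ 4” (my own example events, not from the slides):

```python
from fractions import Fraction

faces = range(1, 7)
E1 = {f for f in faces if f % 2 == 0}        # even face: {2, 4, 6}
E2 = {f for f in faces if f >= 4}            # face >= 4: {4, 5, 6}

def prob(event):
    """A priori probability of an event (set of faces) for a fair die."""
    return Fraction(len(event), 6)

# Not mutually exclusive: P(E1 or E2) = P1 + P2 - P(E1 and E2)
assert prob(E1 | E2) == prob(E1) + prob(E2) - prob(E1 & E2)

# Conditional probability: P(E1 | E2) = P(E1 and E2) / P(E2)
print("P(E1 given E2) =", prob(E1 & E2) / prob(E2))   # 2/3
```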

Page 9

Conditional Probabilities

Constraints on a set of events → conditional probability P{E | condition}.

Example: drawing from 3 balls (R, G, G) hidden in a box. What are the a priori probabilities for a sequence 1st draw, 2nd draw, …, e.g. G2, R1, …?

This depends on whether the 1st draw is returned or not! If returned, the draws are uncorrelated →


$P(G_2 \land R_1) = P(G_2)\, P(R_1) = \dfrac{2}{3} \cdot \dfrac{1}{3} = \dfrac{2}{9}$

If the ball is not returned, the draws are correlated (conditional) →

$P(G_2 \land R_1) = P(G_2 | R_1)\, P(R_1) = 1 \cdot \dfrac{1}{3} = \dfrac{1}{3}$   (given that Red was drawn first)

3rd draw: $P(G_3 \land G_2 \land R_1) = P(G_3 | G_2 R_1)\, P(G_2 \land R_1) = 1 \cdot \dfrac{1}{3} = \dfrac{1}{3}$
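A quick simulation of the two cases, with and without returning the first ball (a sketch; trial count and seed are arbitrary):

```python
import random

random.seed(3)
balls = ["R", "G", "G"]
trials = 100_000

def draw_two(returned: bool) -> bool:
    """Return True if the sequence is (R first, G second)."""
    box = balls.copy()
    first = box.pop(random.randrange(len(box)))
    if returned:
        box = balls.copy()                       # put the first ball back
    second = box[random.randrange(len(box))]
    return first == "R" and second == "G"

for returned, exact in [(True, 2 / 9), (False, 1 / 3)]:
    p = sum(draw_two(returned) for _ in range(trials)) / trials
    print(f"returned={returned}: P(R1 and G2) ~ {p:.3f} (exact {exact:.3f})")
```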

Page 10

Statistical Event Domains

Possible relations between hypothetical domains of probable events are illustrated, with the probabilities


$P\{E_1\}$, $P\{E_2\}$, and $P\{E_1 \land E_2\}$.

Independent events $E_1$ and $E_2$, or $E_2 = T$ (the total domain) → $E_2$ has no influence on the probability for $E_1$ → conditional probability $P(E_1 | E_2) = P(E_1)$.

If two events are mutually exclusive: $P\{E_1 \land E_2\} = 0$ → $P\{E_1 | E_2\} = 0$.

If $E_1$, $E_2$ are not disjoint: $P\{E_1 \land E_2\} = P(E_1 | E_2)\, P(E_2) = P(E_2 | E_1)\, P(E_1)$.

Bayes relation: $P(E_1 | E_2) = \dfrac{P(E_2 | E_1)\, P(E_1)}{P(E_2)}$

Prior probability $P(E_1)$: suspicion, guess. $P(E_2)$: fractional probability for $E_2$ relative to the total domain $T$.

[Venn diagrams: overlapping, nested, and disjoint event domains $E_1$, $E_2$]
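A tiny worked example of the conditional-probability relation above; the input numbers are illustrative assumptions, not from the original slide:

```python
from fractions import Fraction

# Illustrative (made-up) inputs: a prior guess and a likelihood
P_E1 = Fraction(1, 5)            # prior P(E1), the "suspicion" that E1 holds
P_E2_given_E1 = Fraction(1, 2)   # P(E2 | E1)
P_E2 = Fraction(1, 4)            # fractional (total) probability of E2

# Bayes relation: P(E1 | E2) = P(E2 | E1) P(E1) / P(E2)
P_E1_given_E2 = P_E2_given_E1 * P_E1 / P_E2

print("P(E1 | E2) =", P_E1_given_E2)   # 2/5
```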

Page 11

Discrete Multivariate Probability Distributions

Experiment: draw dice at random out of a bag of colored dice (i = 1–5). Roll each die a large number of times. Record the frequencies of the faces (j = 1, …, 6). Normalize to the total number of rolls → 5×6-dimensional total probability domain {p_ij}. Independent constraints:


After Dill & Bromberg

Color marginal: $u_i = \sum_{j=1}^{6} p_{ij}$ with $\sum_{i=1}^{5} u_i = 1$;  face marginal: $\sum_{i=1}^{5} p_{ij}$ with $\sum_{j=1}^{6} \sum_{i=1}^{5} p_{ij} = 1$ → Check!

[Table: joint probabilities $p_{ij}$ by color $i$ (rows) and face $j$ (columns), e.g. entry $p_{24}$]
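A sketch of the bookkeeping this slide describes: simulate rolls of five (here unbiased) dice, normalize to a joint table {p_ij}, and check the marginal constraints. Counts and seed are arbitrary choices.

```python
import random

random.seed(4)
n_colors, n_faces, rolls_per_die = 5, 6, 10_000

# counts[i][j]: how often the color-i die showed face j+1
counts = [[0] * n_faces for _ in range(n_colors)]
for i in range(n_colors):
    for _ in range(rolls_per_die):
        counts[i][random.randint(1, 6) - 1] += 1

total = n_colors * rolls_per_die
p = [[c / total for c in row] for row in counts]     # joint table {p_ij}

color_marginals = [sum(row) for row in p]            # u_i = sum_j p_ij
face_marginals = [sum(col) for col in zip(*p)]       # sum_i p_ij

# Check: each set of marginals sums to 1
print(round(sum(color_marginals), 6), round(sum(face_marginals), 6))
```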

Page 12

Discrete Multivariate Probabilities

Compare a priori with a posteriori probabilities to find bias. Example: the blue die showing face “4”, from the overlap (simultaneous) probability domains.


Data after Dill & Bromberg

A priori: $P(\text{blue, any face}) = 1/5$;  $P(\text{any color, face 4}) = 1/6$ → $P(\text{blue})\, P(4) = 1/30 \approx 0.033$

A posteriori: $P(\text{blue})\, P(4) = 0.3 \times 0.162 \approx 0.049$

But $0.049 \approx 1.5 \times 0.033$ → the a posteriori result reveals a bias.
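A minimal sketch of this comparison; the observed marginals are taken from the slide's example, the variable names are mine:

```python
# Compare a priori expectation with the a posteriori (observed) joint probability
p_blue_prior, p_face4_prior = 1 / 5, 1 / 6
prior_joint = p_blue_prior * p_face4_prior            # 1/30 ~ 0.033

p_blue_obs, p_face4_obs = 0.3, 0.162                  # a posteriori marginals
posterior_joint = p_blue_obs * p_face4_obs            # ~ 0.049

ratio = posterior_joint / prior_joint
print(f"prior {prior_joint:.3f}, posterior {posterior_joint:.3f}, ratio {ratio:.1f}")
# A ratio clearly different from 1 signals a bias (here ~1.5).
```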

Page 13

Continuous Probability Distributions

[Figure: joint probability surface $P(x,y)$ over the $x$–$y$ plane]

Degrees of freedom $x, y, \ldots$ → joint probability $P(x, y, \ldots)$

Normalized probability: $\int dx \int dy\, P(x, y, \ldots) = 1$

Partial or conditional probability: $P(x) = \int dy\, P(x, y, \ldots)$, normalized such that $\int dx\, P(x) = 1$; at a fixed value $x = a$: $P(a) = \int dx\, \delta(x - a)\, P(x)$

Averages: $\langle x \rangle_P = \int x\, P(x, y, \ldots)\, dx\, dy$  and  $\sigma_x^2 = \left\langle \left(x - \langle x \rangle\right)^2 \right\rangle_P$

Define the normalized Gaussian probability:

$P(x) = \dfrac{1}{\sqrt{2\pi\sigma_x^2}}\, \exp\!\left[-\dfrac{\left(x - \langle x \rangle\right)^2}{2\sigma_x^2}\right]$
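A short numerical check of the normalization, mean, and variance of the Gaussian above; the grid and the parameter values are arbitrary choices:

```python
import numpy as np

mu, sigma = 1.0, 0.5                     # arbitrary <x> and sigma_x
x = np.linspace(mu - 10 * sigma, mu + 10 * sigma, 20001)

P = np.exp(-((x - mu) ** 2) / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)

norm = np.trapz(P, x)                    # should be ~1
mean = np.trapz(x * P, x)                # should be ~mu
var = np.trapz((x - mean) ** 2 * P, x)   # should be ~sigma^2

print(f"norm={norm:.4f}, <x>={mean:.4f}, var={var:.4f}")
```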


Page 14

Continuous Probability Distributions

[Figure: joint probability surface $P(x,y)$ over the $x$–$y$ plane]

Degrees of freedom $x, y, \ldots$ → joint probability $P(x, y, \ldots)$

Normalized probability: $\int dx \int dy\, P(x, y, \ldots) = 1$

Partial or conditional probability: $P(y) = \int dx\, P(x, y, \ldots)$, normalized such that $\int dy\, P(y) = 1$; the probability for $a \le y \le b$ is $\int_a^b dy\, P(y)$

Averages: $\langle y \rangle_P = \int y\, P(x, y, \ldots)\, dx\, dy$  and  $\sigma_y^2 = \left\langle \left(y - \langle y \rangle\right)^2 \right\rangle_P$

Define the normalized Gaussian probability:

$P(y) = \dfrac{1}{\sqrt{2\pi\sigma_y^2}}\, \exp\!\left[-\dfrac{\left(y - \langle y \rangle\right)^2}{2\sigma_y^2}\right]$

Page 15


Correlations in Joint Distributions

For independent events (degrees of freedom), probabilities multiply. Correlations appear within joint distributions. Example: y increases with increasing x.

$P_{\mathrm{unc}}(x, y) = P(x)\, P(y) = \dfrac{1}{\sqrt{4\pi^2 \sigma_x^2 \sigma_y^2}}\, \exp\!\left[-\left(\dfrac{(x - \langle x \rangle)^2}{2\sigma_x^2} + \dfrac{(y - \langle y \rangle)^2}{2\sigma_y^2}\right)\right]$

$P_{\mathrm{corr}}(x, y) = \dfrac{1}{2\pi\sqrt{\sigma_x^2 \sigma_y^2 - \sigma_{xy}^2}}\, \exp\!\left[-\dfrac{\sigma_y^2 (x - \langle x \rangle)^2 + \sigma_x^2 (y - \langle y \rangle)^2 - 2\sigma_{xy} (x - \langle x \rangle)(y - \langle y \rangle)}{2\left(\sigma_x^2 \sigma_y^2 - \sigma_{xy}^2\right)}\right]$

( ) ( )

( )

22 2 2 2 21

cot 42

; 1 1

x y x y xy

xy

xy xy x y xycorrelation coeffi rcie rnt

= − + − +

= −

Covariance: $\sigma_{xy} := \int (x - \langle x \rangle)(y - \langle y \rangle)\, P(x, y)\, dx\, dy$

[Contour plots: uncorrelated $P_{\mathrm{unc}}(x,y)$ vs. correlated $P_{\mathrm{corr}}(x,y)$ in the $x$–$y$ plane]
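A short sketch generating correlated Gaussian samples and recovering the covariance and correlation coefficient; all parameter values are arbitrary assumptions:

```python
import numpy as np

rng = np.random.default_rng(5)
mean = [0.0, 0.0]
sigma_x, sigma_y, sigma_xy = 1.0, 0.5, 0.4          # chosen so the covariance matrix is valid
cov = [[sigma_x**2, sigma_xy], [sigma_xy, sigma_y**2]]

x, y = rng.multivariate_normal(mean, cov, size=200_000).T

cov_est = np.mean((x - x.mean()) * (y - y.mean()))  # covariance sigma_xy
r_est = cov_est / (x.std() * y.std())               # correlation coefficient r_xy

print(f"sigma_xy ~ {cov_est:.3f}, r_xy ~ {r_est:.3f}")  # expect ~0.4 and ~0.8
```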

Page 16

Probability Generating Functions

Characteristic function of $P(x)$: the Laplace transform of the same.

Example: 1-dimensional system with probability density $P(x)$:

$\Gamma(s) := \int dx\, e^{-s x}\, P(x)$

Derivatives of $\Gamma(s)$:

$\dfrac{d^n}{ds^n}\, \Gamma(s) = \int dx\, (-x)^n\, e^{-s x}\, P(x)$

Setting $s = 0$:

$\left.\dfrac{d^n \Gamma(s)}{ds^n}\right|_{s=0} = (-1)^n\, \langle x^n \rangle$

$-\left.\dfrac{d \Gamma(s)}{ds}\right|_{s=0} = \int dx\, x\, P(x) = \langle x \rangle$, and similarly for the higher moments.
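A symbolic check of the moment relation above for an assumed exponential density P(x) = e^{-x} on x ≥ 0 (my example distribution, not from the slide):

```python
import sympy as sp

x, s = sp.symbols("x s", positive=True)
P = sp.exp(-x)                                   # assumed example: P(x) = exp(-x), x >= 0

Gamma = sp.integrate(sp.exp(-s * x) * P, (x, 0, sp.oo))   # Laplace transform: 1/(1+s)

for n in (1, 2):
    moment = (-1) ** n * sp.diff(Gamma, s, n).subs(s, 0)  # (-1)^n d^n Gamma/ds^n at s=0
    print(f"<x^{n}> =", sp.simplify(moment))     # 1 and 2 for the exponential
```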

Page 17

Properties of a priori Probabilities

• The probability for an event is $0 \le P \le 1$.

• The probability for any of the possible outcomes to occur is $\sum_i P_i = 1$.

• The probability of an impossible event is zero, $P = 0$.

• If two events $E_1$ and $E_2$ are mutually exclusive (disjoint), the probability of the sum (“or”) event is the sum of the probabilities, $P_{1 \lor 2} = P_1 + P_2$.

• If two events $E_1$ and $E_2$ are independent, the probability for the simultaneous (“and”) event is the product of the probabilities, $P_{1 \land 2} = P_1 \cdot P_2$.

• If two events are not mutually exclusive, $P_{1 \lor 2} = P_1 + P_2 - P_{1 \land 2}$.

• Additional considerations apply for conditional (marginal) probabilities, e.g. $P\{E_1 | E_2\}$: the probability of $E_1$, given $E_2$.

← Recommended