
A Hundred Years of Entropy


From primitive machines and the discovery of fire came problems of mechanics and heat. The steam engine brought them forcibly together. Out of the study of heat-to-work transformations that resulted came our entropy concept, and extension of the reasoning has brought useful applications in far-removed subjects like probability, number theory, information theory and language.

by Mahadev Dutta

PERHAPS THE PARADOX of entropy is that it has so long a history and yet so many remaining puzzles. We might say that the roots of its history go back even to primitive man. In his struggle for existence he used crude appliances and discovered fire. From closer intimacy with more and more developed machines sprang mechanics, and from fire came the theory of heat; the two were quite separate sciences.

Only the invention and impact of the steam engine led to final identification of heat and mechanical energy and thus to the first law of thermodynamics. Experiences with limitations in the heat-to-work conversion were formulated as the second law of thermodynamics; its convenient mathematical form is the entropy law. Successful applications in various sciences and in technology quickly proved its utility as a basic law of nature.

But the inequality in the formulation of the law remains a scientific curiosity. Many attempts have been made and are still being made to analyze the law from many viewpoints: axiomatic, mechanical, statistical-mechanical and purely statistical. Among the results of such studies are Bose-Einstein and Fermi-Dirac statistics, which are of fundamental significance throughout physics.

In communication theory C. E. Shannon, in 1948, used entropy in a completely new context. He defined it for any random process. Out of this definition grew information theory. Shannon's general definition also opened new applications in other fields of human knowledge. In probability theory, it yields a method of statistical estimation. In logic it is used in the discussion of probabilistic inferences. In linguistics even the entropy of a language is defined. In mathematics the entropy concept has been introduced in abstract analysis, approximation theory and number theory.

Successful applications of the notion of entropy in different fields have established the usefulness and generality of the concept, which was originally introduced as the basic concept of thermodynamics and had its applications mostly in some branches of physics, physical chemistry and mechanical engineering.

Background

Many great scientists have contributed to this history. Through the work of Galileo Galilei (1564-1642) and Isaac Newton (1642-1727) mechanics took shape. In this form mechanics was widely accepted after the elegant formulation by Joseph Louis Lagrange (1736-1813) and the extensive calculations of its various applications in celestial mechanics by Pierre-Simon Laplace (1749-1827). Gottfried Wilhelm Leibnitz (1646-1716) considered different forms of mechanical energy and its conservation.

Meanwhile experiences with fire and heat grew in bulk and were known as the "theory of heat," but until the middle of the last century the theory was mainly speculative.

When James Watt made his steam engine in 1765, people began to think of the relations between thermal energy and mechanical energy. Just a little earlier, in 1747, some physiologists, of whom Albert Haller was a pioneer, advocated the friction of blood flow as the source of body temperature.

Then, after Watt, experiments of Benjamin Thompson (1798), who became Count Rumford, James Prescott Joule (1840) and their contemporaries led to recognition that heat is a form of energy and to the principle of conservation of energy (Robert Julius Mayer's principle or the first law of thermodynamics).

Mechanics joins heat

With the unification of these two separate streams of human experience, the theory of heat had the proper scientific form, and a new branch of science had its origin; it is mainly concerned with transformations of heat into mechanical work and the reverse and is known as "thermodynamics." Soon transformations of heat and work into and out of other forms of energy (chemical, electric, magnetic, electromagnetic) were also studied.


Mahadev Dutta is professor of mathematics at the Indian Institute of Technology. He received his doctorate at Calcutta University in 1950 and has written several books and many papers on number theory, topology, statistical physics and electrochemistry.



Before Mayer's clear formulation of the first law of thermodynamics (about 1842) and its acceptance, men had experienced the limitations in transformations of heat into work. Nicolas Leonard Sadi Carnot (1828) noticed these clearly in his investigations on cycles representing the working of heat engines.1 These limitations necessitated the formulation of the second law of thermodynamics. As Max Planck tells us in a book of his,2 it was formulated by Rudolf Clausius (1850), William Thomson, Lord Kelvin (1851), and Wilhelm Ostwald independently in different languages. It is easily seen that these different formulations represent essentially the same fact. According to Clausius,3 "A transformation of which the final result is to transform into work heat extracted from a source which is at the same temperature throughout is impossible." Ostwald's principle is, "Perpetual motion of the second kind (which can transform heat into work without any limitation) is impossible." Another version of the law, very convenient for mathematical development, is expressed in terms of entropy, which was first introduced by Clausius4 in 1865 (in 1855, according to Guggenheim5) in his discussions of heat cycles.

The entropy change from state A to state B is generally defined6 as
$$S_B - S_A = \int_A^B \frac{d'Q}{T}$$
where d'Q is the heat absorbed by the system at the temperature T and the path of integration is taken as reversible, that is, through successive states of equilibrium as ideally visualized in infinitely slow processes of transformation. On account of the first law, in order that the definition be independent of path, the following must be an exact differential:
$$\frac{d'Q}{T} = \frac{dU + p\,dV - \sum_i X_i\,dx_i}{T}$$
(p and V are pressure and volume; the $X_i$'s and $x_i$'s are other quantities similar to p and V). Generally, in textbooks, the absolute temperature is introduced as the integrating factor of the Pfaffian for d'Q representing the conservation of energy, and the entropy as the integral.
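As a simple illustration of the definition, take n moles of an ideal gas expanding reversibly and isothermally from volume V_A to V_B; then dU = 0 and d'Q = p dV = nRT dV/V, so
$$S_B - S_A = \int_{V_A}^{V_B} \frac{nR}{V}\,dV = nR\ln\frac{V_B}{V_A}.$$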

FEYNMAN'S PERPETUAL-MOTION MACHINE. Thermal motion of air molecules would make the windmill turn in both directions, were it not for the ratchet-and-pawl. The windmill should thus turn slowly in one direction, perhaps lifting a bug. Air molecules responsible for the rotation would slow down and the air surrounding the windmill would cool. Feynman points out, though, that the pawl is also subject to random temperature motion, which may lift it sufficiently to permit the ratchet to turn backward every so often. Furthermore, the pawl acts not just as a stopping device but also as a mechanism able to drive the axle in the "forbidden" direction. Feynman shows that work is performed only if the temperature of the air surrounding the windmill differs from that of the pawl and its environment, and that the second law of thermodynamics is quantitatively obeyed. (From J. Waser, Basic Chemical Thermodynamics, W. A. Benjamin, 1966, and R. P. Feynman, R. B. Leighton, M. Sands, The Feynman Lectures, Addison-Wesley, 1964.)

The definition also suggests that in cases of irreversible transformations, the change of entropy is to be calculated along a suitable reversible path from A to B. At this point, there are some conceptual difficulties which, even at the present day, require some careful considerations. The entropy principle (the second law) is expressed, "The entropy of the universe (or an isolated system) can not decrease":
$$\Delta S \ge 0$$
Thus the second law shows the direction of thermodynamic changes, and in doing so it takes on a unique importance in natural philosophy. The notion of unidirectional flow of time is the basis of human experience of the physical world and of science. It is compatible with biological cycles: birth and germination, growth and death and decay. But most of the fundamental laws of physics remain unchanged when time t is changed to −t; for these laws the notion of unidirectional time flow is not essential. Only the unidirectional change of entropy of the universal system or a completely isolated one provides a plausible ground for the concept of unidirectional flow of time.

According to Henri Poincare7 (1854-1912), the entropy principle, as far as its form is concerned, has a unique distinction from all other laws of physics. Usually laws of nature are expressed in terms of equality or covariance. But in the entropy principle (and also in other versions of the second law) it is done by inequality. This peculiarity has been stirring the curiosity of a large number of scientists, and investigations have been made to put forward some sort of explanation along different lines and from different standpoints.

In classical thermodynamics

When the laws of thermodynamics are accepted as hypotheses and the consequences of these laws are studied, the subject is referred to as "classical thermodynamics." From this study one gets the laws of phase transformations, those of chemical reactions, an elegant theory of dilute solutions and also many other results useful in engineering. In a simple and straightforward way the results of thermodynamics are extended to include all forms of energy and problems of surface tension, elastic and fluid media, electric and magnetic phenomena in which heat is evolved or absorbed.

In this connection, it is to be noted that entropy, as usually defined, contains an arbitrary constant.



For application in mechanical engineering and similar other subjects, entropy is evaluated with reference to a standard position. But for applications in some subtle problems like evaluation of chemical equilibrium or ionization, knowledge of the entropy constant is important. It is evaluated with the help of the third law,6 which, in Planck's terms, can be expressed, "As the absolute temperature tends to zero, the entropy of a pure substance tends to zero."

In axiomatic thermodynamics

Quite close to the classical development are two elegant axiomatic formulations of entropy and the entropy principle, one due to Constantin Caratheodory8 and the other due to R. Giles.9 In the first the concept of heat is introduced as a secondary concept. The principle of conservation of energy is accepted but formulated, for adiabatic changes, without using the concept of heat. Adiabatic enclosures or adiabatic changes of coordinates are taken as undefined concepts or facts of experience. Then a postulate, considered very plausible and named "Caratheodory's Principle," is expressed: "In every neighborhood of a point P in the space of states, there exists a point inaccessible from P adiabatically." From this principle and from a general discussion of the Pfaffian, entropy and absolute temperature are introduced and the notion of heat is formed. After Caratheodory, discussions along this line have been made by Max Born10 (1921), Alfred Lande11 (1926) and Tatiana Ehrenfest-Afanassjewa12 (1925). After nearly 30 years, interest in this approach has been revived, and several workers are now trying to explore the different aspects of this theory.

This theory is entirely a "local" oneas opposed to a "global" one. Whenin the investigations of physical or geo-metrical problems, differential calculusand differential equations are used,considerations of local properties or, asa German would say, "properties atsmall," play the most important role.Such a theory is referred to as a "local"theory. On the other hand, wheretechniques of integral calculus, mea-sure theory or set functions are used,

considerations of global problems or"properties at large" become promi-nent. Such theory is "global" theory.

In 1964 Giles9 started with some primitive, undefined concepts, namely an operation + and a relation → among states of a thermodynamic system, and internal states with some elementary axioms, and built up a formal, mathematical theory. In this theory the entropy function is constructed as a real-valued functional of states satisfying some sort of ordering and the condition that it is zero for a suitably introduced mechanical state. Giles's theory is global.

A mechanical approach

The first attempt to construct a function behaving like entropy from a mainly mechanistic standpoint is due to Ludwig Boltzmann13 and Clausius.14 For a single mechanical system executing periodic motion, a function behaving like entropy is constructed by using mechanical concepts and equations. One uses the principle of adiabatic invariance, formulated entirely in the terminology of mechanics but not strictly in the framework of classical mechanics. Later J. J. Thomson15 (1884) also put forward a mechanical explanation of entropy that is closely similar to that of Boltzmann and Clausius. Writing in 1908, Poincare16 told us that Hermann L. F. von Helmholtz (1821-1894) also made similar efforts. Paul Ehrenfest,17 while proposing its applicability in some quantum problems, reported briefly some aspects of the theory of Boltzmann and Clausius in an appendix. I have also given a brief report of this approach in a book to be published soon.18 But most scientists, even Boltzmann himself, were not satisfied with this theory.

In statistical mechanics

Boltzmann19 in 1877 or a bit earlier was the first to put forward an interpretation of entropy using some statistical arguments along with mechanical reasoning for a mechanical model. Boltzmann's H theorem demonstrates the existence of a function, $-\sum_n f_n \log f_n$, behaving like an entropy. He also proposed a simple relation between entropy and probability referred to as the "Boltzmann principle." The present-day form of the principle connecting entropy with probability (strictly, thermodynamic probability) and its plausibility are mainly due to Planck,20 Ehrenfest,21 and others. J. Willard Gibbs advocated the probabilistic interpretation as early as 1876. Gibbs's famous book proposing an elegant formulation of statistical mechanics was published in 1902.22 In this he established the great similarity between the average value of the index of a probability distribution and the entropy. In these theories, entropy is conceived mainly as an ensemble property, a statistical concept. These statistical lines of investigation had the widest support amongst physicists, including pioneers like Albert Einstein, Paul A. M. Dirac, and others.

The statistical arguments used in the above theories lead to the idea of fluctuations of physical quantities. Results about fluctuations that one gets from the arguments have been verified by experiments designed in other branches of physics, even quantitatively. Satyendra Nath Bose's famous work,23 in which the notion of weight of a state rather than that of the distribution of particles was introduced into the usual method based on thermodynamic probability, not only yielded Bose statistics for photons but also stimulated contributions of very significant results like Bose-Einstein statistics24 and Fermi-Dirac statistics.25 It also revived an earnest search for the real significance of basic concepts like entropy. Einstein,26 Planck,27 Erwin Schrodinger28 and others made critical scrutiny of the entire standpoint of the subject. At present Bose and Fermi statistics are considered to reveal very fundamental properties of elementary particles. These results have been accepted as more significant and fundamental than statistical mechanics itself.

During the years from 1924 to 1926, Charles G. Darwin and R. H. Fowler29 proposed a slightly different approach to thermodynamic problems. According to their proposal, all thermodynamic quantities are to be taken as average quantities. Full discussion of the method and its applications can be seen in the well known treatise of Fowler.30 The elegant book of A. I. Khinchin31 shows that Fowler's method is related to the central-limit theorem of the theory of probability, and A. H. Koppe32 deduced the results of Darwin and Fowler by application of the Fourier transform and its approximate evaluations.



ORDERLY ARRANGEMENT of two groups of balls is destroyed by stirring in a clockwise direction. The entropy of the system is increased. Stirring in a counterclockwise direction does not increase the order; in fact, the entropy of the system increases more, in accordance with the entropy principle of the second law of thermodynamics.


In communication theory

In the mathematical theory of communication Shannon33 proposed a measure of information, choice and uncertainty associated with a random event as
$$H = -\sum_i p_i \log p_i$$
where $p_i$ is the probability of the ith outcome of the random event. Using the terminology of statistical mechanics, he called his H an "entropy." It is also simply related to the capacity of the channel. Entropy plays a very important role in information theory. It provides a measure of information and thus has access to different branches of human thought.34
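A minimal numerical sketch of this definition (the function name and the coin examples are purely illustrative, not Shannon's):

```python
import math

def shannon_entropy(probs, base=2.0):
    """H = -sum_i p_i log p_i; zero-probability outcomes contribute nothing."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))   # a fair coin: 1 bit per outcome
print(shannon_entropy([0.9, 0.1]))   # a biased coin: about 0.47 bit
```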

Probability, statistics, etc.

For the introduction of entropy into communication theory, Shannon represented a discrete information source as a random process and used the terminology of this process freely.35 So Shannon's definition of entropy has a direct and close relation with probability theory. Entropy is used36 as an important concept in the logical discussions of probable inferences. It is also seen that different problems of the theory of statistics can be solved by introducing the notion of entropy or the information, which is defined as the negative of entropy.37

Shannon even calculated the entropy of the English language.34 Entropies of some other languages have also been calculated. A. N. Kolmogorov has associated the concept of entropy with abstract sets in abstract analysis. A comprehensive discussion of entropy in different branches of abstract analysis is in a recent lecture of G. G. Lorentz.38

Essentially statistical approach

Since 1951, I have been developing an essentially statistical approach to thermodynamics.39 In this development discussions of thermodynamics have been based on the theory of estimation in statistics instead of the usual mechanical notions. I look upon measured values of some fundamental entities as sample values. The exponential distribution is taken as the distribution of thermodynamic states, from arguments similar to the usual Bayesian arguments39 or as consequences of the additivity of fundamental entities.40

Then, I estimate the parameters by a principle practically the same as Ronald A. Fisher's principle of maximum likelihood. With this distribution, I calculate mean values and write the equation for conservation of the fundamental entities. The integral of this equation yields the usual thermodynamic entropy and different integrating factors related to important thermodynamic quantities. Here the entropy is obtained as a constant k times the logarithm of the Fisher likelihood function with the parameters replaced by the maximum-likelihood estimates. For detailed discussion of this method you can consult my book.18
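A toy numerical sketch along these lines, under assumptions of my own (a single "fundamental entity" whose measured values follow an exponential distribution p(E) = beta exp(-beta E), with beta fitted by maximum likelihood); it is not the full formalism of reference 18:

```python
import math
import random

random.seed(1)
k = 1.0                # a Boltzmann-like constant; the choice of units is arbitrary
beta_true = 2.0

# "Measured values" of the fundamental entity, modeled as exponential samples.
E = [random.expovariate(beta_true) for _ in range(10_000)]

# Maximum-likelihood estimate of the parameter beta.
beta_hat = len(E) / sum(E)

# Logarithm of the likelihood function evaluated at the estimate.
log_L = sum(math.log(beta_hat) - beta_hat * e for e in E)

print(beta_hat)    # close to beta_true
print(k * log_L)   # "k times the log of the likelihood at the MLE"
```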

Since 1957, E. T. Jaynes42 has been developing a purely probabilistic approach based on the method of maximum-entropy estimation of information theory. He has taken entropy, as defined by Shannon, to be the basic concept to start with. All other thermodynamic quantities, in Schrodinger's way, he has identified by analogy with parameters and expressions that appear in his discussion.
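A small numerical sketch of the maximum-entropy prescription, using the standard constrained-mean die as an example (the function name and example values are illustrative, not taken from Jaynes's papers):

```python
import math

def maxent_given_mean(xs, target_mean):
    """Maximum-entropy distribution on the values xs with a prescribed mean:
    p_i proportional to exp(-lam * x_i); lam is found by bisection."""
    def mean_for(lam):
        w = [math.exp(-lam * x) for x in xs]
        z = sum(w)
        return sum(x * wi for x, wi in zip(xs, w)) / z
    lo, hi = -50.0, 50.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        # mean_for is a decreasing function of lam
        if mean_for(mid) > target_mean:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(-lam * x) for x in xs]
    z = sum(w)
    return [wi / z for wi in w]

# Faces of a die constrained to average 4.5 instead of 3.5:
p = maxent_given_mean([1, 2, 3, 4, 5, 6], 4.5)
print([round(pi, 3) for pi in p])
```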

B. Mandelbrot,43 since 1959, has been studying the parallel development of physical statistics and mathematical statistics. According to him almost all concepts and propositions of physical statistics are essentially similar to the corresponding concepts of the other. They differ only in notation and terminology.

I have shown44 that the method of maximum-entropy estimation, that of the exponential distribution in which parameters are estimated by the principle of maximum likelihood, and that based on Gauss's principle of the arithmetic mean45 are equivalent. This result clearly shows how my method and Jaynes's are related and also throws light on the interrelation between the method of Darwin and Fowler and other methods of statistical mechanics. Further, I have also shown that for an ergodic population, that is, for a population in which Bernoulli's theorem is valid, 1/N times the logarithm of the likelihood function converges in probability to the entropy function of information theory. If this definition of entropy is accepted, the notion of entropy can be introduced for every statistical population. Recently46 from this discussion I have established the equivalence of the principal methods of statistical mechanics.
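A quick numerical check of this kind of convergence, with an arbitrary three-symbol source chosen for illustration; here minus 1/N times the log-likelihood of N independent draws approaches the Shannon entropy H:

```python
import math
import random

random.seed(0)
p = {"a": 0.5, "b": 0.25, "c": 0.25}
H = -sum(q * math.log(q) for q in p.values())   # entropy in nats

N = 200_000
sample = random.choices(list(p), weights=list(p.values()), k=N)
avg_log_likelihood = sum(math.log(p[s]) for s in sample) / N

print(H)                     # 1.0397...
print(-avg_log_likelihood)   # close to H for large N
```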

In number theory

I have already mentioned many applications of entropy in different branches of human knowledge. But these do not appear to be complete. Moreover, the possibilities of the application of entropy to newer fields are not yet completely exhausted.



Let me mention a simple application to a new type of problem. In the theory of numbers, in the problem of decimal representation of numbers, the frequency of occurrence of a particular digit has been studied with great interest.47 (We might better say the "probability," in the sense used by Venn, Richard von Mises, and Harald Cramer.) A number is said to be "simply normal" if the frequency of every digit is equal to 1/r, r being the scale of notation. A number is said to be "normal" if it is simply normal when the scale of notation is any power of r. From our above discussion, it appears that the notion of entropy can be introduced easily, and by this introduction not only can all definitions be directly expressed in terms of entropy, but many propositions can also be deduced easily. In a recent work,48 I have done this, and some results have been proved very easily. Also one can introduce some new notions (degree of normalcy, etc.). With computers this method may provide another means of verifying the principle of entropy maximization.
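A small sketch of this bookkeeping, using the decimal digits of the square root of 2 purely as an example (whether that number is normal is an open question): the entropy of the empirical digit distribution reaches log r exactly when every digit occurs with frequency 1/r.

```python
import math
from collections import Counter
from decimal import Decimal, getcontext

# Digit frequencies in the decimal expansion of sqrt(2), and the entropy of
# that empirical distribution; it equals log(10) only if every digit occurs
# with frequency 1/10.
getcontext().prec = 5000
digits = str(Decimal(2).sqrt()).replace("1.", "", 1)   # fractional digits only

counts = Counter(digits)
n = len(digits)
H = -sum((c / n) * math.log(c / n) for c in counts.values())

print(H, math.log(10))
```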

And in general

Shannon's definition of entropy has its origin in Boltzmann's H theorem.19 Now one can see easily that this theorem can be demonstrated for a class of H functions in which, in the definition of H, $p_i$ is replaced by $p_i^{\,a}$, a being real and positive. It appears to be interesting to study the consequences of taking $H_a$ for H in the discussions of thermodynamics, information theory and other branches of knowledge where entropy has been used. This type of function has already been considered in the study of the axiomatic basis of information theory.
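Read literally, the substitution gives (in notation of my own, not displayed in the original)
$$H_a = -\sum_i p_i^{\,a} \log p_i^{\,a}, \qquad a > 0,$$
which reduces to Shannon's H at a = 1.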

The general notion of entropy, as introduced by Shannon, has proved its significance in a vast field of human knowledge. But it appears that the full significance of this concept is not yet fully explored.

The author expresses his hearty thanks to National Professor Satyendra Nath Bose of India for suggesting that I write a comprehensive article on a hundred years of entropy, for his constant interest and encouragement and for the support from his research scheme. I also thank Bowdoin College for inviting me to deliver the lecture on which this article is based, as the third lecture in the Tallman Foundation Lecture Series of 1966-67. Thanks are also due to Dan E. Christie and Myron A. Jeppesen of Bowdoin for their constructive criticism during the final preparation of the manuscript.

References

1. M. W. Zemansky, Heat and Thermodynamics, McGraw-Hill, New York (1943).

2. M. Planck, Treatise on Thermodynamics (3rd edition), Longmans, Green, London (1927).
3. E. Fermi, Thermodynamics (reprint of 1937 edition), Dover, New York (1956).
4. H. Zeise, Thermodynamik, Hirzel, Leipzig (1944).
5. E. A. Guggenheim, Thermodynamics, North-Holland, Amsterdam (1950).
6. P. S. Epstein, Textbook of Thermodynamics, Wiley, New York (1937).
7. H. Poincare, Science and Hypothesis (reprint of 1905 edition), Dover, New York (1953).
8. C. Caratheodory, Math. Ann. 67, 355 (1909).
9. R. Giles, Mathematical Foundations of Thermodynamics, Pergamon Press, Oxford (1964).
10. M. Born, Phys. Z. 22, 218, 249, 282 (1921).
11. A. Lande, p. 281 in Handbuch der Physik, Vol. 9, chap. 4, Springer-Verlag, Berlin (1926).
12. T. Ehrenfest-Afanassjewa, Z. Physik 33, 933 (1925).
13. L. Boltzmann, Wien. Ber. 53, 195 (1866); Pogg. Ann. 143, 211 (1871).
14. R. Clausius, Pogg. Ann. 141, 124 (1870).
15. J. J. Thomson, Applications of Dynamics to Physics and Chemistry, Macmillan, London (1884); R. B. Lindsay, H. Margenau, Foundations of Physics (reprint of 1936 edition), Dover, New York (1957).
16. H. Poincare, Thermodynamique, Gauthier-Villars, Paris (1908).

17. P. Ehrenfest, Phys. Z. 15, 657 (1914); Ann. Physik 51, 327 (1916).
18. M. Dutta, Statistical Physics (Foundation), World Press, Calcutta (in press).

19. L. Boltzmann, Vorlesungen über Gastheorie (3rd edition), Barth, Leipzig (1927).

20. M. Planck, Theory of Heat Radiation (translation of 1914 edition), Dover, New York (1959).

21. P. Ehrenfest, Phys. Z. 15, 657 (1914).
22. J. W. Gibbs, Elementary Principles of Statistical Mechanics, Yale U. Press, New Haven (1902).

23. S. N. Bose, Z. Physik 26, 178 (1924); 27, 284 (1924).

24. A. Einstein, Sitzungsberichte Preuss. Akad. Wiss. p. 261 (1924); p. 115 (1925).
25. E. Fermi, Z. Physik 48, 868 (1926); P. A. M. Dirac, Proc. Roy. Soc. A112, 661 (1926).
26. A. Einstein, Sitzungsberichte Preuss. Akad. Wiss. p. 18 (1925).
27. M. Planck, Sitzungsberichte Preuss. Akad. Wiss. p. 49, 442 (1925).
28. E. Schrodinger, Sitzungsberichte Preuss. Akad. Wiss. p. 434 (1925).

29. C. G. Darwin, R. H. Fowler, Phil. Mag. 44, 420, 823 (1922).
30. R. H. Fowler, Statistical Mechanics, Cambridge U. Press, London (1933).
31. A. I. Khinchin, Statistical Mechanics (translation of 1937 edition), Dover, New York (1949).
32. A. H. Koppe, Statistical Thermodynamik, Hirzel, Leipzig (1942).
33. C. E. Shannon, Bell System Tech. J. 27, 379, 623 (1948).

34. L. Brillouin, Science and Information Theory (2nd edition), Academic Press, New York (1962).

35. A. I. Khinchin, Mathematical Foundations of Information Theory (translation of 1957 edition), Dover, New York (1957).

36. R. T. Cox, The Algebra of Probable Inference, Johns Hopkins Press, Baltimore (1961).
37. S. Kullback, Information Theory and Statistics, Wiley, New York (1959).
38. G. G. Lorentz, Bull. Am. Math. Soc. 72, 903 (1966).

39. M. Dutta, Proc. Nat. Inst. Sci. (India) 19, 1 (1953); Bull. Internat. Stat. Inst. 32, 28 (1953); Proc. Nat. Inst. Sci. (India) 22, 273 (1956); p. 291 in Proceedings of the Summer Institute of Theoretical Physics at Mussoorie, India (1959); Jnana O Bijnana (in Bengali) 12, 737 (1960); Z. Physik. Chem. (Leipzig) 228, 380 (1965).

40. M. Dutta, Indian J. Phys. 40, 422 (1966).

41. R. A. Fisher, Statistical Theory of Estimation, Calcutta U. Press, Calcutta (1938).

42. E. T. Jaynes, Phys. Rev. 106, 620 (1957); 108, 171 (1957); Statistical Physics, W. A. Benjamin, New York (1963).

43. B. Mandelbrot, Ann. Math. Stat. 33, 1021 (1962).

44. M. Dutta, Sankhya A28, 391 (1966).
45. M. Dutta, S. Pal, Introduction to the Mathematical Theory of Probability and Statistics, World Press, Calcutta (1963).

46. M. Dutta, "Equivalence of Methodsof Statistical Mechanics" (submittedto Indian J. Phys.).

47. G. H. Hardy, E. M. Wright, An Introduction to the Theory of Numbers, Clarendon Press, Oxford (1959).

48. M. Dutta, M. Sen, Proc. Nat. Inst. Sci. (India) A33, 66 (1967).
