
Gleick Information


Chapter 1: Drums That Talk

Captain Allen discovered the talking drums of Africa, a way of relaying messages, in 1841. Around the same time F. B. Morse was developing Morse code (19). African languages are tonal and do not correspond to a representational alphabet. The meaning of their verbal language is conveyed as much by tone (rising and falling inflections etc.) as it is by distinct words (23). African drum language takes this to the extreme and conveys meaning through tone alone (24). In order to reduce confusion, extra phrases are added to each short word (much like the NATO phonetic alphabet) (25-26). Verbosity aids contextualisation.

The chapter sets up context and redundancy as key to understanding Gleick’s version of Information.

Chapter 2: The Persistence of the Word

Some technologies have been internalized so much that we have forgotten they are technologies, for instance writing (28). Walter J. Ong argues that history began with writing as it gives us a true sense of the past and our relation to it (28). Ong argues that speech is not a technology (but a truer expression of our internal thoughts/feelings) and that writing organises thought in a technological (non-natural) way (30). Gleick quotes Plato’s Socrates and his argument in the Phaedrus that the invention of writing “will produce forgetfulness in the minds of those who learn to use it, because they will not practice their memory” and that writing is not “an elixir of memory, but of reminding [and] the appearance of wisdom, not true wisdom” (30). However, writing does allow the dead to speak to the living and the living to the unborn.

Gleick compares Chinese script to alphabetic writing, arguing they are both efficient and inefficient in different ways. The symbolic method maps onto things and concepts, and therefore a huge number of symbols is needed.


Symbols (reiterative/expansive) can be repeated or joined to create new meanings (e.g. tree + tree + tree = forest). This adds to the language’s complexity and the number of dialects. Alphabetic writing, on the other hand, is reductive. Each letter is meaningless on its own but is easier to learn and use (leading to its wider adoption around the globe) (35). The Greeks had (oral) literature before writing, and the transcription of these tales demonstrated the differences between oral and written storytelling. The lists, rhyming and repetition demonstrate techniques to make them easier to remember, and their structure demonstrates their changing nature through multiple retellings. Alternatively, “persistence of writing made it possible to impose structure on what was known about the world and, then, on what was known about knowing” (36). Orality is based on accretion, a temporality forcing connections between present thoughts and past memories. Writing allows for categories, for different ideas existing contemporaneously. Writing (decontextualization) detaches information from the speaker, situation and experience (39). “It is a twisting journey from things to words, from words to categories, from categories to metaphor and logic” (39). Before writing, concepts such as “define” did not exist. Due to the written word, language refers not only to the things in the world but becomes once removed: language refers to the written word, which refers to things, as if some objective dictionary definition exists of everything we say. Gleick then goes on to discuss cuneiform, its relationship to other languages and how it was useful for “making copies” of numbers and as a memory tool. Gleick turns to Marshall McLuhan and his argument that we are drawn together in a “tribal mesh” (49) and that communication is more effective if numerous senses are engaged.

The chapter attempts to defamiliarise us from the tools (language, writing, argument) Gleick uses in the rest of the book and opens up questions (rather than definite standpoints) about the relationship of writing to thought, the world and speech, which thinkers later in the book grapple with.

Chapter 3: Two Wordbooks


Gleick describes Robert Cawdrey’s book “A Table Alphabeticall…”, a book which attempts to impose structure on what Cawdrey would have seen as an informational explosion (52). Spellings were inconsistent, as “language did not function as a storehouse of words, from which users could summon the correct items, preformed… words were fugitive, on the fly, expected to vanish again thereafter” (53). Language was flexible. Words also entered the English language due to commerce. Words reflected social distinctions: for instance, “English peasants of the lower classes continued to breed cows, pigs, and oxen (Germanic words), but in the second millennium the upper classes dined on beef, pork and mutton (French)”. Cawdrey had no intention of listing all the words in use, only the “hard ones” (56). He ordered the entries alphabetically (an ordering which had to be explained to readers). This lack of instinctual context “forced the user to detach information from meaning; to treat words strictly as character strings; to focus abstractly on the configurations of the word” (58). Cawdrey’s definitions of words led to subjective interpretation and opened up an understanding of how “knowledge was in flux” (62). Gleick relates this to the Oxford English Dictionary (OED) and the ways in which it crystallises meaning and often creates a feedback loop between words and meaning (69). The OED attempts to define words used in general speech; peripheral, localised use of language is left undefined. Related to this is the notion of mondegreens (where lyrics are misheard, often as less plausible versions than the original), a word which could only start developing after the notion of lyrics existed (lyrics, meaning the words of a particular song, was a sense that did not exist until the 19th century). Still, the word could not emerge until technologies existed whereby the same version of a song could be heard over and over. Therefore, the word mondegreen only refers to a particular and specific set of cultural interrelations. Technologies change understanding, thought and meaning in often very indirect ways.

Gleick ends this chapter by hinting towards our modern situation:

“Like the printing press, the telegraph, and the telephone before it, the Internet is transforming the language simply by transmitting information differently. What makes cyberspace different from all previous information technologies is its intermixing of scales from the largest to the smallest without prejudice, broadcasting to the millions, narrowcasting to groups, instant messaging one to one.” (77)

Chapter 4: To Throw The Powers Of Thought Into Wheel-Work

This chapter focuses on Charles Babbage and begins with a list of reasons why he could be considered a genius living outside of his time (one of which is his idea to standardise postal rates) (79). He spent most of his life developing the Difference Engine (for which he was paid £1,500 by the government between 1823 and 1842) and never completed it in his lifetime. His machine worked on a principle related to other machines of the time (looms, forges, naileries and glassworks), but his commodity was numbers (80). Numbers would go in, other numbers would come out. In the 19th century, number tables were used in numerous contexts. Elie de Joncourt’s book of triangular numbers is a notable example. Logarithms (and their bases) had significant value to traders, financiers and other trades (86). These books demonstrated that “Knowledge has a value and a discovery cost, each to be counted and weighed” (87). In England in the 19th century, and particularly at Cambridge, mathematics was stagnating (89). The subject’s emphasis was on Newton, and any work corresponding to, or influenced by, Gottfried Wilhelm Leibniz was outlawed. Both thinkers based their work on the same calculus but provided alternative systems of notation. Babbage became fluent in both (89). Babbage saw language as a “leaky sieve” for meaning and understanding and therefore desired to create a universal language or system of symbols which would free philosophical questioning from “local idiosyncrasies and imperfections” (90). Babbage was employed to produce logarithm tables for the Astronomical Society, a process of which he said “I wish to God these calculations had been executed by steam” (92). These two desires underpin much of his thought throughout his life.

Before Babbage the only “machines” used for calculations were essentially versions of the abacus (Blaise Pascal’s adding machine of 1642 being one and Leibniz’s improved version being another). These were passive registers of memory states; they were not, to Babbage, automatic (93). Babbage’s way of thinking about numbers and how they could be processed by a machine was by way of highlighting differences. Focusing on the “calculus of finite differences”, Babbage saw a future in which machines could translate numbers based on the relationships between different numbers and the structure into which they fall. The machine would not process the numbers in a way which would give the individual numbers meaning but would focus on their relationships or differences: hence why he called his ideal machine the “Difference Engine” (95). The Difference Engine is based upon an understanding of number sequences. The difference between numbers in a sequence can be calculated, then the difference between the differences, and the process repeated until a constant row emerges. The Difference Engine would run this process in reverse and “generate sequences of numbers by a cascade of additions” (96-97). This process would not only reduce the possibility of human error but also speed up the making of number tables tremendously. Gleick then goes on to discuss Ada Byron (polymath and daughter of the poet Lord Byron) and her personal and mathematical relationship with Babbage. Later in his life Babbage moved his attention from the Difference Engine to a machine called the Analytical Engine. This would surpass the Difference Engine’s limitation of computing only certain sorts of numbers. Moving away from a mechanical design in which each physical procedure represented one number, the Analytical Engine’s cogs and wheels would also be able to stand in for variables (instead of numbers). Babbage’s Analytical Engine “did not just calculate; it performed operations … on any process which alters the mutual relation of two or more things” (115-6). The programming of this machine opened up a number of questions concerning information (conceptually), and the discussions between Babbage and Ada Lovelace (she is more commonly known as Lovelace rather than Byron) extended a new understanding of information and knowledge. Their understanding of these problems fed into cryptography, which will be discussed later.
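To make the “cascade of additions” concrete, the sketch below (our illustration, not Babbage’s mechanism or anything quoted by Gleick) tabulates a polynomial the way the Difference Engine would: build a difference table from a few seed values, then extend the table using nothing but addition. The example polynomial x² + x + 41 and the function names are our own choices.

```python
# A minimal sketch of the finite-difference scheme the Difference Engine
# mechanized: tabulate a polynomial using only additions.

def difference_table(values):
    """Repeatedly take differences until a row becomes constant."""
    rows = [list(values)]
    while len(set(rows[-1])) > 1:
        prev = rows[-1]
        rows.append([b - a for a, b in zip(prev, prev[1:])])
    return rows

def extend_by_additions(rows, extra):
    """Run the process 'in reverse': generate further table entries
    by a cascade of additions, never multiplying."""
    rows = [list(r) for r in rows]
    for _ in range(extra):
        rows[-1].append(rows[-1][-1])           # constant bottom row
        for i in range(len(rows) - 2, -1, -1):  # add upward through the cascade
            rows[i].append(rows[i][-1] + rows[i + 1][-1])
    return rows[0]

# Example: f(x) = x**2 + x + 41 for x = 0..4 seeds the table;
# five further values are produced purely by addition.
seed = [x * x + x + 41 for x in range(5)]
table = difference_table(seed)
print(extend_by_additions(table, 5))
# [41, 43, 47, 53, 61, 71, 83, 97, 113, 131]
```

The point is that once the bottom row is constant, every further table entry falls out of additions alone, which is exactly the kind of work gears and carry levers can do.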

This chapter ends with a series of quotations from Babbage reinforcing that “knowledge is itself the generator of physical force” and that “It is the science of calculation - which becomes continually more necessary at each step of our progress, and which must ultimately govern the whole of the applications of science to the arts of life” (124). Pattern, the changeable nature of meaning and physicality are all raised here and continue to be central issues in Gleick’s study.

Chapter 5: A Nervous System for the Earth

Gleick begins the chapter with a comparison of a range of different people who, throughout history, compared machinery, particularly wires and electricity, to the human body and our nervous system. This highlights a couple of recurring metaphors but also the importance of public misunderstanding in the expansion of technologies. He then moves on to discuss “the language of telegraph signs” (132): the large towers which covered large parts of several countries and transferred simple messages by changing the position of a metal rig (in a similar way to semaphore flags) (136). The discussion of how these messages were transferred leads on to a section devoted to Samuel F. B. Morse and Morse code (142). After this Gleick moves on to discuss the electrical telegraph (more familiar to us today than its non-electrical predecessor). The electrical telegraph changed people’s perception of information, and messages came to be seen as divorced from their physical containers. “A message had seemed to be a physical object. That was always an illusion; now people needed consciously to divorce their conception of the message from the paper on which it was written” (151). Morse code also changed the perception of a system of reference. The “Morse Telegraphic alphabet … was not an alphabet. It did not represent sounds by signs … it was a meta-alphabet” (152). This led to private codes (mhii = my health is improving, or ymir = your message is received) to cut costs and send more information per message. This led to the publication of number code books to enable private, secret and economical messages between people. The variety of different methods employed highlighted a disconnect between meaning, message and medium (156). Gleick then relates these wider public issues to the state of philosophy in the 19th and 20th centuries, particularly the focus on formal logic, referencing Augustus De Morgan and George Boole and then bringing the debate into a more contemporary context by linking their understanding of logic to the paradoxes of Bertrand Russell and the syllogisms of Lewis Carroll.

Chapter 6: New Wires, New Logic

Gleick returns to Claude Shannon and details his work at the Massachusetts Institute of Technology (171). There he worked under the wing of Vannevar Bush on the “Differential Analyzer”, which the public media was calling a “mechanical brain” or “thinking machine”. Gleick details how this machine was related to Babbage’s Difference Engine and Analytical Engine, but due to Bush’s lack of knowledge of Babbage’s life and work they cannot be considered as sharing the same lineage. Gleick then returns to logic and the history of philosophy by returning to Russell and introducing Alfred North Whitehead (178). He then introduces the work of Gödel, which concerns incompleteness. “Gödel showed that a consistent formal system must be incomplete, no complete and consistent system can exist” (184). This meant “that mathematics could never be proved free of self-contradiction” (185). Gleick then turns to Bell Labs (188) and the importance of the telephone. “Where the telegraph dealt in facts and numbers, the telephone appealed to emotions” (189). Gleick then starts to outline the beginnings of “information theory” (200) and the way certain theorists attempted to define information outside of a human understanding of meaning.

Chapter 7: Information Theory


Gleick starts by outlining the relationship between Claude Shannon and Alan Turing (who “met daily at teatime in the Bell Labs cafeteria” (204)). They discussed their visions of the future of thinking machines. Using thought experiments, Turing asked “are all numbers computable?” (207). He worked through this question by building a machine in his mind. The essential aspects were “tape, symbols and states” (208). The machine would read the symbols on the tape and adjust its state accordingly. “The machine can see just one symbol at a time, but in effect use parts of the tape to store information temporarily” (210). The machine would then print out the result. “Anything computable can be computed by this machine” (210). He named this thought-experiment machine U, for universal. Theoretically, however complex computers get, anything they can compute, U can compute as well. What Turing showed, however, was that some numbers “are uncomputable” (211). The commentary then turns to cryptography, with which both men were occupied, and how these experiments in information related to coding, deciphering and decoding messages. In this way Shannon relates patterns to redundancy. In language, redundancy serves to aid understanding (just like in the African drumming). In cryptography, redundancy “contributes no information” (216), like a u after a q. This redundancy would make codes easier to crack, as it would give the code a sense of context. “Shannon estimated that English has redundancy of about 50%” (217) and that English could be shortened by half with no loss of information. At this point Gleick outlines Shannon’s definition of “Information Theory”, which proved important for many thinkers and fields of thought after him.
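The “tape, symbols and states” picture can be illustrated with a toy simulator (ours, not Turing’s notation or anything from Gleick): a three-rule machine that reads one symbol at a time, rewrites it, moves the head, and changes state until it halts. The rule table here adds one to a binary number.

```python
# A toy illustration of the "tape, symbols, states" picture:
# a machine that adds 1 to a binary number written on the tape.

def run_turing_machine(tape, rules, state="scan", blank="_", steps=10_000):
    tape = dict(enumerate(tape))      # sparse tape: position -> symbol
    head = max(tape) if tape else 0   # start at the rightmost symbol
    for _ in range(steps):
        if state == "halt":
            break
        symbol = tape.get(head, blank)
        write, move, state = rules[(state, symbol)]   # one rule lookup per step
        tape[head] = write
        head += {"L": -1, "R": 1}[move]
    return "".join(tape[i] for i in sorted(tape)).strip(blank)

# Transition table: in state "scan", turn trailing 1s into 0s and carry left;
# on seeing a 0 or a blank, write 1 and halt.
rules = {
    ("scan", "1"): ("0", "L", "scan"),
    ("scan", "0"): ("1", "L", "halt"),
    ("scan", "_"): ("1", "L", "halt"),
}

print(run_turing_machine("1011", rules))  # 1011 (eleven) + 1 -> 1100 (twelve)
```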

Shannon defines Information as separate from meaning and argues that “Information is uncertainty, surprise, difficulty, and entropy” (219).

Gleick outlines these terms as follows:

“Information is closely associated with uncertainty.” Uncertainty, in turn, can be measured by counting the number of possible messages. If only one message is possible, there is no uncertainty and thus no information.


Some messages may be likelier than others, and information implies surprise. Surprise is a way of talking about probabilities. If the letter following t (in English) is h, not so much information is conveyed, because the probability of h was relatively high.

“What is significant is the difficulty in transmitting the message from one point to another.” Perhaps this seemed backward, or tautological, like defining mass in terms of the force needed to move an object. But then, mass can be defined that way.

Information is entropy. This was the strangest and most powerful notion of all. Entropy—already a difficult and poorly understood concept—is a measure of disorder in thermodynamics, the science of heat and energy.
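The outline above (counting possible messages, weighting them by probability, entropy) is summed up in Shannon’s measure of information; the formula below is the standard textbook form rather than a direct quotation from Gleick.

```latex
% Shannon entropy: the average information of a source, in bits,
% where p_i is the probability of the i-th possible message or symbol.
H = -\sum_{i} p_i \log_2 p_i
% A fair coin toss: H = -(1/2)\log_2(1/2) - (1/2)\log_2(1/2) = 1 bit;
% if only one message is possible (p = 1), H = 0: no uncertainty, no information.
```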

Where a common definition of communication might be trying to “make oneself understood”, Shannon described it as “reproducing at one point either exactly or approximately a message selected at another point” (221). Point here refers to space and time: “information storage … counts as a communication” (222). Communication encompasses the information sent by a person or machine, the transmitter which operates on or encodes the message in some way, the channel or medium for the message, the receiver which decodes the message, and the destination, the person or thing on the other end. Shannon’s understanding of a message was as a “dynamic system whose future course is conditioned by its past history” (226). Shannon demonstrated this by randomly generating strings of letters and then imposing rules of associated frequency on them (how common particular letters are, how often they occur alongside other letters, how frequent spaces should be, how frequently words follow one another, etc.) until sentences emerged which (although nonsensical) appeared to be English sentences. Also, “touch typists could handle them with increasing speed - another indication of the ways people unconsciously internalize a language’s statistical structure” (227). Shannon detailed this with mathematical formulas which accounted for the probability of all elements. “Quantifying predictability and redundancy in this way is a backward way of measuring information content. If a letter can be guessed from what comes before, it is redundant; to the extent that it is redundant it provides no new information” (230). Adding redundancy (as we saw earlier with the African drummers and their additional context) reduces the possibility for errors. “Whether removing redundancy to increase efficiency or adding redundancy to enable error correction, the encoding depends on knowledge of the language’s statistical structure to do the encoding. Information cannot be separated from possibilities. A bit, fundamentally, is always a coin toss” (231). These quantities could be measured by logarithms, and in this way Shannon was able to estimate the degree of information in bits. He drew out a graph of possible levels, with a phonograph record at around 10^5 bits. The largest collection he estimated was the Library of Congress, at 10^14 bits.
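A rough sketch of the experiment described above (our code, not Shannon’s; the sample text is simply a quotation reused from Chapter 2): sample the letter frequencies of a bit of English, emit random characters with those same frequencies, and measure the first-order entropy in bits per character.

```python
# A rough sketch of Shannon's first-order approximation to English:
# random characters drawn with the letter frequencies of a sample text,
# plus the per-character entropy of that frequency distribution in bits.

import math
import random
from collections import Counter

sample = ("the persistence of writing made it possible to impose structure "
          "on what was known about the world and then on what was known about knowing")

counts = Counter(sample)    # character frequencies in the sample
total = len(sample)

random.seed(0)
gibberish = "".join(random.choices(list(counts), weights=list(counts.values()), k=80))
print(gibberish)            # nonsense, but spaced and lettered like English

entropy = -sum((n / total) * math.log2(n / total) for n in counts.values())
print(f"first-order entropy ~ {entropy:.2f} bits per character "
      f"(vs. {math.log2(len(counts)):.2f} if all symbols were equally likely)")
```

Conditioning on the previous letters or on whole words, as Shannon went on to do, pushes the estimate further down, towards the roughly 50 per cent redundancy figure quoted above.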

Chapter 8: The Informational Turn

The chapter begins with a short account of Claude Shannon’s relationship with the MIT researcher Norbert Wiener. Wiener saw Shannon’s work as a strand of (the then undeveloped field of) cybernetics, by which he meant “the study of communication and control and also the study of the human and the machine” (235). The field also developed into a search for, and commentary on, the possibility of artificial intelligence. Shannon developed a small (mouse-like) robot (251) which could traverse mazes using a system underpinned by his understanding of information theory as outlined in the last chapter. Gleick then turns to the state of play in psychology and the interplay between these various loosely connected disciplines. Referencing the psychologist Pavlov (of dog fame), who believed that “there is no mind, only behaviour” (257), information, communication and understanding were being associated in a new context. Returning to issues of cybernetics and AI, Gleick details Shannon’s work in 1948 on “how to program a machine to play chess” (265). Shannon understood his interrelation to other fields and the possible importance of his work, which he described in a self-consciously humorous way by including a quotation by E. E. Cummings in a paper: “some son-of-a-bitch will invent a machine to measure Spring with” (266).

Chapter 9: Entropy and its Demons

Entropy derives from the study of thermodynamics that grew out of the use of steam engines. Entropy in this context simply describes the “unavailability of energy and its uselessness for work” (270). This is related to the first two laws of thermodynamics -

First law: The energy of the universe is constant.

Second Law: The entropy of the universe always increases.

“The universe is running down. It is a degenerative one-way street. The final state of maximum entropy is our destiny” (271). In this way entropy is a measure of order and disorder. Usable energy derives from order, from differences: heated metal interacting with cold water to create steam is one example. When everything settles into uniform disorder (equilibrium) there is no usable energy. Gleick then turns to how entropy relies on probability. It is only probable (very highly probable) that entropy will increase. Gleick uses James Clerk Maxwell’s example that the second law of thermodynamics has “the same degree of truth as the statement that if you throw a tumbler full of water into the sea, you cannot get the same tumblerful of water out again” (274). Maxwell outlined a hypothetical situation where a demon (later dubbed Maxwell’s demon) could adjust these odds and reverse entropy. This captured the imagination of scientists and also relates to our lives as human beings. Life runs counter to entropy: living beings “propagate structure - we disturb the tendency towards equilibrium” (281). “Organisms suck orderliness from their surroundings. Herbivores and carnivores dine on a smorgasbord of structure; they feed on organic compounds, matter in a well-ordered state, and return it “in a very much degraded form”” (283). There is a paradox in the creation of order by life, which disturbs the order around it. Other parts of the world order themselves (crystals, for example). However, their structure is dull and predictable. “Life must depend on a higher level of complexity, structure without predictable repetition” (285). This discussion flows into the next chapter on genetics.

Chapter 10: Life’s Own Code

This chapter discusses uses of Shannon’s theory in biology, and particularly genetics. Scientists found it useful when investigating genetics to think about information in a broad way rather than about energy. As work became increasingly focused on codes (in DNA, for instance), the work of previous information theorists and cryptographers became a useful parallel and influence. Further down the road of discovery, when in “the early 1960s [genetic code] turned out to be full of redundancy”, Shannon’s theories became increasingly pertinent. Gleick then turns to Richard Dawkins, the importance of placing primary emphasis on genetic codes, and the view of life as simply a way for these genes to self-replicate. Reusing Samuel Butler’s joke that “a hen is only an egg’s way of making another egg” seems counterintuitive but describes the self-replicating nature of life and its journey from a primordial soup to the comparatively complex beings that populate the world. Other versions of Butler’s joke reinforce the usefulness of this counterintuitive way of thinking outside of genetics. In 1995 Daniel Dennett said that “A scholar is just a library’s way of making another library” (303). This way of thinking becomes clearer when we treat genes as primary and the physical make-up of bodies as secondary. Errors and mutations create new forms which may or may not be better conveyances for genes. The individual bodies may die but the genes live on and increase in dominance. “The history of life is written in terms of negative entropy” (305). Genes cannot be conceived of as “fragments of nucleic acid. Such things are fleeting” (308). All copies must be considered as one. “The gene is not an information-carrying macromolecule. The gene is the information” (309). In parallel to the division between message and medium, a book is not the physical artefact, and a piece of music is not a CD, vinyl or group of musicians. With this Gleick turns to memes.

Chapter 11: Into the Meme Pool (meme = non-genetic transmission of behaviours etc. = something like a collective unconscious)

Memes are fundamental (like numbers or colours); they are not physical artefacts (a hula hoop is not a meme) but “are complex units, distinct and memorable - units with staying power” (313). They are distinct pieces of cultural information which travel between people, places and times. If (as in the last chapter) humans are considered carriers of genes, we can also be considered to be carriers of memes. They live and propagate themselves through us. Gleick considers the phrase “jumping the shark” as a good example of something which lived in a cultural consciousness with a start-point but also a sense that we, its users, were not in control of it. Chain letters are also used as examples of memes which mirror genes. Gleick quotes Fred Dretske, who stated that “in the beginning there was information. The word came later.” The word was a tool to perpetuate this information. “Most of the biosphere cannot see the infosphere; it is invisible, a parallel universe humming with ghostly inhabitants” (323). Humans are learning to see this world, but we are not entirely in control of it. Gleick ends the chapter with a question: “who is master, who is slave?” (323).

Chapter 12: The Sense of Randomness


In this chapter Gleick discusses randomness and pseudo-randomness, arguing against Henri Poincaré’s position that chance is only a lack of knowledge of the laws and preconditions of a phenomenon. He shows how randomness is not simply the opposite of order (a sequence of coin flips such as 01010101010101 gives a sense of orderliness yet can still be produced randomly) (328). He returns to Shannon: “The more regularity in a message, the more predictable it is. The more predictable, the more redundant. The more redundant a message is, the less information it contains” (329). In essence, the questions “how random” and “how much information” turn out to be one and the same; they have the same answer (329). Gleick relates this to Turing’s universal computer and to the notion of incomputable numbers. Using pi as an example, Gleick shows that if a number can (by an algorithm) be expressed as a shorter description then it is not random. He then turns to the mathematician Andrei Nikolaevich Kolmogorov, who theorised about probability and information. He approached the issue with three approaches: the combinatorial, the probabilistic and the algorithmic. The first two were refinements of Shannon, and the third (the algorithmic) was separate from looking at ensembles of information and could focus on specific objects. The simpler something is, the easier it is to compute and the less information it conveys. Gleick then discusses what makes numbers interesting. He refers to Noam Chomsky’s paper “Three Models for the Description of Language”, which applied information theory to notions about formal language (345). He then discusses Ray Solomonoff and his desire to know whether machines could learn from experience. He then outlines how photographs, music and video, for example, show regularities and have redundancy and are therefore compressible in a way which conveys the same amount of information (347) (we are all familiar with this if we have played a YouTube video at varying qualities). Gleick then turns to art and argues that it is interesting because it falls between pattern and randomness (in information theory terms) (353).
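One hands-on way to see the “shorter description” test is with an off-the-shelf compressor. This is only a crude stand-in for Kolmogorov’s algorithmic complexity (which is not computable in general), and the example strings are ours.

```python
# An illustrative proxy for the "shorter description" test of randomness:
# regular data compresses well, random-looking data barely at all.

import os
import zlib

patterned = b"01" * 500           # 010101... highly regular, 1000 bytes
random_ish = os.urandom(1000)     # 1000 bytes of operating-system randomness

for label, data in [("patterned", patterned), ("random-ish", random_ish)]:
    ratio = len(zlib.compress(data)) / len(data)
    print(f"{label}: compressed to {ratio:.0%} of original size")
```

Patterned data shrinks to a few per cent of its size; data from a good random source barely shrinks at all, which is the sense in which “how random” and “how much information” get the same answer.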

Chapter 13: Information is Physical


In this chapter Gleick discusses John Archibald Wheeler, Niels Bohr and Richard Feynman and the history of thought concerning what black holes are. Debates in quantum mechanics have led to the question of whether, when something enters a black hole, its quantum states, structure and organisation remain intact: can information survive a black hole? “According to quantum mechanics, information may never be destroyed” (358). Gleick discusses Charles Bennett, who argued that thought has energy and that it “could not exist without some embodiment” (361). “Heat dissipation occurs only when information is erased”; “forgetting takes work” (362). Gleick discusses qubits, entanglement and nonlocality, and in outlining the field of debate in a seemingly unrelated field, quantum physics, he shows how debates about what information is have wide-reaching implications in the physical world.
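The claim that “heat dissipation occurs only when information is erased” is usually stated as Landauer’s bound; the formula below is the standard statement of that bound, not a quotation from Gleick.

```latex
% Landauer's bound: erasing one bit of information dissipates at least
% k_B T ln 2 of heat, where k_B is Boltzmann's constant and T the
% temperature of the surrounding environment.
E_{\text{per erased bit}} \;\ge\; k_B T \ln 2
```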

Chapter 14: After the Flood

This chapter begins with a discussion of Jorge Luis Borges’s notion of “The Library of Babel”, which Gleick goes on to relate to Wikipedia (275). In doing so he discusses the relationship between information and knowledge and raises questions concerning what the focus of cataloguing our world should be. Gleick also raises questions about Wikipedia’s process of summarising shared knowledge (rather than collecting and preserving existing texts) and taking it away from the “individuals who might have thought it was theirs” (379). He touches on the nature of the medium (“Wikipedia is not paper”) and the value different social groups place on a variety of different knowledge, particularly obscure or niche subjects. He touches on how, in a sense, knowledge is getting lost, and makes reference to Dickens’s The Pickwick Papers, in which a character claims to know about Chinese metaphysics whereas he had used an encyclopaedia to read “metaphysics under the letter M, and for China under the letter C, and combined his information” (387). He touches on how, due to Wikipedia, we seem to be running out of names (see the disambiguation sections of Wikipedia) and how this has always been a problem, although its increase makes our world feel smaller; “more complex societies demand more complex names” (389). This then relates to how much of our world is being constantly recorded and preserved to an unprecedented degree. Following on from this he states that “it is finally natural - even inevitable - to ask how much information is in the universe” (397), and outlines Seth Lloyd’s process of treating the universe as a computer and calculating the number of possible operations in its history (with regard to Planck’s constant, limits on energy and other laws of physics). “Lloyd calculates that the universe can have performed something on the order of 10^120 “ops” in its entire history. Considering “every degree of freedom of every particle in the universe,” it could now hold something like 10^90 bits. And counting” (397).
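For readers who want the shape of Lloyd’s counting argument, the bound below is our paraphrase of the Margolus-Levitin-style limit he works with, not a formula quoted in these notes: the total energy of the universe and its age cap the number of elementary operations it can have performed.

```latex
% A rough statement of the bound behind Lloyd's estimate (our paraphrase):
% a system with average energy E can perform at most on the order of
% 2 E t / (pi * hbar) elementary operations in time t. Plugging in the
% energy content and age of the observable universe gives ~10^120 "ops".
N_{\text{ops}} \;\lesssim\; \frac{2 E t}{\pi \hbar}
```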

Chapter 15: New News Every Day

Gleick begins by taking a step back from our current historical moment to assert that in every age (and with regard to every technology) “people said, as if for the first time, that a burden had been placed on human communication: new complexity, new detachment, and a frightening new excess” (398). He focuses on Elizabeth Eisenstein, author of The Printing Press as an Agent of Change, in which she details how the printing press was “a decisive point of no return in human history. It shaped the modern mind” (399). The shift to print not only meant texts were cheaper and more accessible; its main power lay in making those texts stable. “Print was trustworthy, reliable and permanent” (400). Many thought they were in an informational flood which was hampering their lives, even though in very significant ways their knowledge of the world and of history was greater than it had ever been. Gleick relates these ideas to David Foster Wallace’s concept of “total noise”: “a tsunami of available fact, context and perspective” which provides a sensation of a “loss of autonomy, of personal responsibility for being informed” (403). There is a perceived gap between information and knowledge, and between knowledge and wisdom. He relates this to Alex Ross’s idea of the “infinite playlist”, which brings “anxiety in place of fulfilment, an addictive cycle of craving and malaise. No sooner has one experience begun than the thought of what else is out there intrudes” - the embarrassment of riches (409). For this there are two strategies for coping: filter and search (409). Gleick defines filters as “blogs and aggregators”, “editors and critics” (410). Search he defines as the engines of cyberspace: web search engines. These filtering devices make “rummaging in the library” impossible; “an unindexed Internet site is in the same limbo as a misshelved library book” (410). The improvement of these two concepts, Gleick argues, is “all that stands between this world and the Library of Babel” (410). Gleick relates these technologies to those which have done similar things throughout history: “alphabetical indexes, book reviews, library shelving schemes and card catalogues, encyclopaedias, anthologies and digests, books of quotation and concordances and gazetteers” (411). He then relates these earlier technologies to our present ones, in particular Twitter, and asks, “Ask bloggers and tweeters: which is worse, too many mouths or too many ears?” (412).

Epilogue

Gleick takes a few pages to return to some of the book’s key topics and trajectories. He touches on McLuhan’s “Global Knowing” (413), Edouard Le Roy’s “noosphere” (414) and H. G. Wells’s “World Brain” (415). Referencing Wikipedia, Gleick asks whether our technologies will be better at coping with the way in which information changes and avoids stability. Returning to Claude Shannon’s The Mathematical Theory of Communication, he reminds the reader of Shannon’s approach of disregarding meaning in favour of treating communication as an “engineering problem” (416). He relates this to epistemological standpoints which do not consider information without a context to be worth investing thought in. Gleick asks whether this cold world of information without meaning is the one in which we now live, and asserts strongly that this “is not the world I see” (418). He argues that our technologies have done more to draw us together and strengthen community than they have to isolate us. He progresses through a number of examples where our technology, particularly that which seems dehumanising, has had a positive effect. For instance, in 2008 “Google created an early warning system for regional flu trends based on data no firmer than the incidence of Web searches for the word flu” (421). Harnessing this large amount of data, which in many ways homogenises users, allowed people to stay safer during flu outbreaks and to receive information they were not primarily searching for, which saved lives and reassured sufferers and their families. Gleick briefly outlines a short history of search engines and the changes Google made to the existing system, giving links values through a recommendation system (like academic citation). Gleick returns to the notion of individuals within a structure and argues there must always be a paradox at play: “everything is close and everything is far at the same time. This is why cyberspace can feel not just crowded but lonely” (425). This is a paradox we all have to accept and move beyond in a positive way, as “we are all patrons of the Library of Babel now, and we are the librarians” (426). Information has always been essential in the world and always will be.