TICS: An Approach to the Concept of Information
Presentation for INTAS seminar, 21-23 September 2003, Moscow
Pavel Luksha, Alexander Plekhanov

Revising the Theory of Information


DESCRIPTION

Pavel Luksha. This paper, presented at ISSS 2003 and the INTAS project seminar series, redefines the concept of information as the interaction between the 'memory' and 'environment' components of a system.


Page 1: Revising the Theory of Information

TICS: An Approach to The Concept of Information

Presentation for INTAS seminar

21-23 September 2003, Moscow

Pavel Luksha, Alexander Plekhanov

Page 2: Revising the Theory of Information

Why Another Concept of Information?

A universal concept of science, a “third element” which is “neither matter nor energy” [Wiener, 1948]
Some 150 definitions exist [Capurro, 1992]

Vagueness of the concept leads to poor use and even misuse (instead of sticking to a commonly accepted definition)
Yet, we suggest that it is possible to construct a uniform concept of information that will incorporate most (intuitive) definitions and will itself become the basis for a meta-theory

Page 3: Revising the Theory of Information

Three Approaches to a Concept of Information

Materialistic: information is a material phenomenon or essence
Idealistic: information is not a material phenomenon or essence
“Fictionalistic”: it is neither phenomenon nor essence, but a “fiction” (a mental construct)

For scientific study of information, it is necessary to consider information as an objective material phenomenon (the rationalistic tradition [Winograd, Flores, 1986])
At the same time, it is an object of meta-theory, not theory (information systems on various ‘layers of existence’)

Page 4: Revising the Theory of Information

Major Groups of Systems with Information

A great diversity of systems in which information is thought to exist, with completely different material substrates

Technical:
- Technical systems with information (computers, robots, telecom means)

Biological:
- Self-reproducing systems, starting from self-reproducing macromolecules
- Biological organisms with a central nervous system
- Pre-social organization of higher animals

Social:
- “Super-organism” insect populations
- Social groups of various levels of organization (micro-groups to societies)
- Psychic life (emerging socially, as shown by Vygotsky [1978])
- Technical systems storing/transmitting social information

Page 5: Revising the Theory of Information

Invariant Property

We suggest that the universal, or invariant, property of this diversity of “systems with information” is memory

No. | Type of information system | Types of memory
1 | technical | permanent and operative memory devices (memory and quasi-memory of technical systems)
2 | biological | genetic innate memory; individual memory in the central nervous system
3 | social | genetic memory of social behavior; individual memory; social memory in individual memories and technical devices

Page 6: Revising the Theory of Information

Starting from interaction

Every complex system is in constant interaction with its environment
For complex systems, these interactions are (a) diverse (more than one state exists) and (b) regularly repeated (Ashby’s analysis of cybernetic systems)
Understanding information can start from the analysis of interactions in systems of lesser complexity (computers, DNA, neurons) and move towards systems of higher complexity (psyches, societies)
Consider two types of systems (similar ideas in Cottam [2003]):
- ‘discrete’, or ‘digitized’, which can be described as finite state automata (FSA)
- ‘continuous’ (can be reduced to FSA, but cannot be induced from them)

[Diagram: a system in interaction with its environment]

“Physics is no longer a science about the properties of nature’s elements; it is about relations and interactions” (physicist Yu. Rumer, lecturing)
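The ‘discrete’ case can be sketched as a finite state automaton. This is our own minimal illustration (the state and signal names are invented for the example), not a model taken from the slides:

```python
# A minimal FSA sketch of a 'discrete' system in diverse, regularly
# repeated interaction with its environment.
class FSA:
    def __init__(self, transitions, state):
        self.transitions = transitions  # maps (state, signal) -> next state
        self.state = state

    def interact(self, signal):
        # An environmental signal either triggers a transition or, if no
        # matching entry exists, leaves the state unchanged.
        self.state = self.transitions.get((self.state, signal), self.state)
        return self.state

# Diverse (more than one state) and regularly repeated interactions:
fsa = FSA({("idle", "stimulus"): "active", ("active", "rest"): "idle"}, "idle")
states = [fsa.interact(s) for s in ["stimulus", "rest", "stimulus"]]
print(states)  # ['active', 'idle', 'active']
```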

Page 7: Revising the Theory of Information

What Is Memory

Memory can be most generally defined as a phenomenon of reflection: the structure and organization of one system are reflected, and then ‘stored’ for some time, in another system, and are used in interactions between these systems

Memory can be localized in a given system as its component: DNA coding a cell, “cognitive eigenvalues” in the brain, social memory in society, etc.

According to the diversity of interaction types, memory elements can be identified (various interaction types correspond to various memory elements, but only in the ‘discrete’ case)

[Diagram: memory as a component of the system, interacting with the environment]

Page 8: Revising the Theory of Information

What Is SAFE

System adaptive functioning environment (SAFE)
The structure and organization of system memory can be put into an unambiguous one-one correspondence with a set of objects and links in the system’s environment. In the basic case, a memory can contain nothing but representations of objects with which the system regularly interacts
The relationship of memory and environment distinguishes a specific set of objects and links in the latter: the SAFE

Some similar concepts:
- ‘Umwelt’ of von Uexküll [von Uexküll, 1982]
- ‘field’ in the theory of Kurt Lewin [Lewin, 1950]
- gestalt/environmental psychology, e.g. [Lettvin et al., 1959], [Gibson, 1986]
- the Lifeworld of Agre and Horswill [Agre, Horswill, 1997]

[Diagram: memory in one-one relation with the SAFE]

Page 9: Revising the Theory of Information

Defining Information

We propose to consider as informational only those processes in which the content of system memory is explicitly revealed through interactions (representation/transformation) with its environment (more precisely, its SAFE)

In the basic (‘discrete’) case, information is an interaction between an element of memory and an element of the SAFE

One should distinguish between different informations (qualitatively new interactions) and copies of the same information (copies of the same interaction)

Similar concepts: Gibson’s affordances, Maturana’s system activities, etc.

Page 10: Revising the Theory of Information

Information as Regulation as Information

Each such interaction between memory and SAFE is a mutual change of the interacting parts of memory and SAFE:
- a change in the environment through its interaction with memory is a process of regulation,
- while a change in memory through interaction with the environment is an identification, or a representation.

Thus, an interaction between memory and SAFE always has two aspects: it is an information and a program at the same time. We propose to call them information/programs.
Models obtained with this approach generalize both traditional models of information theory and traditional models of cybernetics; we propose to call them information/cybernetic models (or information/cybernetic systems, ICS)

Page 11: Revising the Theory of Information

Actualization and Potentialization

In system dynamics, it is necessary to distinguish the actual and potential forms of information/programs:
- actual = interactions (current, or accomplished within some period of time) between elements of the SAFE and elements of memory
- potential = interactions which are possible in principle but are not currently observed (i.e. all possible interactions between memory and SAFE).

A cycle of change between actualization and potentialization: a swinging-pendulum metaphor

Information/programs are potentially contained inside memory AND inside the environment. They can only be actualized as an interaction of the two.

[Diagram: memory and SAFE]

Page 12: Revising the Theory of Information

Principle of Information Relativity

Information ‘omnipresence’ (Brillouin [1956], Stonier [1972]): information as a physical entity opposing entropy
Our suggestion: there can never exist information in general. Any specific information/program exists only as part of the operation cycle of some specific finite ICS.
- Only if an object in the environment is represented in memory can a corresponding information/program exist (and be realized). Informational closure of a complex system is implied
- Only those objects of the environment reflected in a system’s memory may actualize corresponding information/programs
- The variety of information/programs is limited by the volume of system memory

A universal principle of relativity of information/program existence: if there is no element of system memory representing some object of the environment, then, relative to this system, even in the case of regular interactions with this object, there is no information/program. At the same time, relative to another system which possesses a proper element of memory, a corresponding information/program does exist.

Page 13: Revising the Theory of Information

How It Fits with Various Concepts?

A few examples (from [Scott, 2003]; the list can be extended):
- Ashby [1956]: “a cybernetic system is closed to information” = information is strictly defined by system memory
- Konorski [1962]: “information cannot be separated from its utilization” = information is regulation, and ‘manifests itself’ through interactions
- Maturana, Pask: “a cybernetic system is organizationally closed” = depending on system organization (and memory organization), the information contained in a system differs
- Stonier: “measure of order” = regularities (incl. sequences of repeated interactions) can be considered as order in a system; more information means more (diverse) regularities
- key definitions of Shannon [1948] and Kolmogorov [1965] to be considered (matrix model)

Page 14: Revising the Theory of Information

Basic Information System

A set of relations/interactions between a given memory and SAFE
Memory and SAFE as two groups of objects/links isomorphic to each other; without a “reference system”, two equally rightful descriptions of “which one plays which role” are possible
There must be a uniform physical basis for the interactions, a similarity (‘lock and key’, ‘eye and light’)

[Diagram: system memory meshed with the SAFE, a clutched cog-wheels model]

Page 15: Revising the Theory of Information

Information/Cybernetic System (ICS)

Structure differentiated into (a) a component that interacts (communicates/regulates) with the external environment, and (b) a component that interacts (communicates/regulates) with internal components, but not with the external environment.

[Diagram: an automaton composed of a controlling device (memory) and an executing device (memory/SAFE); internal regulation and communication = regulative interactions inside the automaton over an internal communication channel; external regulation and communication = regulative interactions between the automaton and the external SAFE, via the contact environment (SAFE), over an external communication channel; regular interactions (exchange of matter and energy) occur between the contact and non-contact environment]

A perceptor model. Three cog-wheels?

Page 16: Revising the Theory of Information

Communication model of ICS

Communication is not possible without an alphabet, a representation of signals in memory
Signal or message = element of the SAFE
A message can be transported in a communication channel, but information only exists as an interaction between the message and the recipient
In a system with differentiated internal memory, at least two informations exist (one as interaction with the external SAFE, another as interaction with the internal SAFE)

[Diagram: memory, memory/SAFE, and external SAFE; an internal communication/regulation channel carries interactions between components of the ICS, an external communication/regulation channel carries interactions between the ICS and its environment]

Page 17: Revising the Theory of Information

Shannon’s Communication Model

Three states of a signal: (a) entering the communication channel [encoding], (b) transported/waiting in the channel [transmission], (c) exiting the channel [decoding]
Two communicating agents: (a) for each one, the other is part of its external SAFE, (b) each one has “her own” information (the hermeneutic principle that “what is said is not what is heard”), (c) for adequate communication, a certain similarity between the agents must exist

[Diagram: two systems, each with a memory (sender/recipient 1 and 2) and a memory/SAFE coding/decoding (sending/receiving) device, communicating through a shared channel]

Page 18: Revising the Theory of Information

Dynamic Aspect of Information/Programs

An information/cybernetic system is a dynamic system that passes through a certain set of states: information/program interactions between memory and SAFE
These states are regularly reproduced in some (pre-determined) sequence, forming a cycle of ICS operation
Evidently, the macro-cycle is not a completely arbitrary set of information/programs in arbitrary sequence; it is a quasi-targeted process related to the teleological aspect of ICS operation, and so are its sub-cycles
Such a cycle can be represented as an attractor:
- focus point (evolution towards some final state, as in automata)
- limit cycle (repetition of a loop)
- strange attractor (the case of complex synergetic systems)

Page 19: Revising the Theory of Information

Hierarchy of Functioning Cycles

Cycle type | Information/program organization | Properties
elementary cycle of operation (micro-cycle) | actualization of a single information/program | cannot be decomposed into lower-level (lower-complexity) cycles on a given level of abstraction
sub-cycle (meso-cycle) | combination of several information/programs (in a certain sequence) | can be decomposed into lower-level meso-cycles and micro-cycles; has a determined goal
ICS operation cycle (macro-cycle) | (repeated) cycle with a final major goal state and the full variety of information/programs | can be decomposed into meso-/micro-cycles; repeated cycle in self-maintaining and self-reproducing systems

We consider the organization of functioning cycles as a hierarchical, Russian-doll structure (the ‘discrete’ model case)

Simplified example (higher-animal reproductive behavior): a full cycle of reproduction contains meso-cycles such as courting and brood care

Page 20: Revising the Theory of Information

Sources of Activity in ICS

A source of activity in an ICS can be either internal memory or the external SAFE (also, spontaneous activity from external memory / internal SAFE). Two major types of meso-cycles (determined by the cause-effect sequence):
- memory-driven [active, or pro-active ‘behavior’]: a rigid sequence of information/programs, elements of the SAFE are ‘expected’ (resulting, e.g., in inefficient reflexes in animal behavior [Dewsbury, 1978], magic rituals [Levi-Strauss, 1962])
- SAFE-driven [re-active ‘behavior’]: the touch-string principle, an explanation for ‘if-then’ sequences

[Diagram: memory-driven vs. SAFE-driven meso-cycles running between memory, memory/SAFE, and the external SAFE]

Page 21: Revising the Theory of Information

Adequacy/Efficiency

One-one relations do not describe the full variety of potential interactions between memory and SAFE (e.g. non-complementary junctions in DNA replication, inefficient reflexes)
All information/programs can be distinguished by their value, a degree of adequacy/efficiency: if the actualization of some set of information/programs ensures a macro-cycle (or its sub-cycle) which is optimal according to some criterion, then this set of information/programs is adequate/efficient.

Two criteria can be suggested:
- evolutionary, or survival, criterion: actualization of a given set of information/programs assures maximal repetition of a given macro-cycle / meso-cycle
- functional criterion: actualization of a given set of information/programs allows a maximal efficiency ratio to be achieved in a given macro-cycle / meso-cycle

These summarize more traditional criteria:
- adequate identification (exact recognition of signals, when a predator is recognized as a predator, or a symbol as a symbol)
- efficient transformation, when a required result is achieved with maximal accuracy and minimal ‘cost’ (time and energy)

Based on these criteria, it is evident that, in natural complex systems (emerged through evolutionary processes), one-one relations of memory and SAFE correspond to adequate/efficient information/programs

Page 22: Revising the Theory of Information

Major Conclusions

1. The key to information processes is found in the dynamic relation between a complex system and its environment

2. Information systems are systems with memory (and not only living, autopoietic or complex ones). Studies of systems with memory may become a new focus for FIS / UTI

3. This approach permits a solution to many methodological problems, and an alternative representation of system statics/dynamics (the matrix model) which can be quite enlightening (e.g. the adjustment to Ashby’s law).

Also, it bridges traditional users of the paradigm (e.g. computer scientists) and scholars looking for a ‘clarified paradigm’ (scholars in the humanities and biology)

4. Applications are already evident in the social sciences (e.g., a systematic theory of society) and in biology

Page 23: Revising the Theory of Information

Matrix Model

Modeling a ‘discrete’ ICS
The sides of the matrix model represent elements of memory and SAFE (the element base), aligned in one-one relation to each other
The cells of the matrix correspond to element interactions, or information/programs: those on the main diagonal are adequate/efficient, the others (also possible) are inadequate/inefficient
The matrix may represent a model of Shannon’s communication with errors
Two possibilities: (a) any distortion leads to total inadequacy (adequacy 0 or 1); (b) (given similarity between elements of the element base) slight distortions lead to incomplete adequacy (between 0 and 1)

The Matrix is around you

[Diagram: a 4×4 matrix with memory elements 1-4 as rows and SAFE elements 1-4 as columns]

Cells can be assigned a quantity of occurrence (matrix Q), (derived indicators of) probability/frequency (matrices P, F), or a “state of actualization” (matrix A)
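As a sketch, the matrices described above can be built from a hypothetical occurrence count (the numbers below are ours, chosen purely for illustration):

```python
import numpy as np

# Hypothetical occurrence matrix Q: rows = memory elements, columns = SAFE
# elements; cells on the main diagonal are the adequate/efficient
# information/programs, off-diagonal cells are inadequate/inefficient.
Q = np.array([
    [5, 1, 0, 0],
    [0, 6, 0, 1],
    [0, 0, 4, 0],
    [1, 0, 0, 7],
])
P = Q / Q.sum()           # probability matrix P (relative frequencies)
A = (Q > 0).astype(int)   # actualization matrix A: 1 where an interaction occurred

adequate_cells = int(np.trace(A))              # actualized diagonal cells
inadequate_cells = int(A.sum()) - adequate_cells
print(adequate_cells, inadequate_cells)  # 4 3
```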

Page 24: Revising the Theory of Information

Communication of ICS in Matrix Model: Conjoining Matrices

A junction of two matrices represents the junction of memory and SAFE through an intermediate (external memory / internal SAFE)
The intermediate is thus ‘poly-functional’ (consideration of the matrix helps explain why it is typically not an initiator of activities); SAFE-driven or memory-driven cycles occur through it
If no distortions exist in the internal or external relation, internal memory and external SAFE stand in one-one relation

[Diagram: two conjoined matrices linking internal memory, the external memory / internal SAFE intermediate, and the external SAFE]

Page 25: Revising the Theory of Information

Example: Watt’s Governor

[Diagram: the regulation loop — steam pressure changes the speed of the working shaft; the working shaft lifts up / pulls down a regulator weight; the regulator weight lifts up / pulls down a lever; the lever opens / closes a pressure gate, which feeds back on steam pressure]

A classical example of “first order” model of regulation with feedback

Page 26: Revising the Theory of Information

Example: Matrix model of Watt’s Governor (4 matrices)

[Diagram: four conjoined matrices linking the paired states — pressure grows / pressure falls, shaft speeds up / shaft slows down, weight goes up / weight goes down, pressure gate opened / pressure gate closed]

pressure gate opened

Page 27: Revising the Theory of Information

Example: Watt’s Governor (Cont.)

[Diagram: the governor’s sub-systems — the weight, the steam in the boiler, the working shaft, and the pressure gate — mapped to internal memory (controlling unit), external SAFE (contact environment), and two memory/SAFE execution units, a receptor and an effector]

For a full cycle, two information/programs in each of the four sub-systems must be actualized (VF = 8)
No repetitions (IF = VF = 8)
No inadequate/inefficient information/programs (IA = VA = IF = VF = 8)

Page 28: Revising the Theory of Information

Quantitative Measures of Information/Programs

Measures of information/program variety:
- V_F = Σ_i Σ_j A_ij — full variety
- V_A = Σ_i Σ_j A_ij for i = j — adequate/efficient variety
- V_N = Σ_i Σ_j A_ij for i ≠ j — inadequate/inefficient variety
- V_N = V_F − V_A

Measures of information/program quantity with copies:
- L = lim (T/R) — length of the macro-cycle
- I_F = Σ_i Σ_j F_ij — full quantity
- I_A = Σ_i Σ_j F_ij for i = j — adequate/efficient quantity
- I_N = Σ_i Σ_j F_ij for i ≠ j — inadequate/inefficient quantity
- I_N = I_F − I_A
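These sums can be computed directly from matrices A and F; a small sketch with invented example matrices:

```python
import numpy as np

def variety_measures(A):
    """V_F, V_A, V_N from an actualization matrix A."""
    VF = int(A.sum())        # full variety: all actualized cells
    VA = int(np.trace(A))    # adequate/efficient variety (main diagonal, i = j)
    return VF, VA, VF - VA   # V_N = V_F - V_A

def quantity_measures(F):
    """I_F, I_A, I_N from a frequency matrix F (counts copies)."""
    IF = int(F.sum())
    IA = int(np.trace(F))
    return IF, IA, IF - IA   # I_N = I_F - I_A

A = np.array([[1, 0, 1], [0, 1, 0], [0, 0, 1]])
F = np.array([[4, 0, 2], [0, 5, 0], [0, 0, 3]])
print(variety_measures(A))   # (4, 3, 1)
print(quantity_measures(F))  # (14, 12, 2)
```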

Page 29: Revising the Theory of Information

Limits to Variety and Adjustment to Requisite Variety Law

The law of requisite variety, as introduced by W. Ashby [Ashby, 1964], can be re-considered and re-stated. The quantity of information/program variety must be no less than the variety of controlled objects or events (the various elements of the SAFE). If there is no memory element for a given object or event in the environment, then there is no information/program that can control it. On the other hand, an upper bound of system variety can be pointed out: it is the square of the number of controlled environment disturbances (SAFE elements). In an efficiently operating information/cybernetic system, the variety of controlling information/programs tends to the number of SAFE elements.

Limits for information/program variety (k = number of SAFE elements):
- k ≤ V_F ≤ k² — full variety
- V_A = k — adequate/efficient variety
- 0 ≤ V_N ≤ k² − k — inadequate/inefficient variety
- V = V_A / V_F — variety efficiency ratio
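A quick numeric check of the restated bounds, with an invented k and invented variety values:

```python
# Checking the variety bounds for k = 4 SAFE elements (numbers are invented).
k = 4
VF, VA, VN = 6, 4, 2   # full, adequate and inadequate variety

assert k <= VF <= k ** 2          # k <= V_F <= k^2
assert VA == k                    # V_A = k in an efficiently operating ICS
assert 0 <= VN <= k ** 2 - k      # 0 <= V_N <= k^2 - k
V = VA / VF                       # variety efficiency ratio
print(round(V, 3))  # 0.667
```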

Page 30: Revising the Theory of Information

Limits to Information/Program Quantity with Copies

Assuming that each potential information/program appears in the macro-cycle at least once, the following correspondences can be drawn.

Limits for information/program quantity with copies:
- I_A ≤ L — limit of adequate/efficient information quantity
- I_N ≤ L − I_A — limit of inadequate/inefficient information quantity
- V_A ≤ I_A — relation between adequate/efficient variety and quantity
- V_F ≤ I_F — relation between full variety and quantity
- I = I_A / I_F — information/program efficiency ratio

Page 31: Revising the Theory of Information

Measures of Shannon and Kolmogorov

Shannon’s measure:
I = −Σ_i Σ_j p_ij log p_ij

Kolmogorov’s measure:
K(X) = min { |p| : U(p) = X }
K(X|Y) = min { |p| : U(p, Y) = X }
I(Y:X) = K(X) − K(X|Y)

- Kolmogorov defines information through information, referring to the properties of Turing automata
- Shannon implicitly assumes “a concise recipient” of information who has a “function of expectation” and thus may be “surprised” by information
- Yet this measure is one possible static measure of (dis)organization in information interactions, which can easily be derived from the matrix model as one possible characteristic of matrix P
- In the matrix model, if the target (final) state of the system is considered as X, then a measure analogous to Kolmogorov’s will indicate the (minimal) number of information/programs required to reach X from state Y:

L = L(X) = I_A
L₂ = L(Y:X) = L(X) − L(Y) < I_A
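Shannon’s measure as a static characteristic of matrix P can be sketched directly (the matrix values are invented; the formula is the standard entropy of the cell probabilities):

```python
import numpy as np

def shannon_measure(P):
    # I = -sum_ij p_ij * log2(p_ij), taken over the non-zero cells of P.
    p = P[P > 0]
    return float(-(p * np.log2(p)).sum())

# Invented interaction-probability matrix (cells sum to 1).
P = np.array([[0.5, 0.0],
              [0.25, 0.25]])
print(shannon_measure(P))  # 1.5 bits
```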