Math Logic

Embed Size (px)

Citation preview

Westfalische Wilhelms-Universitat Munster Institut fur Mathematische Logik und Grundlagenforschung

Mathematical LogicWolfram Pohlers Thomas Glaworked out and supplemented by Lecture given by

An Introduction to

Typewritten by Martina Pfeifer

Mathematical LogicWolfram Pohlers Thomas Gla

An Introduction to

Institut fur Mathematische Logik und Grundlagenforschung Westfalische Wilhelms-Universitat Munster Einsteinstra e 62 D 48149 Munster

a Typeset AMS-L TEX

Preface

This text is based on my personal notes for the introductory courses in Mathematical Logic I gave at the university of Munster in the years 1988 through 1991. They have been worked out and supplemented by Thomas Gla to whom I express my sincere thanks for a well done job. Our courses have been planned for freshmen in Mathematical Logic with a certain background in Mathematics. Though self contained in principle, this text will therefore sometimes appeal to the mathematical experience of the reader. According to the aim of the lectures, to give the student a rm basis for further studies, we tried to cover the central parts of Mathematical Logic. The text starts with a chapter treating rst order logic. The examples for application of Gentzen's Hauptsatz in this section give a faint avour how to apply proof theoretical methods in Mathematical Logic. Fundamentals of model theory are treated in the second chapter, fundamentals of recursion theory in chapter 3. We close with an outline of other formulations of ( rst order and non rst order) logics. Nearly nothing, however, is said about set theory. This is usually taught in an extra course. Thus there is an appendix in which we develop the small part of the theory of ordinal and cardinal numbers needed for these notes on the basis of a naive set theory. One of the highlights of this text are Godel's incompleteness theorems. The true reason for these theorems is the possibility to code the language of number theory by natural numbers. Only a few conditions have to be satis ed by this coding. Since we believe that a development of such a coding in all its awkward details could mystify the simple basic idea of Godel's proof, we just required the existence of a suitable arithmetisation and postponed the details of its development to the appendix.

iv

Preface

I want to express my warmest thanks to all persons who helped in nishing this text. Besides Thomas Gla { who did the work of a co-author { I want to mention Andreas Schluter in the rst place. He did not only most of the exercises but also most of the proof-reading. Many improvements are due to him. My thanks go also to all our students who detected and reported errors in a rst version and gave us many helpful critical remarks. We do not regard these notes as nished. Therefore we are still open for suggestions and criticism and will appreciate all reports about errors, both typing errors and possible more serious errors. Last but not least I want to thank our secretary Martina Pfeifer who TEXed the main bulk of this paper. Munster, October 1992 Wolfram Pohlers

ContentsHistorical Remarks : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : 1 Notational Conventions : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : 3

1 Pure Logic

Heuristical Preliminaries : : : : : : : : : : : : : : : : : : : 1.1 First Order Languages : : : : : : : : : : : : : : : : : 1.2 Truth Functions : : : : : : : : : : : : : : : : : : : : 1.3 Semantics for First Order Logic : : : : : : : : : : : : 1.4 Propositional Properties of First Order Logic : : : : 1.5 The Compactness Theorem for First Order Logic : : 1.6 Logical Consequence : : : : : : : : : : : : : : : : : : 1.7 A Calculus for Logical Reasoning : : : : : : : : : : : 1.8 A Cut Free Calculus for First Order Logic : : : : : : 1.9 Applications of Gentzen's Hauptsatz : : : : : : : : : 1.10 First Order Logic with Identity : : : : : : : : : : : : 1.11 A Tait-Calculus for First Order Logic with Identity :

: : : : : : : : : : : :

: : : : : : : : : : : :

: : : : : : : : : : : :

: : : : : : : : : : : :

: : : : : : : : : : : :

: : : : : : : : : : : :

: : : : : : : : : : : :

: : : : : : : : : : : :

: : : : : : : : : : : :

: : : : : : : : : : : :

: : : : : : : : : : : :

5 6 10 17 27 35 45 48 53 66 82 85

5

2 Fundamentals of Model Theory

2.1 Conservative Extensions and Extensions by De nitions : : : : : : : : : : 93 2.2 Completeness and Categoricity : : : : : : : : : : : : : : : : : : : : : : : 99 2.3 Elementary Classes and Omitting Types : : : : : : : : : : : : : : : : : : 109 Primitive Recursive Functions : : : : : : : : : : : : : : : : : : : : Primitive Recursive Coding : : : : : : : : : : : : : : : : : : : : : Partial Recursive Functions and the Normal Form Theorem : : : Universal Functions and the Recursion Theorem : : : : : : : : : Recursive, Semi-recursive and Recursively Enumerable Relations Rice's Theorem : : : : : : : : : : : : : : : : : : : : : : : : : : : : Random Access Machines : : : : : : : : : : : : : : : : : : : : : : Undecidability of First Order Logic : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : 116 : 124 : 130 : 134 : 136 : 142 : 145 : 148

93

3 Fundamentals of the Theory of Decidability3.1 3.2 3.3 3.4 3.5 3.6 3.7 3.8

115

4 Axiom Systems for the Natural Numbersv

4.1 Peano Arithmetic : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : 153 4.2 Godel's Theorems : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : 158

153

vi

Contents 5.1 Many-Sorted Logic : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : 165 5.2 !-Logic : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : 172 5.3 Higher Order Logic : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : 175

5 Other Logics Appendix

165

A.1 The Arithmetisation of NT : : : : : : : : : : : : : : : : : : : : : : : : : 179 A.2 Naive Theory of the Ordinals : : : : : : : : : : : : : : : : : : : : : : : : 188 A.3 Cardinal Numbers : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : 198

179

Bibliography Glossary Index

Historical Texts : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : 203 Original Articles : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : 203 Text Books : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : 206

203

209 215

Historical RemarksNowadays mathematics is an extremely heterogeneous science. If we try to nd a generic term for mathematical activities we encounter the astonishing di culty of such an enterprise. In former times mathematics has been described as the `science of magnitudes'. Nowadays this is no longer true. We regard a science as a mathematical science mainly not because of its contents but rather because of its methods. The characteristic of a mathematical theory is that it proves its claims in an exact way, i.e. it derives its results from generally accepted basic assumptions without using further information such as empirical experiments etc. Then the next question to be asked is: \What does it mean to `derive something from something?'", i.e. what is a mathematical proof? Still in the last century a proof was more or less a matter of intuition. A sentence was regarded to be a theorem when it was accepted by the mathematical community. We know `proofs' of theorems which are considered to be false in these days (although the theorem is true which is a point for the intuition of the researchers involved). However, it seems to have been clear at all times that `logical' reasoning should be ruled by laws and the e orts to investigate them reach back to the times of antiquity. The oldest known `logical system' is Aristotle's 384 B.C., y322 B.C.] Syllogistic. We will not describe Syllogistic here. All we want to say is that it is by far too weak to describe mathematical reasoning. About the same time there was also some research on logical reasoning by the Megarians and Stoics, which in some sense was much more modern than that of Aristotle. The in uence of Aristotle's work was tremendous. It ruled the Middle Ages. The fact that the Roman Church had taken up { with some adjustments { Aristotle's philosophy created an over-great respect for Aristotle's work. This together with other traditions paralysed logical research and restricted it mainly to work-outs of Aristotle's systems of Syllogistic. In that time a remarkable book with new ideas was Ars Magna (1270) by Raimundus Lullus 1235, y1315] in which he suggested that all knowledge in the sciences is obtained from a number of root ideas. The joining together of the root ideas is the `ars magna' (the great art). Lullus himself did not really develop a theory but his ideas still in uenced Leibniz' work. One of the lasting challenges of Lullus' ideas was the development of a general language for a general science. The rst one to have the idea of developing a general language in a mathematical way was Rene Descartes 1596, y1650]. But because of the great in uence of the Roman Church he did not publish his ideas.

2

Historical Remarks Such attempts were made by Gottfried Wilhelm Leibniz 1646, y1716]. In his

mathematics as the key-science which should be able to decide all questions `in so far as such decisions are possible by reasoning from given facts'. So he tried to develop a general algorithm which could decide any question (obeying the just mentioned restrictions). Using that algorithm he wanted to decide whether God exists. Of course he failed (he had to, as we know today). The real start of mathematical logic were George Boole's 1815, y1864] books The Mathematical Analysis of Logic (1847) and The Laws of Thought (1854). Three decades later, Gottlob Frege 1848, y1925] published his book Begri sschrift, eine der arithmetischen nachgebildete Formelsprache des reinen Denkens (1879). In the work of both authors the central point is a formalisation of the notions of `sentence' and `inference'. Boole, in uenced by the work of William Hamilton 1780, y1856] and Augustus De Morgan 1806, y1878], opted for an algebraic notation while Frege designed an arti cial language on the model of colloquial language. Because of its complicated two-dimensional notations Frege's formalisation did not succeed. Boole's concept of an algebra of logic still had some aws which prevented it from being commonly accepted. Nowadays, after having taken out the errors in Boole's concept, the notion of a boolean algebra became central in mathematical logic. The breakthrough in the development of mathematical logic was the Principia Mathematica (1910, 1912, 1913) by Alfred North Whitehead 1861, y1947] and Bertrand Russell 1872, y1970]. Their notions relied on previous work by Giuseppe Peano 1858, y1932]. His book Formulaire de Mathematique (1897) presented a completely developed formalism for the theory of logic and thus launched what we call mathematical logic nowadays. Kurt Godel 1906, y1978] and other pioneers of modern mathematical logic used the `Principia' as their main reference. Mathematical logic investigates mathematical reasoning by mathematical methods. This self-referential character distinguishes it from other elds of mathematics and is the reason why logic sometimes is regarded as a somewhat strange part of mathematics. The best example of the kind of strangeness we mean are the famous Godel's incompleteness theorems which show us the limits of formalisations. These theorems are included in this book. So the reader is advised to convince her- or himself about the things we claim here. Nowadays mathematical logic is divided into four sub elds: Recursion Theory Set Theory Model Theory Proof Theory Having discussed the basics of logic (pure logic) we are going to obtain some connections to model theory (fundamentals of model theory). After that we develop the basic

De arte combinatoria (1666) he suggested a `mathematics of ideas'. Leibniz regarded

Notational Conventions

3

notions of recursion theory (fundamentals of the theory of decidability) and turn to the fundamentals of proof theory in the fourth chapter. In the last chapter we will regard some other formulations of logic.

Notational Conventionsi stands for if and only if. ; denotes the empty set. IN is the set of natural numbers 0; 1; 2; : : : If f : X ! Y is a function we call X the domain of f, i.e. X = dom(f): The range of f is the set rg(f) = fy 2 Y : 9x2X(f(x) = y)g: If f : X ! Y is a function and Z X, then f Z denotes the restriction of f to Z, i.e. f Z : Z ! Y f Z(x) = f(x) for x 2 Z: X Y is the set of functions f : Y ! X: Pow(X) denotes the power set of X, i.e. the set of all subsets of X. X n Y denotes the set X without Y , i.e. the set

fx 2 X : x 2 Y g: =X Y denotes the union of X and Y, i.e. the set

fx : x 2 X or x 2 Y g:X \ Y denotes the intersection of X and Y, i.e. the set

fx : x 2 X and x 2 Y g:idX is the identity on X, i.e. dom(idX ) = X and idX (x) = x for x 2 X:

4

Notational Conventions

Chapter 1 Pure LogicHeuristical PreliminariesIn the historical remarks we already emphasised that the development of a formal language belongs to the main tasks of mathematical logic. To see what will be needed for a formal language of mathematics we are going to examine examples of mathematical propositions. Let us start with a simple one 5 j 15; i.e. the natural number 5 divides the natural number 15, or (3 + 4) = (2 + 5): These propositions tell us facts about natural numbers. The rst one tells us that the two natural numbers 5 and 15 share the property that one (the rst) divides the other (the second). Such properties which may be shared by one or more objects (natural numbers in our example) will be called predicates . The number of objects which may share a predicate is called the arity of the predicate. The equality of natural numbers, for instance, is a binary predicate. Whenever we have an n-ary predicate P and n objects o1; : : : ; on, then (Po1 : : :on ) is a proposition , something which either can be true or false. The second example is a bit more complex. In it we do not longer compare two objects but two things, 3 + 4 and 2 + 5 which are built up from objects by functions . Such things will be called terms . Terms can be evaluated and the evaluation will yield an object. Thus objects in predicates may well be replaced by terms and still represent a proposition. We may even replace objects occurring in terms by terms and still obtain a term. To get a uniform de nition we could say that every object is a term and that more complex terms are obtained by applying an n-ary function f to already formed terms t1 ; : : : ; tn, i.e. by building (ft1 : : :tn ). Once propositions are formed, we may compose them to more complex ones by using sentential connectives. In colloquial language we compose ones by connectives such as `and' it is raining and I'm taking my umbrella,

6

I. Pure Logic

`or' it is raining or the sun is shining, `if : : : then' if it is raining, then I'm going to use my umbrella, `not' etc. We will use the symbols ^; _; !; : to denote these connectives. Of course we will have to give them a mathematically exact meaning (which of course should be as close as possible to their meaning in colloquial language, because it is colloquial language which conserves our long experience in thinking). This will be done in section 1.2. The use of sentential connectives, however, does not exhaust all the possibilities of forming more complex propositions. In mathematics we usually deal with structures. Let's take the example of a group. A group G consists of a non-empty set G of objects, together with a binary group operation, say , a neutral element, say 1, and the equality relation. The only propositions which are legal with respect to our hitherto collected rules are equations between terms and their sentential combinations. But this does not even allow us to express that 1 is the neutral element of G . In order to do that we have to say something like: 1 x = x and x 1 = x for all objects x in G: Here x is a symbol for an arbitrary element of G, i.e. x is a variable . This variable is quanti ed by saying for all x in G. Thus we also have to introduce the universal quanti er 8x (or some other name, say x; y; : : :, for the variable). The universal quanti er alone does not su ce to express that G is a group. To formulate the existence of the inverse object we have to say

8x there exists an object y with x y = 1:Thus we need also the dual of the universal quanti er, the existential quanti er 9x. Altogether this means that in order to describe a group in our formal language, we have to allow object variables replacing objects in the formation for terms and quanti ers binding them. These are all the ingredients for a rst order language. This language is already quite powerful. E.g. it su ces for the formalisation of group axioms, ring axioms etc. However, one can imagine much more powerful languages. So we might introduce variables for predicates and quanti ers binding them. This would be called a second order language. Third or even higher order languages languages can be obtained by iterating this process, i.e. allowing the quanti cation over predicates on predicates etc. Now we close our preliminary words and try to put these ideas into mathematical de nitions.

1.1 First Order LanguagesIn the heuristical preliminaries we spoke about creating a language for mathematics. Now we are going to put those informal ideas into mathematical de nitions. After

1.1. First Order Languages

7

having de ned precisely the formal expressions and their meaning we will analyse the expressive power of so-called rst order logic. The heuristical studies of the previous section give us already a clear picture how to design a formal language. All we have to do in this section is to translate this picture into a mathematical de nition. The strategy will be the following. To design a language rst we have to x its alphabet . Then we need grammars which tell us how to get regular expressions out of the letters of the alphabet. In rst order languages the regular expressions will be the terms and the formulas. De nition 1.1.1. The alphabet of a rst order language consists of 1. countably many object variables, denoted by x; y; z; x0; : : : ; 2. a set C of constant symbols, denoted by c; d; c0; : : : ; 3. a set F of function symbols, denoted by f; g; h; f0; : : : Every function symbol f 2 F has an arity #f 2 f1; 2; 3; : :: g: 4. a set P of predicate symbols, denoted by P; Q; R; S; P0; : : : Every predicate symbol P 2 P has an arity #P 2 f1; 2; 3; : :: g: 5. the sentential connectives ^ (and), _ (or), : (not), ! (implies) and the quantiers 8 (for all), 9 (there is). 6. parentheses (, ) as auxiliary symbols. A rst order language is obviously characterised by the sets C (the constant symbols), the set F (the function symbols) and the set P (predicate symbols). We call the elements of C F P the non-logical symbols of a language. All the other symbols are variables, sentential connectives, quanti ers and auxiliary symbols. They are called logical symbols. They do not di er in rst order languages. To emphasise that L is a rst order language depending on the sets C ; F ; and P we often write L = L(C ; F ; P ): First order languages are denoted by L; L0; : : : To make this de nition more visible we are going to give an example: think of formalising group theory. We declare a rst order language L(CGT ; FGT ; PGT ) in which we are able to make all statements concerning (elementary) group theory. There we have a constant symbol 1 for the neutral element a function symbol for the group operation and # = 2 a predicate symbol = for the equality relation on the group. = is binary, too. Thus we have CGT = f1g; FGT = f g and PGT = f=g: Using this alphabet we are able to talk about statements concerning a group. But how do we build up (regular) statements? This will be done in general (for any rst order language) in two steps. In the rst step we will declare how to use variables, constant and function symbols. The expressions obtained in this step will be called terms, and in the second step we will introduce how to use predicate symbols, sentential connectives and quanti ers to obtain expressions called formulas.

8

I. Pure Logic

De nition 1.1.2. Let L(C ; F ; P ) be a rst order language. We simultaneously list the rules for term formation and for the computation of the set FV(t) of variables occurring free in the term t: 1. Every variable x and every constant symbol c 2 C is a term. It is FV(x) = fxg and FV(c) = ;. 2. If t1 ; : : : ; tn are terms and f 2 F is a function symbol of arity #f = n; then (ft1 : : :tn) is a term. It is FV(ft1 : : :tn ) = FV(t1 ) : : : FV(tn ). If f is binary we usually write (t1 ft2 ) instead of (ft1 t2) and call (t1 ft2 ) the in x notation of the term (ft1 t2 ). Terms are denoted by r; s; t; r0; : : : Because terms depend on the given language we will call them L-terms if we want to emphasise the language. With the alphabet for group theory we can build up the following terms ( x1) ( ( 1x)( 1( yz))) which doesn't look like something concerning groups because usually one would like to use the in x notation in connection with binary function and predicate symbols. Then the above terms are read as (x 1) ((1 x) (1 (y z))) and we have the free variables FV((x 1)) = fxg FV((1 x) (1 (y z))) = fx; y; z g: I.e. FV(t) is the set of variables occurring in the term t: De nition 1.1.3. Let L(C ; F ; P ) be a rst order language. Simultaneous to the grammar for formulas we introduce the rules for the computation of the sets FV(F) of variables occurring free and BV(F) of variables occurring bound in the formula F. 1. If t1; : : : ; tn are L-terms and P 2 P is a predicate symbol of arity #P = n; then (Pt1 : : :tn ) is a formula. It is FV(Pt1 : : :tn) = FV(t1) : : : FV(tn ) and BV(Pt1 : : :tn ) = ;. If P is binary we often write (t1 Pt2) instead of (Pt1t2). 2. If F and G are formulas, then so are:(:F); (F ^ G); (F _ G); (F ! G): It is FV(:F) = FV(F); BV(:F) = BV(F) and FV(F G) = FV(F) FV(G), and BV(F G) = BV(F) BV(G) for 2 f^; _; !g:

1.1. First Order Languages

9

3. If F is a formula and x is a variable such that x 2 BV(F), then (8xF ) and (9xF ) = are formulas with FV(QxF ) = FV(F) n fxg and BV(QxF ) = BV(F) fxg for Q 2 f8; 9g: Formulas are denoted by F; G; H; F0; : : : Thus formulas depend on the language, too. If we want to stress this fact we will call them L-formulas. If L does not contain any predicate symbol, there are no L-formulas. So from now on we will assume that we have P = ;. 6 Using the in x notation again we have obtained formulas of the shape (8x((1 x) = x)) (x = y) (8x(1 = y)) In these cases we have the sets FV((8x((1 x) = x))) = ;; BV((8x((1 x) = x))) = fxg FV((x = y)) = fx; yg; BV((x = y)) = ; FV((8x(1 = y))) = fyg; BV((8x(1 = y))) = fxg. In the third case of De nition 1.1.3 we have a condition on the variable for building formulas. Thus (8x(9x(x = x 1))) is not a formula because x 2 BV((9x(x = x 1))): The grammars in De nitions 1.1.2 and 1.1.3 (and in further de nitions to come) are often called inductive de nitions. An inductive de nition is given by a set of rules. The least xed point of an inductive de nition is the smallest set which is closed under all the rules in the inductive de nition. A set is inductively de ned if it is the least xed point of an inductive de nition. The important feature of inductively de ned sets is that we may prove properties of their elements by induction on its de nition , which means the following principle: To show that all elements of some least xed point M share a property ' it su ces to show that ' is preserved under all the rules in the inductive de nition of M. We will use `induction on the de nition' over and over again starting with quite simple situations. Quite easy examples for `induction on the de nition' are given in the exercises. We agree upon the following notations and conventions: Formulas which are built according to the rst clause in De nition 1.1.3 are called atomic . A term t with FV(t) = ; is called closed .

10

I. Pure Logic

A formula F with FV(F) = ; is called a sentence . Up to now we have described the objects of interest of the rst chapter: rst order languages. But in this section we only spoke about the syntax of a rst order language: about its alphabet and about its regular expressions. In the next two sections we are going to develop the semantics of rst order languages. Section 1.2 is devoted to give meaning to the sentential connectives ^; _; :; ! which are only syntactical symbols without any meaning. There we will see that rst order languages are powerful enough to represent any truth function (a semantical object, cf. De nition 1.2.1) by some kind of syntactical expression.

Exercises

concatenated words are denoted by writing them one behind the other. 1. MI is a permitted word. 2. If xI is a permitted word, so is xIU: 3. If Mx is permitted, so is Mxx: 4. If xIIIy is permitted, so is xUy: 5. If xUUy is permitted, so is xy: Prove the following claims: a) MUUIU is a permitted word. b) MU is not permitted. E 1.1.2. The set M IN is de ned inductively by: 1. 2 2 M 2. Is n 2 M; so n + 3 2 M: Prove: It is n 2 M i there is an m 2 IN with n = 3m + 2:

E 1.1.1. We de ne the set of permitted words (i.e. non-void nite strings) over the alphabet fM; U; I g by the following inductive de nition. There x; y are words and

1.2 Truth FunctionsUp to now we have only developed the syntax of rst order languages. A term or a formula is nothing but a well-formed sequence of letters according to the rules of the respective grammar. Therefore we need to x the meaning of the letters and of the expressions formed out of the letters. We start by xing the meaning of the sentential connectives. The purpose of sentential connectives is to connect propositions. A proposition is something which either can be true or false. Thus sentential connectives can be regarded as syntactical counterpart of truth functions and we have to develop a theory of truth functions. We represent the truth value `true' by t, f stands for `false'.

1.2. Truth Functions

11

De nition 1.2.1. An n-ary truth function is a map from ft; f gn into ft; f g.Now we have to give a precise meaning to the sentential connectives of colloquial language. That is easy for negation. We de ne the truth function : : ft; f g ! ft; f g by : (t) = f and : (f) = t: It is also easy for ^ and _ . To make their de nition more visible we arrange it in form of truth tables. ^ t f _ t f t t f t t t f f f f t f These truth tables are to be read in the following way. ^ and _ are binary truth functions. The rst argument is in the vertical column left of the vertical line, the second in the horizontal row above the horizontal line. The value stands at the crossing point of the row of the rst and the column of the second argument. A bit more subtle to de ne is the implication ! formalising the colloquial if : : : then . The truth table of ! is t t f f t t This is the way how implication is de ned in classical logic. The controversies about the meaning of implication reach back to the times of Megarians and Stoics. What people annoys is the `ex falso quodlibet' saying that ! (f; ) is always true independent from the second argument. However, there are other interpretations of the colloquial if : : : then statements leading to di erent kinds of logic and thus also to di erent kinds of mathematics. One example is intuitionistic logic which interprets if A, then B in such a way that a proof for fact A can be converted in one for fact B. In this lecture we will restrict ourselves to the classical interpretation of implication as given in the above truth table. Usual mathematics is based on classical logic which uses the classical interpretation of implication. To study the theory of truth functions more generally we are going to introduce a formal language. The alphabet consists of propositional variables, denoted by a; b; a0; : : : all connectives, i.e. names for all truth functions. Now we are able to build up expressions only with respect to their sentential structure. Think of propositional variables as of variables for propositions (or formulas). The expressions build up only by this means are called sentential forms.

! t f

12

I. Pure Logic

De nition 1.2.2. We de ne the sentential forms inductively as follows.1. Every propositional variable is a sentential form. 2. If A1 ; : : : ; An are sentential forms and ' is an n-ary connective (i.e. a name for an n-ary truth function ' ), then ('A1 : : :An ) is a sentential form. Sentential forms are denoted by A; B; C; A0; : : : Think of sentential forms build up by :; ^; and _: Then (using in x notation) (:a) (((:a) ^ b) _ a) are examples for sentential forms. Now let's make some conventions to spare parentheses: outer parentheses will be cancelled : binds more than ^; _; ! and ^; _ more than !, i.e. we will write :A ^ B ! C _ :C instead of (((:A) ^ B) ! (C _ (:C))) we will write A1 ! A2 ! : : : ! An instead of (A1 ! (A2 ! (: : : ! An) : : :)): This will also be the case if we replace ! by ^ or _. Now we want to formalise that we think of propositional variables as variables for propositions (which either can be true or false). Therefore we think that we have associated a truth value with every propositional variable. Then we are able to determine the truth value of a sentential form by successive application of the corresponding truth functions. A boolean assignment is a map B : A ! ft; f g; where A denotes the set of propositional variables. Boolean assignments are denoted by B ; B 0 ; : : : We de ne the value B (A) of a sentential form A induced by a boolean assignment B as follows. De nition 1.2.3. We de ne B (A) for sentential forms A inductively. It is de ned according to the de nition of the sentential forms. 1. If A 2 A , then B (A) is already given by the assignment. 2. If A = ('A1 : : :An ); where ' is a name for the n-ary truth function ' , then B (A) = ' (B (A1 ); : : : ; B (An )).

1.2. Truth Functions

13

If B is a boolean assignment with B (a) = f and B (b) = t, then we obtain in the above example B (:a) = : B (a) = t B (((:a ^ b) _ a)) = (( : B (a)) ^ B (b)) _ B (a) = (( : f) ^ t) _ f =t Now we give a rst example for using `induction on the de nition'. Proposition 1.2.4. If A is a sentential form and B is a boolean assignment, then B (A) 2 ft; f g: Proof by induction on the de nition of `A is a sentential form'. 1. If A 2 A , then B (A) 2 ft; f g according to the de nition of a boolean assignment. 2. If A = ('A1 : : :An ), then we have (B (A1 ); : : : ; B (An )) 2 ft; f gn by the induction hypothesis (which applies because A1 ; : : : ; An are previously de ned sentential forms). Since ' is a name for an n-ary truth function ' and B (A) = ' (B (A1 ); : : : ; B (An )); it is B (A) 2 ft; f g. Let A be a sentential form and fa1 ; : : : ; ang the set of propositional variables occurring in A: It is obvious from De nition 1.2.3 that B (A) only depends on B fa1 ; : : : ; ang (i.e. B restricted to the nite set fa1 ; : : : ; ang). There are only 2n many boolean assignments which di er on fa1; : : : ; ang. This means that there is an obvious algorithm for computing B (A) which consists in writing down B (a1 ); : : : ; B (an ) for the 2n many assignments which di er on fa1; : : : ; ang and then computing B (A) according to the truth tables for the functions represented by the connectives occurring in A. For a precise formalisation of this fact cf. the appendix (Lemma A.1.11). De nition 1.2.5. Let A and B be sentential forms. We say that A and B are sententially equivalent, written as A B; if B (A) = B (B) for any boolean assignment B. Proposition 1.2.6. is an equivalence relation on the sentential forms, i.e. we have A A A B entails B A A B and B C entail A C: The following proposition gives a list of equivalent sentential forms.

Proposition 1.2.7. a) A ^ B B ^ A; A _ B B _ A: b) :(A ^ B) :A _ :B; :(A _ B) :A ^ :B:

14

I. Pure Logic

c) :(:A) A: d) (A ^ B) ^ C A ^ (B ^ C); (A _ B) _ C A _ (B _ C): e) (A ^ B) _ C (A _ C) ^ (B _ C); (A _ B) ^ C (A ^ C) _ (B ^ C): f) A ! B :A _ B: The proofs are obtained by mere computation of both sides. At this point we want to single out some connectives, respectively some truth functions, by which all other connectives, respectively truth functions, can be represented in a way as ! is represented in Proposition 1.2.7 by : and _: a) A sentential form A1 ^ : : : ^ An, in which every Ai ; i = 1; : : : ; n is either a propositional variable or of the form :ai for ai 2 A , is a pure conjunction. b) Dually a sentential form A1 _ : : : _ An , where the Ai ; i = 1; : : : ; n are as above, is called a pure disjunction (or clause). c) A sentential form A1 ^ : : : ^ An, where all the Ai (i = 1; : : : ; n) are pure disjunctions, is a conjunctive normal form. d) Dually a sentential form A1 _ : : : _ An with pure conjunctions Ai (i = 1; : : : ; n) is a disjunctive normal form. The aim of the following theorem is to obtain an equivalent disjunctive normal form for arbitrary sentential forms. How the normal form can be computed (not in the general situation) will be performed in the following example: (a _ :c) ^ (b _ c) (a ^ (b _ c)) _ (:c ^ (b _ c)) (a ^ b) _ (a ^ c) _ (:c ^ b) _ (:c ^ c) (a ^ b) _ (a ^ c) _ (:c ^ b) using Proposition 1.2.7 and the fact B (:c ^ c) = f for all boolean assignments B . Theorem 1.2.9. Let A be a sentential form. Then there is a disjunctive normal form B such that A B. Proof. Let A be a sentential form. Then there are only nitely many propositional variables, say a1 ; : : : ; an; occurring in A. We have 2n boolean assignments B 1 ; : : : ; B 2n which di er on fa1; : : : ; ang. Now we de ne sentential forms ( Aik = ak if B i (ak ) = t :ak if B i (ak ) = f for i = 1; : : : ; 2n and k = 1; : : : ; n: Here we have B i (Aik ) = t

De nition 1.2.8.

1.2. Truth Functions and for i 6= j there is a k 2 f1; : : : ; ng such thatB i (ak )

15

6= B j (ak ): 6

This entails

B j (Ajk ) = B i (Ajk )

and since B j (Ajk ) = t we have B i (Ajk ) = f: Fitting parts together we have for the pure conjunctions Ci = Ai1 ^ : : : ^ Ain; i = 1; : : : ; 2n the fact that B i (Cj ) = t i i = j since B i (Cj ) = t just in the case if B i (Ajk ) = t for all k = 1; : : : ; n: Without loss of generality we may assume that we have numbered the boolean assignments in such a way that B i (A) = t for i = 1; : : : ; m and B j (A) = f for j = m + 1; : : : ; 2n: If m = 0 de ne B = a0 ^ :a0 : Then we have that B is a disjunctive normal form with A B since B (A) = f = B (B) for all boolean assignments B . If m 6= 0 de ne the disjunctive normal form B = C1 _ : : : _ Cm : Then it is for i = 1; : : : ; mB i (B)

= B i (Ci ) = t = B i (A) = f = B j (A)

and for j = m + 1; : : : ; 2n since it is

B j (B) B j (C1 ) =

So we have for all i = 1; : : : ; 2n and we can conclude A B:

: : : = B j (Cm ) = f: = B i (B)

B i (A)

16

I. Pure Logic

such that A B. Proof. To prove the corollary we observe that, in view of Proposition 1.2.7 the negation of a disjunctive normal form is equivalent to a conjunctive normal form and vice versa. Thus choose a disjunctive normal form B0 equivalent to :A which exists by 1.2.9. Then by 1.2.7 A ::A :B0 which by the above remark is equivalent to a conjunctive normal form. De nition 1.2.11. Let M be a set of connectives, i.e. names for some xed truth functions. We call M complete if for every sentential form there is an equivalent sentential form only containing connectives from M. So we obtain as another immediate corollary of Theorem 1.2.9: Theorem 1.2.12. f:; ^; _g; f:; _g and f:; ^g are complete sets of connectives. Proof. From Theorem 1.2.9 we see that f:; ^; _g is complete. But according to 1.2.7 we can express ^ by : and _, and _ by : and ^. At this point we have obtained a justi cation for taking only the connectives

Corollary 1.2.10. For any sentential form A there is a conjunctive normal form B

^; _; :; !into the alphabet of a rst order language (cf. De nition 1.1.1) since every other connective can be represented by them.

Exercises

t f t f t t One may think of j as a connective. Prove that fjg is a complete set of connectives. De ne the connectives :; !; _; ^ using only j. E 1.2.2. The binary truth function : ft; f g2 ! ft; f g is given by: t f t f f f f t One may think of as a connective. Is f g complete? E 1.2.3. Let f 1 ; : : : ; ng be a complete set of connectives. Prove or disprove that f : 1 ; : : : ; : n g is complete. E 1.2.4. Prove: f^; _g is not complete. E 1.2.5. Is f^; _; !g a complete set of connectives?

E 1.2.1 (She er stroke). The binary truth function j: ft; f g2 ! ft; f g is given by: j t f

1.3. Semantics for First Order Logic

17

E 1.2.6. A king puts a prisoner to a severe test. He orders the prisoner into a roomwith two doors. Behind each door there may be a tiger or a princess. Choosing the door with the princess the prisoner will be set free. Otherwise he will be torn into pieces by the tiger. Knowing that the prisoner is a logician, the king has mounted two signs to the door. The choice of the room doesn't make any di erence. The princess is in the other room.

He gives the following information to the prisoner:\If the princess is behind the door on the left hand side, the sign at that door is true. If there is the tiger so it's false. With the other door it is just the other way round."

What door should be chosen by the prisoner? a) Formalise the exercise. b) Determine the equivalent disjunctive normal form and use it to help the prisoner to make his decision.

1.3 Semantics for First Order LogicHaving discussed the meaning of the sentential connectives now we can turn to x the meaning of the terms and formulas introduced in section 1.1. The rst step in that direction is to tell the meaning of the non-logical symbols of a rst order language. De nition 1.3.1. An L(C ; F ; P)-structure is given by a quadruple

S = (S; C ; F; P)satisfying the following properties: 1. S is a non-void set. It is called the domain of the structure. 2. It is C = fcS : c 2 Cg S: 3. It is F = ff S : f 2 Fg a set of functions on S such that f S : S n ! S if #f = n: 4. P = fP S : P 2 Pg is a set of predicates on S, i.e. for P 2 P with #P = n it is P S S n : Let's give an easy example: Let LGT = L(CGT ; FGT ; PGT ) be the language of group theory. In an LGT -structure we have interpretations for 1; and =. Thus, if G = (G; 1G; G) is a group we have an LGT -structure with domain G interpreting 1 by 1G which is 1G 2 G by the function G which is G : G2 ! G; (x; y) 7! x G y

18

I. Pure Logic

= by the predicate =G which is f(x; x) : x 2 Gg G2 Thus every group is an LGT -structure. But we also have very strange LGT -structures, e.g. if we have the following structure domain of the structure is IN is interpreted by the function f : IN2 ! IN; (x; y) 7! 2x = is interpreted by the predicate f(x; y) 2 IN2 : x < yg This structure has of course nothing to do with a group. Now we are going to give a meaning to the syntactical material of section 1.1 (i.e. terms and formulas) with respect to a given structure, i.e. with respect to the meaning of the non-logical symbols. In a rst step we are going to assign elements of the domain of the structure to the variables. By an assignment for an L(C ; F ; P )-structure S = (S; C ; F; P) we understand a map : V! S where V denotes the set of object variables. Assignments are denoted by ; ; ; 0 ; ::: Now let L = L(C ; F ; P ) be a rst order language, S = (S; C ; F; P) an L-structure and an assignment for S : By we have interpreted the variables in S : Now we can lift the interpretation to all L-terms. De nition 1.3.2. The value tS ] of an L-term t in the L-structure S with respect to the assignment is de ned by induction on the de nition of the L-terms as follows: 1. If t is the variable x, then tS ] = (x): 2. If t = c, then tS ] = cS : 3. If t = (ft1 : : :tn ); then tS ] = f S (tS ]; : : : ; tS ]): 1 n For an example regard the group G = (Z +) of the integer numbers as an LGT ;0; structure. Take the term t = (1 x) y and an assignment for G with (x) = 5 and (y) = 3: Then it is tG ] = 2: Proposition 1.3.3. Let S be an L-structure and an S -assignment. a) tS ] 2 S for any L-term t. b) If t is an L-term with FV(t) = ;, then tS ] = tS ] for all S -assignments and . In this case we write brie y tS : Proof. We only prove the rst part at full length. This is an induction on the de nition of `t is an L-term'. Therefore we have the following cases:

1.3. Semantics for First Order Logic

19

t=x Then it is tS ] = (x) 2 S by the de nition of : t=c2C Then it is tS ] = cS 2 S by the de nition of cS : t = (ft1 : : :tn ) with L-terms t1 ; : : : ; tn and f 2 F By the de nition of f S it is f S : S n ! S and by the induction hypothesis we have tS ]; : : : ; tS ] 2 S: 1 n So it is tS ] = f S (tS ]; : : : ; tS ]) 2 S: 1 n This nishes the induction. The proof of the second part is an induction on the de nition of `t is an L-term', too. There one need not consider the case t = x because of FV(t) = ;: In the next step we are going to de ne the truth value ValS (F; ) of an L-formula F under an assignment for S . To simplify the de nition we introduce x , 8y 2 V(x 6= y ) (y) = (y)): This means that the assignments and di er at most at the variable x: De nition 1.3.4. We de ne ValS (F; ) by induction on the de nition of the formulas. 1. If F is an atomic formula (Pt1 : : :tn) we put ( S S S ValS (F; ) = t if (t1 ]; : : : ; tn ]) 2 P f otherwise 2. ValS (:F0; ) = : (ValS (F0; )) 3. ValS (F1 ^ F2; ) = ValS (F1 ; ) ^ ValS (F2 ; ) 4. ValS (F1 _ F2; ) = ValS (F1 ; ) _ ValS (F2 ; ) ( 5. ValS (8xF0 (x); ) = t if ValS (F0 ; ) = t for all x f otherwise ( 6. ValS (9xF0 (x); ) = t if ValS (F0 ; ) = t for some x f otherwise Instead of ValS (F; ) = t we commonly write S j= F ]: Thus S 6j= F ] means ValS (F; ) = f: To make clauses 5. and 6. in De nition 1.3.4 better conceivable we are going to prove that both clauses meet the intuitive understanding of the quanti ers 8 and 9 (cf. Lemma 1.3.7). Though more perspicuous this

20

I. Pure Logic

alternative formulation has the disadvantage that it needs a bigger apparatus. We denote by Fx (t) the string which is obtained from the string F by replacing all occurrences of x by t, similar for sx (t): Proposition 1.3.5. If F is an L-formula and t is an L-term such that FV(t) \ BV(F) = ; and x 2 BV(F); = then Fx(t) is a formula with FV(Fx (t)) FV(t) (FV(F) n fxg) and BV(Fx (t)) BV(F): Proof. This is left to the reader as an exercise. From now on we always tacitly assume that FV(t) \ BV(F) = ; and x 2 BV(F) = whenever we write Fx(t). Lemma 1.3.6. Let s; t be L-terms, F an L-formula and S an L-structure. If and are S -assignments such that x and (x) = tS ]; then sS ] = sx (t)S ] and ValS (F; ) = ValS (Fx (t); ): Proof. First we show sS ] = sx(t)S ] by induction on the de nition of s. If s = y 6= x, then sS ] = (y) = (y) because x . If s = x, then sS ] = (x) = tS ] = sx (t)S ]: If s = c 2 C , then sS ] = cS = sx (t)S ]: If s = (fs1 : : :sn ), then by the induction hypothesis sS ] = f S (sS ]; : : : ; sS ]) 1 n S (s1x (t)S ]; : : : ; snx(t)S ]) =f = sx (t)S ]: Next we show ValS (F; ) = ValS (Fx (t); ) by induction on the de nition of F. If F = (Ps1 : : :sn ), then S j= F ] i (sS ]; : : : ; sS ]) 2 P S 1 n which holds i (s1x (t)S ]; : : : ; snx(t)S ]) 2 P S by the rst part. But this means S j= Fx (t) ].

1.3. Semantics for First Order Logic If F = :F0, then

21

ValS (F; ) = : (ValS (F0; )) = : (ValS F0x(t); ) = ValS (:F0x(t); ) where the equation between the second and the third term holds by induction hypothesis. In the following we will indicate this by writing: =i:h: . If F = (F1 F2), where is a connective ^; _; !, then ValS (F; ) = ValS (F1; ) ValS (F2 ; ) =i:h: ValS (F1x(t); ) ValS (F2x(t); ) = ValS (Fx (t); ): If F = 8yG and S j= F ]; then S j= G Thus x 6= y. Let 0 y . Then de ne ( 0 (z) = (z) (z) Theny

0] for all 0 y . We have x 2 BV(F). =

for z 6= x for z = x:

because for z 6= y we have (z) = 0 (z) = (z) = (z) if z 6= x and (z) = (z) for z = x. Thus we have S j= G ]: We have x 0 by de nition and obtain (x) = (x) = tS ] = tS 0 ] since y 0 and y 2 FV(t) because of y 2 BV(F) and FV(t) \ BV(F) = ;: = Hence S j= Gx(t) 0 ] by induction hypothesis. Since 0 was an arbitrary assignment such that 0 y this entails S j= 8yGx (t) ]: For the opposite direction assume S j= 8yGx (t) ]: Let 0 y : De ne ( 0 (z) = (z) for z 6= x (z) for z = x:

22 Theny

I. Pure Logic

because we have for z 6= x (z) = 0(z) = (z) = (z) and (z) = (z) for z = x: Hence S j= Gx(t) ]: But x 0 and 0(x) = (x) = tS ] = tS ] because BV(F) \ FV(t) = ; and y 2 BV(F). Thus S j= G 0] by the induction hypothesis. Since 0 was arbitrary with 0 y this means S j= 8yG: The case that F = 9yG is similar and left to the reader. If S = (S; C ; F; P) is an L-structure we may extend L = L(C ; F ; P ) to a language LS = L(C S; F ; P ) where S = fs : s 2 S g and expand S to an LS -structure SS = (S; C S; F; P) where sSS = s: It is obvious that any SS -assignment is also an S -assignment and vice versa because an assignment only depends on the domain of a structure. Thus LS is obtained from L by giving `names' to the elements of S. Here look at an easy example. Let L = LGT the language of group theory and S = (Z; 0; +) be the group of the integer numbers. Then S is the set fz : z 2 Z g where z is nothing but a new constant symbol. In the expanded structure SS we interpret z (which is thought to be a name for the object z) by the object z: Lemma 1.3.7. Let F be an L-formula and S an L-structure. Then: a) S j= 8xF ] i SS j= Fx(s) ] for all s 2 S: b) S j= 9xF ] i SS j= Fx(s) ] for some s 2 S: Proof. Before we start the proof we make the general observation that S j= 8xF ] i SS j= 8xF ] because F is an L-formula (cf. Exercise E 1.3.5). To show the direction from left to right in a) assume SS j= 8xF ] and choose an arbitrary s 2 S: De ne ( (y) = (y) if y 6= x s if y = x

1.3. Semantics for First Order Logic

23

Then x and (x) = s = sSS ]: Hence SS j= F ] which entails SS j= Fx (s) ] by Lemma 1.3.6. For the opposite direction assume SS j= Fx (s) ] for all s 2 S. Let x and set s = (x): Then SS j= Fx (s) ] which by 1.3.6 entails S j= F ]: Hence S j= 8xF ]: The proof of b) runs analogously. We see from 1.3.7 that we really captured the colloquial meaning of 8 and 9 by De nition 1.3.3. Before we continue to investigate the semantical properties we introduce some frequently used phrases. De nition 1.3.8. Let L be a rst order language and M a set of L-formulas. a) M is satis able in S if there is an S -assignment such that S j= F ] for all F 2 M: b) M is valid in S if S j=F ] for all F 2 M and all S -assignments : c) M is satis able or consistent if there is an L-structure S such that M is satis able in S . d) M is valid if M is valid in every L-structure S . For a formula F we denote by S j= F that F is valid in S and by j= F that F is valid. Let's illustrate this de nition by some examples. Therefore let LGT be the language of group theory and G be a group (which is an LGT -structure, too). M = f(x = 1)g is satis able in G because if we take an G -assignment with (x) = 1G we have G j= x = 1 ]: If G is not a group with only one element M = f(x = 1)g is not valid in G because if we take g 2 G with g 6= 1G and an G -assignment with (x) = g we have

G 6j= x = 1 ]:Now, let AxGT be the set of `axioms' of group theory, i.e. the set of the following formulas

are just the groups. AxGT f8x8y(x y = y x)g is consistent because there are commutative groups. It follows from De nition 1.3.4 that ValS (F; ) only depends on FV(F). Thus we have the following property:

8x8y8z(x (y z) = (x y) z) 8x(x 1 = x) 8x9y(x y = 1) Thus the LGT -structures S interpreting = by f(x; x) : x 2 S g with AxGT valid in S

24

I. Pure Logic

Proposition 1.3.9. Let S be an L-structure, F an L-formula and and S -assignments such that FV(F) = FV(F). Then it isValS (F; ) = ValS (F; ): It follows from 1.3.9 that ValS (F; ) does not depend on if F is a sentence. Thus sentences have a xed truth value in an L-structure S . That is the reason for calling them sentences. For sentences there is no di erence between satis ability and validity with respect to a xed structure. If F is an L-sentence, then F is satis able in an L-structure S i it is valid in S . If F is an L-formula with FV(F) fx1; : : : ; xng and an assignment such that (xi) = ai , then we often write S j= F a1; : : : ; an] instead of S j= F ]: According to 1.3.9 ValS (F; ) is determined by a1 ; : : : ; an: De nition 1.3.10. Let L be a rst order language and F; G L-formulas. We say that F and G are semantically equivalent if ValS (F; ) = ValS (G; ) for any L-structure S and any S -assignment . We denote semantical equivalence by F S G. Lemma 1.3.11. We have F S G i j= (F ! G) ^ (G ! F). Proof. If F S G, then ValS (F; ) = ValS (G; ) for any structure S and any S assignment . According to the truth tables of ! this entails ValS (F ! G; ) = t and ValS (G ! F; ) = t: Hence j= (F ! G) ^ (G ! F): For the opposite direction assume ValS (F; ) 6= ValS (G; ) for some S and S -assignment . If ValS (F; ) = t, then ValS (F ! G; ) = f: Hence 6j= (F ! G) ^ (G ! F): If ValS (F; ) = f; then ValS (G ! F; ) = f and again we get 6j= (F ! G) ^ (G ! F): We de ne the connective $ by F $ G = (F ! G) ^ (G ! F). This means that the truth function interpreting $ is given by this combination of the truth functions for conjunction and implication. Then 1.3.11 reads as F S G i j= F $ G:

1.3. Semantics for First Order Logic

25

Proposition 1.3.12. The semantical equivalence of L-formulas is an equivalence relation on the L-formulas.The following proposition gives a list of semantically equivalent formulas.

formulas involved. Thus a) to f) follow from 1.2.7 just because of the propositional structure of the involved formulas. We are going to study the propositional structure and properties of rst order formulas in the next section. Thus the precise argument will be given by Corollary 1.4.6. Thus all we have to check is g). Assume S j= :9xF ] for some L-structure S and an S -assignment . Then S 6j= 9xF ] which says that there is no S -assignment x such that S j= F ]; i.e. we have S j= :F ] for all S -assignments x : Hence S j= 8x:F ]: If S j= 8x:F ], then S j= :F ] for all x which shows that there is no S -assignment x such that S j= F ]: Hence S j= :9xF: The second part follows from the rst by the computation

Proposition 1.3.13. a) F ^ G S G ^ F; F _ G S G _ F b) :(F ^ G) S :F _ :G; :(F _ G) S :F ^ :G c) :(:F) S F d) (F ^ G) ^ H S F ^ (G ^ H); (F _ G) _ H S F _ (G _ H) e) (F ^ G) _ H S (F _ H) ^ (G _ H); (F _ G) ^ H S (F ^ H) _ (G ^ H) f) F ! G S :F _ G g) :(9xF ) S 8x(:F); :(8xF ) S 9x(:F) Proof. Claims a) to f) do obviously not depend on the quanti er structure of the

:8xF

S

:8x:(:F)

S

:(:9x:F)

S

9x:F:

ExercisesE 1.3.1.a) Prove: j= 8x(F ^ G) ! 8xF ^ 8xG: b) Let L be a rst order language including a constant symbol 0: Determine Lformulas F and G with

6j= 8x(F _ G) ! 8xF _ 8xG:

E 1.3.2. Let L be a rst order language and P a predicate symbol of L: Which of thefollowing formulas are valid? a) (F ! G) ! ((F ! :G) ! :F)

26 b) 8xF ! 9xF c) 8y9xP yx ! 9x8yPyx d) 9xF ^ 9xG ! 9x(F ^ G)

I. Pure Logic

E 1.3.3.

a) Let S be an L-structure. Prove that if S j= G ! F and x 2 FV(G) = so one has S j= G ! 8xF b) Is the condition x 2 FV(G) necessary? Prove your claim. = E 1.3.4. Prove Lemma 1.3.5. Hint: Show rst that for a term s also sx (t) is a term with FV(sx (t)) FV(t) (FV(s) n fxg): Do we have in general FV(Fx(t)) = FV(t) (FV(F) n fxg)? E 1.3.5. Let F be an L-formula, S an L-structure and an S -assignment. Prove: S j= F ] , SS j= F ]:

E 1.3.6.

E 1.3.7.

a) Determine a rst order language LV S suited for talking about a vector space and its eld. Hint: use a binary predicate symbol `='. b) Formulate a theory (a set of sentences) TV S in the language LV S such that for all LV S I-structures S (i.e. LV S -structures interpreting =S by f(s; s) : s 2 S g, cf. also section 1.10) one has: S j= TV S , S consists of a eld and vector space over this eld. c) De ne the LV S I-structure S of the continuous functions over the eld R of the real numbers. d) Determine LV S -formulas F and G such that the following holds in all LV S Istructures S with S j= TV S : 1. s1 ; : : : ; sn 2 S are linear independent , S j= F s1; : : : ; sn]: 2. s1 ; : : : ; sn 2 S form a vector space basis , S j= G s1 ; : : : ; sn ]: a) Let S be an L-structure and an S -assignment. Prove: S j= 9xF ] , S j= 9yFx (y) ] if y 2 FV(F) BV(F): =

1.4. Propositional Properties of First Order Logic

27

b) Is the condition y 2 FV(F) BV(F) in a) necessary? Prove your claim. = ~ c) Let S be an L-structure and a S -assignment. Now let F and F be two Lformulas which are obtained by renaming bounded variables. Prove: ~ S j= F ] , S j= F ]:

E 1.3.8. Let F be an L-sentence with FV(F) = fx1; : : : ; xng. Prove for any Lstructure S it is S j= F , S j= 8x1 : : : 8xnF: E 1.3.9. Let LIN = (0; 1; +; ; -Ax

75

; ? is an axiom, then >-Ax has to be an axiom, too. Another easy consequence is that the Tait-calculus with >-axiom is a conservative extension of the usual Tait-calculus. We will return to conservative extensions later (cf. section 2.1). Therefore we do not give a general de nition of conservative extensions but formulate the result as follows.

Proposition 1.9.12. Assume >-Ax nor ?: Then T :>-Ax

and

does neither contain the symbol >

Proof. The proof of the lemma is straightforward by induction on the de nition of: We are now prepared to formulate the general version of the interpolation theorem.

Theorem 1.9.13. If >-Ax ; , then there is an interpolation formula for : Proof. In case that we have 6 >-Ax

and

and 6 >-Ax we use Proposition 1.9.11, Proposition 1.9.12 and Theorem 1.9.9. Otherwise we either have >-Ax or >-Ax and obtain >-Ax ; ? or >-Ax ; ?; respectively. In the rst case we have the interpolation formula ? because we have >-Ax >; by >-axiom and in the second case we get for the same reasons > as interpolation formula. As a consequence of Theorem 1.9.13 we get the famous theorem by William Craig published in 1957.

Theorem 1.9.14 (Craig's interpolation theorem). If we have j= F ! G, then there is a formula E which interpolates F and G, i.e. we have j= F ! E and j= E ! Gand E contains only predicate symbols which occur both in F and G: The free variables of E also occur as well in F as in G:

Proof. If j= F ! G we get >-Ax F; G by the completeness theorem for the Taitcalculus with >-axiom. By Theorem 1.9.13 there is an interpolation formula E for F; E and >-Ax E; G which yield j= F ! E and F and G, i.e. we have > j= E ! G by the soundness-Ax theorem. In Theorem 1.9.13 we have proved that E has all the other properties stated in the claim. There is a nice application of the interpolation theorem which is due to Abraham Robinson 1918, y1974] in 1956. Theorem 1.9.15 (Joint consistency theorem). Let M1 and M2 be consistent sets of L-sentences. Then M1 M2 is consistent i there is no sentence F such that M1 j= F and M2 j= :F:

76

I. Pure Logic

Proof. If there is a sentence F such that M1 j= F and M2 j= :F it is M1 M2 inconsistent. For the other direction let M1 M2 be inconsistent. By the compactness theorem there are nite subsets N1 M1 and N2 M2 such that N1 N2 is inconsistent. Now let F1 be the conjunction of the sentences in N1 and F2 of those in N2 . Thus we have N1 j= :F2 which is j= F1 ! :F2: By Craig's interpolation theorem there is a sentence E such that j= F1 ! E and j= E ! :F2: But this implies M1 j= E since M1 j= F1 and F1 j= E: But also we have M2 j= :E because F2 j= :E and M2 j= F2 : This is because N1 N2 is inconsistent, which means N2 j= :F1: So we have shown that there is a sentence E such that M1 j= E and M2 j= :E:There is a sharper form of the interpolation theorem due to Roger C. Lyndon in 1959. This theorem also tells something about the form of occurrences of the predicate symbols in the interpolation formula. To formulate the theorem we need the following notion. De nition 1.9.16. We de ne inductively the positive and negative occurrence of a predicate symbol P in an L-formula F: 1. P occurs positively in Pt1 : : :tn : 2. If P occurs positively (negatively) in F, then P occurs negatively (positively) in :F: 3. If P occurs positively (negatively) in F, then P occurs positively (negatively) in (F _ G) and (G _ F): 4. If P occurs positively (negatively) in F, then P occurs positively (negatively) in 9xF: For an L-formula F let us denote its translation into the Tait-language LT by F T : Formally the translation is given inductively by 1. (Pt1 : : :tn )T = Pt1 : : :tn 2. (:F)T = (F T ) (cf. De nition 1.8.2)

3. (F ∨ G)^T = F^T ∨ G^T
4. (∃xF)^T = ∃xF^T.
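As a small worked example (ours, not part of the original notes): for F = ¬(Px ∨ ¬Qx) the translation, computed with the defined negation of Definition 1.8.2, is

% Illustration of the Tait translation and of polarity
F^{T} \;=\; \neg\bigl((Px \lor \neg Qx)^{T}\bigr) \;=\; \neg Px \land Qx

Here P occurs negatively and Q positively in F, and accordingly the negated atom ¬P and the unnegated atom Q occur in F^T — exactly the correspondence recorded in Proposition 1.9.17 below.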


On the other hand any formula in the Tait-language L_T is easily retranslated into an L-formula by translating an occurrence of a negated atom ¬Pt₁…tₙ into the L-formula ¬Pt₁…tₙ. Positive and negative occurrences of predicate symbols are easy to locate in the Tait-language. We have the following observation.

Proposition 1.9.17. P occurs positively (negatively) in F iff P (¬P) occurs in F^T. The proof is an easy exercise.

Theorem 1.9.18 (Lyndon's interpolation theorem). If ⊨ F → G, then there is an interpolation formula E for F and G such that every predicate symbol occurring positively (negatively) in E occurs positively (negatively) in both formulas F and G.

Proof. The proof uses Proposition 1.9.17. By the interpolation theorem for the Tait-calculus with ⊤-axiom we get an interpolation formula E for ¬F^T; G^T. Any predicate symbol P occurring unnegated in E occurs unnegated in F^T and in G^T and thus positively in F and G, and any negated atom ¬P occurring in E occurs as ¬P in F^T and in G^T, i.e. P occurs negatively in F and G. The retranslation of E into an L-formula transfers occurrences of P into positive occurrences of P and occurrences of ¬P into negative ones.

A sometimes useful modification of the interpolation theorem is the following one.

Theorem 1.9.19. If M ⊨ F for some formula set M (not necessarily finite), then there is a formula E with FV(E) ⊆ FV(M) ∩ FV(F) such that M ⊨ E and ⊨ E → F, and every predicate symbol occurring positively (negatively) in E also occurs positively (negatively) in F and in some formula of M.

Proof. By the compactness and the deduction theorem we get ⊨ G₁ ∧ … ∧ Gₙ → F for finitely many formulas {G₁,…,Gₙ} ⊆ M. Application of Lyndon's interpolation theorem proves the theorem.

Theorem 1.9.19 has the consequence that for proving a theorem F from an axiom system Ax we need at most those axioms in Ax which tell something about the predicate symbols occurring in F. There are two nice applications of the interpolation theorem. The first is Evert W. Beth's definability theorem, which is already a consequence of Craig's interpolation theorem.

Theorem 1.9.20 (Beth's definability theorem). We say that a formula F defines an n-ary predicate symbol P implicitly if we have
(1.5) ⊨ F ∧ F_P(Q) → ∀x₁…∀xₙ(Px₁…xₙ ↔ Qx₁…xₙ).
Let P be implicitly defined by F. Then there is a formula G such that FV(G) ⊆ {x₁,…,xₙ},
⊨ F → ∀x₁…∀xₙ(Px₁…xₙ ↔ G)
and P does not occur in G. This means that G defines P explicitly.

Proof. From (1.5) we get
⊨ F ∧ Px₁…xₙ → (F_P(Q) → Qx₁…xₙ).
By Craig's interpolation theorem there is an interpolation formula G such that
(1.6) ⊨ F ∧ Px₁…xₙ → G
and
(1.7) ⊨ G → (F_P(Q) → Qx₁…xₙ).
We have FV(G) ⊆ {x₁,…,xₙ} and neither P nor Q occur in G. Thus we get from (1.7), replacing Q by P,
(1.8) ⊨ G → (F → Px₁…xₙ),
and (1.6) and (1.8) together yield
⊨ F → ∀x₁…∀xₙ(G ↔ Px₁…xₙ).
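As a toy illustration of implicit and explicit definability (ours, in an assumed language with a unary function symbol f and a constant 0):

% A formula that defines P implicitly (n = 1)
F \;:=\; \forall x\,\bigl(Px \leftrightarrow fx = 0\bigr)
% F ∧ F_P(Q) forces ∀x(Px ↔ fx = 0) and ∀x(Qx ↔ fx = 0), hence
% ∀x(Px ↔ Qx), so (1.5) holds; the explicit definition delivered by
% the theorem is G := (fx_1 = 0).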

The second application uses Lyndon's version of the interpolation theorem. It is a theorem about monotone operators. To define monotone operators let L be a first order language and S an L-structure. Let P be a new unary predicate symbol. An L(P)-formula F with FV(F) = {x} defines an operator Γ_F : Pow(S) → Pow(S) by
Γ_F(N) = {s ∈ S : (S, N) ⊨ F[s]}
for N ⊆ S, where (S, N) is the L(P)-expansion of S interpreting P by N ⊆ S. An operator Γ : Pow(S) → Pow(S) is monotone on S if N ⊆ M entails Γ(N) ⊆ Γ(M). Γ_F is globally monotone if it is monotone on every L-structure S.

Lemma 1.9.21. Let F be an L(P)-formula with FV(F) = {x} and at most positive occurrences of P. Then F defines a globally monotone operator.


Proof. It suffices to prove
(1.9) ⊨ ∀x₁…∀xₙ(Px₁…xₙ → Qx₁…xₙ) → (F → F_P(Q))
for P occurring at most positively in F and
(1.10) ⊨ ∀x₁…∀xₙ(Px₁…xₙ → Qx₁…xₙ) → (F_P(Q) → F)
for P occurring at most negatively. We prove (1.9) and (1.10) simultaneously by induction on the length of F.
If F = Pt₁…tₙ, then P occurs positively and we have
⊨ ∀x₁…∀xₙ(Px₁…xₙ → Qx₁…xₙ) → (Pt₁…tₙ → Qt₁…tₙ).
If F = ¬G and P occurs positively (negatively) in F, then P occurs negatively (positively) in G. By the induction hypothesis we have
(1.11) ⊨ ∀x₁…∀xₙ(Px₁…xₙ → Qx₁…xₙ) → (G_P(Q) → G)
or
(1.12) ⊨ ∀x₁…∀xₙ(Px₁…xₙ → Qx₁…xₙ) → (G → G_P(Q)),
respectively. But (1.11) and (1.12) immediately entail (1.9) and (1.10).
If F = G ∨ H and P occurs positively (negatively) in F, then P occurs at most positively (negatively) in both G and H. Thus we have by the induction hypothesis
(1.13) ⊨ ∀x₁…∀xₙ(Px₁…xₙ → Qx₁…xₙ) → (G → G_P(Q))
and
(1.14) ⊨ ∀x₁…∀xₙ(Px₁…xₙ → Qx₁…xₙ) → (H → H_P(Q)),
or we have
(1.15) ⊨ ∀x₁…∀xₙ(Px₁…xₙ → Qx₁…xₙ) → (G_P(Q) → G)
and
(1.16) ⊨ ∀x₁…∀xₙ(Px₁…xₙ → Qx₁…xₙ) → (H_P(Q) → H),
respectively. But (1.13) and (1.14) entail
⊨ ∀x₁…∀xₙ(Px₁…xₙ → Qx₁…xₙ) → (G ∨ H → (G ∨ H)_P(Q))
and (1.15) and (1.16)
⊨ ∀x₁…∀xₙ(Px₁…xₙ → Qx₁…xₙ) → ((G ∨ H)_P(Q) → G ∨ H).
If F = ∃xG and P occurs positively (negatively) in F, then P occurs positively (negatively) in G and we get
(1.17) ⊨ ∀x₁…∀xₙ(Px₁…xₙ → Qx₁…xₙ) → (G → G_P(Q))
or
(1.18) ⊨ ∀x₁…∀xₙ(Px₁…xₙ → Qx₁…xₙ) → (G_P(Q) → G)
by the induction hypothesis. From (1.17) or (1.18), however, we immediately get
⊨ ∀x₁…∀xₙ(Px₁…xₙ → Qx₁…xₙ) → (∃xG → ∃xG_P(Q))
and
⊨ ∀x₁…∀xₙ(Px₁…xₙ → Qx₁…xₙ) → (∃xG_P(Q) → ∃xG).
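Here is a concrete instance (ours, in an assumed language {0, S} of zero and successor, writing Γ_F for the operator defined by F as above):

% A P-positive formula and its monotone operator on (N, 0, S)
F \;=\; \bigl(x = 0 \,\lor\, \exists y\,(x = Sy \land Py)\bigr)
\qquad
\Gamma_F(N) \;=\; \{0\} \cup \{\,n + 1 : n \in N\,\}
% N ⊆ M immediately gives Γ_F(N) ⊆ Γ_F(M), as Lemma 1.9.21 predicts.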

Now we may use Lyndon's interpolation theorem to show that indeed any globally monotone operator is definable by a P-positive formula F.

Theorem 1.9.22. A first order definable operator is globally monotone iff it is definable by some P-positive formula.

Proof. The one direction of the theorem is Lemma 1.9.21. To prove the opposite direction let Γ_F be globally monotone. Then we have
(S, P, Q) ⊨ ∀x₁…∀xₙ(Px₁…xₙ → Qx₁…xₙ) → ∀x(F → F_P(Q))
for any structure S and any expansion (S, P, Q) of S. Thus
⊨ ∀x₁…∀xₙ(Px₁…xₙ → Qx₁…xₙ) ∧ F → F_P(Q).
By Lyndon's interpolation theorem we get an interpolation formula E, i.e.
(1.19) ⊨ ∀x₁…∀xₙ(Px₁…xₙ → Qx₁…xₙ) ∧ F → E
and
(1.20) ⊨ E → F_P(Q).
Since Q only occurs positively in ∀x₁…∀xₙ(Px₁…xₙ → Qx₁…xₙ) ∧ F, it can at most occur positively in E. Thus choosing Q as P in (1.19) and (1.20) we get ⊨ F ↔ E_Q(P), i.e. F is logically equivalent to a formula E_Q(P) which has at most positive occurrences of P.
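Conversely, positivity cannot simply be dropped; a small counterexample of ours, with a negative occurrence of P:

% A P-negative formula whose operator is not monotone
F \;=\; \neg Px
\qquad
\Gamma_F(N) \;=\; S \setminus N
% For a nonempty domain S we have ∅ ⊆ S but Γ_F(∅) = S ⊄ ∅ = Γ_F(S),
% so Γ_F is not globally monotone; by Theorem 1.9.22 it is therefore not
% definable by any P-positive formula either.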

Exercises

E 1.9.1. Let F, G be quantifier free formulas of the first order language L. Let 0 be a constant symbol, f and | | function symbols and < a predicate symbol. Compute a prenex form of the following formulas:
a) ∀xF ↔ ∃xG


b) ∀x(0 < x → ∃y(0 < y ∧ ∀z(|z − x₀| < y → |f(z) − f(x₀)| < x))).

E 1.9.2. Let P, Q, R be binary predicate symbols of the Tait-language L_T. Compute, with respect to the proof of the interpolation theorem for the Tait-calculus, an interpolation formula for ∀x(∃yPxy ∧ ∃yQxy) and ∀x∃y(Pxy ∨ Rxy). Hint: First determine a derivation for ∀x(∃yPxy ∧ ∃yQxy) → ∀x∃y(Pxy ∨ Rxy) in the Tait-calculus.

E 1.9.3. Prove that every monotone operator has a least fixed point, i.e. if Γ is the operator, the least fixed point is a set s with
a) Γ(s) = s
b) ∀t(Γ(t) = t → s ⊆ t).

E 1.9.4. Prove or disprove: For every first order language L containing only finitely many constant and function symbols there is an n ∈ ℕ such that for every L-formula ∃xF with ⊢_T ∃xF there are t₁,…,t_m, m ≤ n, with ⊢_T F_x(t₁) ∨ … ∨ F_x(t_m).

E 1.9.5. Let L(C, F, P) be a first order language with C ∪ F ∪ P finite. Let P, Q be unary predicate symbols not in L, S a finite L-structure and R ⊆ S. We call R S-invariant iff for all isomorphisms φ : S ≅ S it is R = {φ(r) : r ∈ R}. For the notion of an isomorphism cf. Definition 2.2.6. We denote the expansions of an L-structure S′ to L(P) and L(P, Q) by (S′, P^{S′}) and (S′, P^{S′}, Q^{S′}). Now let P^S = R and F be an L(P)-formula such that for all L-structures S′

(S′, P^{S′}) ⊨ F ⟺ (S′, P^{S′}) ≅ (S, P^S).
Prove the following claims:
a) R is S-invariant and (S′, P^{S′}) ≅ (S, P^S) ⇒ P^{S′} is S′-invariant.
b) R is S-invariant ⇒ (S′, P^{S′}, Q^{S′}) ⊨ F ∧ F_P(Q) → ∀x(Px ↔ Qx).
c) R is S-invariant ⟺ there is an L-formula G with FV(G) ⊆ {x₀} such that R = {s ∈ S : S ⊨ G[s]}. Hint: Use Beth's definability theorem.
d) For any set X ⊆ S there is an L-formula G with FV(G) ⊆ {x} such that {φ(s) : s ∈ X and φ : S ≅ S} = {s ∈ S : S ⊨ G[s]}.


E 1.9.6. Prove Proposition 1.9.17.

E 1.9.7. It is possible to strengthen Herbrand's lemma in the following way: If T is a theory with T ⊨ ∃xF, then there are finitely many terms t₁,…,tₙ such that T ⊨ F_x(t₁) ∨ … ∨ F_x(tₙ). Which restrictions have to be made on T and F so that the above strengthening is correct?

E 1.9.8. Do we have in general ⊨ F ↔ F_H, i.e. do we have for all L_H-structures S (which are expansions of L-structures) and all S-assignments Φ that S ⊨ (F ↔ F_H)[Φ]?

1.10 First Order Logic with Identity

The equality of objects in the domain of an L-structure is such a basic property that it should be possible to express it in any logic. Up to now we cannot do that. Therefore we are going to introduce a symbol `=' into the first order language L and call the extended language the `language L with identity', denoted by LI.

Definition 1.10.1. The `standard' interpretation of `=' in any LI-structure S is given by: =^S is the set {(s, s) : s ∈ S}.

Since we don't want to repeat all the work we have done up to now for languages with identity, we try to treat `=' as a common binary predicate constant so as to transfer the results of the previous sections to languages with identity. In order to give `=' the special meaning intended by its standard interpretation, we cannot treat it as an arbitrary binary predicate constant but have to fix its meaning by giving defining axioms for it.

Definition 1.10.2. The following sentences are the defining axioms for `='.
1. ∀x(x = x) (reflexivity)
2. ∀x∀y(x = y → y = x) (symmetry)
3. ∀x∀y∀z(x = y ∧ y = z → x = z) (transitivity)
4. ∀x₁…∀xₙ∀y₁…∀yₙ(x₁ = y₁ ∧ … ∧ xₙ = yₙ → fx₁…xₙ = fy₁…yₙ) for all function symbols f in L (compatibility with functions)
5. ∀x₁…∀xₙ∀y₁…∀yₙ(x₁ = y₁ ∧ … ∧ xₙ = yₙ → (Px₁…xₙ → Py₁…yₙ)) for all predicate symbols P in L (compatibility with relations).
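For concreteness — an instance added here, writing ∘ for the binary operation symbol of the group-theoretic language L_GT used as the standard example in chapter 2 — axiom 4 specialises to

% Compatibility of `=' with the binary function symbol \circ
\forall x_1 \forall x_2 \forall y_1 \forall y_2
  \bigl(x_1 = y_1 \land x_2 = y_2 \;\to\; x_1 \circ x_2 = y_1 \circ y_2\bigr)

and axiom 5 produces one analogous sentence for every predicate symbol of L.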


We call the set of defining axioms for `=' Id. Every axiom of Id is of the form ∀x⃗ F with quantifier free F. We define
Id₀ = {F_{x₁,…,xₙ}(t₁,…,tₙ) : BV(F) = ∅ ∧ ∀x₁…∀xₙ F ∈ Id}
and call Id₀ the open version of the identity axioms. The open version is apparently as good as Id since we have

Proposition 1.10.3. Id₀ ⊨ F entails Id ⊨ F.

Proof. If Id₀ ⊨ F, then we have by compactness and the deduction theorem

⊨ F₁ ∧ … ∧ Fₙ → F for F₁,…,Fₙ ∈ Id₀.
Since ⊨ ∀x⃗ᵢ Fᵢ → Fᵢ for i = 1,…,n this entails

⊨ ∀x⃗₁ F₁ ∧ … ∧ ∀x⃗ₙ Fₙ → F.
Thus Id ⊨ F by the deduction theorem.

In the rest of this section we are going to investigate the possible differences between the two viewpoints:
1. taking `=' as a `logical symbol' and interpreting it standardly in any LI-structure;
2. taking `=' as a `non-logical symbol' (i.e. a symbol belonging to the set P) and interpreting it in L-structures which are models of Id.

The first and easiest observation is made by the following proposition.

Proposition 1.10.4. For any LI-structure S we obviously have S ⊨ Id.

Definition 1.10.5. Let S₁ = (S₁, C₁, F₁, P₁), S₂ = (S₂, C₂, F₂, P₂) be L-structures. We call S₂ epimorphic to S₁ if there is a mapping φ : S₁ → S₂ satisfying the following conditions:
1. φ is onto
2. φ(c^{S₁}) = c^{S₂} for all c ∈ C
3. φ(f^{S₁}(s₁,…,sₙ)) = f^{S₂}(φ(s₁),…,φ(sₙ)) for all f ∈ F
4. (s₁,…,sₙ) ∈ P^{S₁} ⟺ (φ(s₁),…,φ(sₙ)) ∈ P^{S₂} for all P ∈ P.
Mappings satisfying 1.–4. are called epimorphisms.

In mathematics we meet a lot of epimorphisms. E.g. if S₁, S₂ are groups, i.e. special L_GT-structures, then φ : S₁ → S₂ is an epimorphism in the sense of Definition 1.10.5 iff it is a group homomorphism and onto. Here we also want the reader to look at Definitions 2.2.1 and 2.2.6.
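A concrete instance (ours, in an assumed language with one binary function symbol + and one constant 0):

% The residue map is an epimorphism in the sense of Definition 1.10.5
\varphi : (\mathbb{Z}, +, 0) \longrightarrow (\mathbb{Z}/2\mathbb{Z}, +, \bar 0),
\qquad \varphi(n) = n \bmod 2
% φ is onto, φ(0) = 0̄ and φ(m + n) = φ(m) + φ(n), so conditions 1.–3.
% hold; condition 4. is vacuous since there are no predicate symbols.

Interpreting `=' on ℤ as congruence modulo 2 turns this into essentially the situation treated in Theorem 1.10.7 below, whose quotient construction produces ℤ/2ℤ together with this map.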


Proposition 1.10.6. Let φ : S₁ → S₂ be an epimorphism and Φ an S₁-assignment. Then there is an S₂-assignment Φ^φ such that
1. t^{S₂}[Φ^φ] = φ(t^{S₁}[Φ]) for all L-terms t,
2. Val_{S₂}(F, Φ^φ) = Val_{S₁}(F, Φ) for all L-formulas F.

Proof. Put Φ^φ(x) = φ(Φ(x)) and check 1. and 2. by induction on the length of t and F, respectively, as an easy exercise.

Theorem 1.10.7. Let S be an L-structure satisfying Id. Then there is an LI-structure S̄ epimorphic to S, i.e. a structure which interprets `=' standardly.

Proof. Let S = (S, C, F, P). We define S̄ = (S̄, C̄, F̄, P̄) as follows:
[a] = {b ∈ S : a =^S b} for a ∈ S,
S̄ = {[a] : a ∈ S},
c^{S̄} = [c^S] for c ∈ C,
f^{S̄}([s₁],…,[sₙ]) = [f^S(s₁,…,sₙ)] for f ∈ F, and
P^{S̄} = {([a₁],…,[aₙ]) : (a₁,…,aₙ) ∈ P^S} for P ∈ P.
Now it is easy to show that
1. f^{S̄} and P^{S̄} are well defined,
2. φ : S → S̄, φ(a) = [a], is an epimorphism,
3. S̄ interprets `=' standardly.
The proofs are straightforward and left as an exercise.

In 1.10.4 we have seen that the standard interpretation of `=' is always a model of Id. Combining this with Theorem 1.10.7 we see that there is no essential difference between our two standpoints mentioned at the beginning of this section. Therefore it should not be difficult to transfer the results on pure logic to logic with identity. First we get:

Theorem 1.10.8 (Compactness theorem for logic with identity). Let M be a set of LI-formulas such that every finite subset of M is LI-consistent. Then M is LI-consistent.

Proof. Let M₀ ⊆ M be a finite subset of M. Then there is an LI-structure S and an S-assignment Φ such that S ⊨ F[Φ] for any F ∈ M₀. Since S ⊨ Id we see that M ∪ Id is finitely consistent. By the compactness theorem for pure logic we therefore get an L-structure S′ and an S′-assignment Φ′ with S′ ⊨ F[Φ′] for all F ∈ M ∪ Id, and by 1.10.7 and 1.10.6 S′ and Φ′ can be boiled down to an LI-structure S̃ and an S̃-assignment Φ̃ satisfying M.

Let us denote by M ⊨_Id F that S ⊨ M[Φ] ⇒ S ⊨ F[Φ] holds for any LI-structure S and S-assignment Φ. Then we have


Theorem 1.10.9. M ⊨_Id F iff M ∪ Id ⊨ F.

Proof. We have M ⊨_Id F iff M ∪ {¬F} is LI-inconsistent, i.e. there is no LI-structure S and no S-assignment Φ satisfying the formulas in M ∪ {¬F}. But then M ∪ Id ∪ {¬F} is L-inconsistent, because any L-structure S and any S-assignment Φ satisfying S ⊨ M ∪ Id ∪ {¬F}[Φ] could be boiled down by 1.10.7 to an LI-structure and a corresponding assignment without changing the truth values of the formulas. On the other hand, if M ∪ {¬F} is LI-consistent, then M ∪ Id ∪ {¬F} is L-consistent by 1.10.4.

Theorems 1.10.8 and 1.10.9 tell us that there is no essential difference between the two standpoints: regarding `=' as a logical or as an additional `non-logical' symbol. Therefore we are going to count `=' among the logical symbols. Since we have the compactness theorem for this language, all `model theoretic' properties are transferred. By Theorem 1.10.9, however, we see that there is also a calculus producing the valid formulas of first order logic with identity. We have ⊨_Id F iff Id ⊨ F, and all we have to do is to use a calculus for pure first order logic, e.g. the Hilbert-calculus introduced in section 1.7, and augment its axioms by the set Id. A bit more delicate to transfer are the results obtained by inspecting the Tait-calculus. A priori we cannot be sure that there is also a cut free calculus for first order logic with identity. We are going to check this in the next section.
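Before the exercises, a small illustration of Theorem 1.10.9 (ours, assuming a unary function symbol f in L):

% ⊨_Id versus Id ⊨ on a toy example
M = \{\forall x \forall y\, (x = y)\}, \qquad F = \forall x \forall y\,(fx = fy)
% M ⊨_Id F, since in an LI-structure M forces a one-element domain;
% equivalently M ∪ Id ⊨ F, using the compatibility axiom for f.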

Exercises

E 1.10.1.
a) Prove: Id ⊨ s₁ = t₁ ∧ … ∧ sₙ = tₙ → t_{x₁,…,xₙ}(s₁,…,sₙ) = t_{x₁,…,xₙ}(t₁,…,tₙ).

b) Prove: Id ⊨ s₁ = t₁ ∧ … ∧ sₙ = tₙ → (F_{x₁,…,xₙ}(s₁,…,sₙ) ↔ F_{x₁,…,xₙ}(t₁,…,tₙ)).

E 1.10.2. Prove Proposition 1.10.6. E 1.10.3. Prove Theorem 1.10.7.

1.11 A Tait-Calculus for First Order Logic with Identity

Since we have ⊨_Id F ⇒ Id ⊨ F by 1.10.9, it is obvious that for any formula F with ⊨_Id F we get
⊢_T ¬F₁,…,¬Fₙ, F
for finitely many instances F₁,…,Fₙ of axioms in Id. Thus adding axioms ⊢_T Δ, G for G ∈ Id and the cut rule would immediately give us ⊢_T F. All the results we got


by the Tait-calculus, however, depended heavily on the fact that the calculus was cut free. Therefore we are going to try to cook up a cut free calculus also for predicate logic with identity. In the Tait-language we also have the symbol ≠ for inequality.

Definition 1.11.1. We define the calculus ⊢_Id as follows. We augment the axioms of the Tait-calculus by the additional axiom

⊢_Id Δ, t = t for any L-term t (Id-axiom)
and the rules by the identity rule:
⊢_Id Δ, sᵢ = tᵢ for i = 1,…,n and ⊢_Id Δ, P_{x₁,…,xₙ}(t₁,…,tₙ) imply
⊢_Id Δ, P_{x₁,…,xₙ}(s₁,…,sₙ),
where P denotes an atomic formula of the Tait-language for LI (Id-rule). The other axioms and rules are those of the ordinary Tait-calculus (cf. Definition 1.8.4).
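To see the Id-rule in action, here is a small derivation (ours, not from the text) of the sequent s ≠ t, t = s, i.e. of the open symmetry axiom, taking n = 1, the atomic formula P(x₁) := (t = x₁), s₁ := s and t₁ := t:

% A sample derivation in the calculus ⊢_Id
\[
\begin{array}{ll}
(1) & \vdash_{Id}\; s \neq t,\; s = t \quad \mbox{(L-axiom)}\\
(2) & \vdash_{Id}\; s \neq t,\; t = t \quad \mbox{(Id-axiom)}\\
(3) & \vdash_{Id}\; s \neq t,\; t = s \quad \mbox{(Id-rule from (1) and (2))}
\end{array}
\]

In step (3) the rule replaces t₁ = t in P(t₁), i.e. in t = t, by s₁ = s, which is licensed by the premise s₁ = t₁, i.e. s = t.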

Proposition 1.11.2. If ⊢_Id Δ, then ¬Δ is LI-inconsistent.

Proof. The proof is essentially that of 1.8.6, with the additional clauses that we have ⊢_Id Δ by an Id-axiom or an Id-rule. In the case of an Id-axiom we have (t = t) ∈ Δ. Hence (t ≠ t) ∈ ¬Δ and ¬Δ is LI-inconsistent. In the case of an Id-rule we have by the induction hypothesis the LI-inconsistency of ¬Δ ∪ {sᵢ ≠ tᵢ} for i = 1,…,n and of ¬Δ ∪ {¬P_{x₁,…,xₙ}(t₁,…,tₙ)}. Let S be an LI-structure and Φ an S-assignment. We have to show that
S ⊭ (¬Δ ∪ {¬P_{x₁,…,xₙ}(s₁,…,sₙ)})[Φ].
If S ⊨ ¬Δ[Φ], then S ⊨ (sᵢ = tᵢ)[Φ] for i = 1,…,n and
S ⊨ P_{x₁,…,xₙ}(t₁,…,tₙ)[Φ].
This immediately entails
S ⊨ P_{x₁,…,xₙ}(s₁,…,sₙ)[Φ]
since S is an LI-structure.

Corollary 1.11.3. ⊢_Id F entails ⊨_Id F and Id ⊨ F.

Proof. By 1.11.2 we first get ⊨_Id F and this by 1.10.9 entails Id ⊨ F.

Our next aim is the proof of the opposite direction of 1.11.3. To prepare this we need the following lemma.


Lemma 1.11.4. Assume ⊢_Id Δ, P and ⊢_Id Δ, ¬P for an atomic formula P. Then ⊢_Id Δ.

Proof. Without loss of generality we may assume that P is not an equation. We show the lemma by induction on the length of the derivation of ⊢_Id Δ, P. If ¬P ∈ Δ we get ⊢_Id Δ from ⊢_Id Δ, ¬P by the structural rule and are done. Thus assume ¬P ∉ Δ. Then ⊢_Id Δ, P cannot be an L-axiom (if Δ itself is an L-axiom we are done anyway). If it is an Id-axiom, then (t = t) ∈ Δ for some L-term t because P is not an equation. Then ⊢_Id Δ is an Id-axiom, too. If the last inference is
⊢_Id Δᵢ, P ⇒ ⊢_Id Δ, P for i = 0 or i = 0, 1,
then we get ⊢_Id Δᵢ by the induction hypothesis and obtain ⊢_Id Δ by the same inference. Thus the only non-trivial case is that the main formula of the last inference is P. Since P is atomic this can only be an inference according to the Id-rule. But then P = P_{x₁,…,xₙ}(s₁,…,sₙ) and we have the premises
(1.21) ⊢_Id Δ, s₁ = t₁, …, ⊢_Id Δ, sₙ = tₙ
and
(1.22) ⊢_Id Δ, P_{x₁,…,xₙ}(t₁,…,tₙ).
We have the Id-axioms
(1.23) ⊢_Id Δ, t₁ = t₁, …, ⊢_Id Δ, tₙ = tₙ
and obtain from (1.21) and (1.23)
(1.24) ⊢_Id Δ, t₁ = s₁, …, ⊢_Id Δ, tₙ = sₙ
by applications of the Id-rule. From (1.24) and the hypothesis ⊢_Id Δ, ¬P we get
(1.25) ⊢_Id Δ, ¬P_{x₁,…,xₙ}(t₁,…,tₙ)
by another application of the Id-rule. But (1.25) together with (1.22) yield ⊢_Id Δ by the induction hypothesis.

For the next lemma we introduce the set
Idᵗ = {∀x_{k+1} … ∀xₙ F_{x₁,…,x_k}(t₁,…,t_k) : F ∈ Id₀ and t₁,…,t_k are arbitrary L-terms}.


Lemma 1.11.5. Assume Δ′ ⊆ Idᵗ and ⊢_T ¬Δ′, Δ. Then ⊢_Id Δ.

Proof. We induct on the length of the derivation of ⊢_T ¬Δ′, Δ. If ¬Δ′, Δ is an L-axiom, then either Δ already is an L-axiom or Δ contains an equation t = t for some L-term t such that (t ≠ t) ∈ ¬Δ′ (these are the only atomic formulas occurring in ¬Δ′). In the first case we get ⊢_Id Δ as an L-axiom, in the second as an Id-axiom. If the last inference is
⊢_T ¬Δ′, Δᵢ ⇒ ⊢_T ¬Δ′, Δ for i = 0 or i ∈ {0, 1},
then we get ⊢_Id Δᵢ by the induction hypothesis and deduce ⊢_Id Δ by the same inference, which is possible because the Tait-calculus with identity comprises the pure Tait-calculus. Thus the crucial cases are those in which the main formula of the last inference belongs to the set ¬Δ′, i.e. it is of the form ¬F for some F ∈ Δ′. There we have the following sub-cases.

1. F = ∀xG. Then we have the premise ⊢_T ¬Δ′, Δ, ¬G_x(t). But if ∀xG ∈ Idᵗ we also have G_x(t) ∈ Idᵗ and obtain ⊢_Id Δ by the induction hypothesis.

2. F = (s = t → t = s). Then ¬F = (s = t ∧ t ≠ s) and we have the premises ⊢_T ¬Δ′, Δ, s = t and ⊢_T ¬Δ′, Δ, t ≠ s. By the induction hypothesis we get
⊢_Id Δ, s = t and ⊢_Id Δ, t ≠ s.
From these we obtain ⊢_Id Δ, s ≠ s by an application of the Id-rule. Using an Id-axiom ⊢_Id Δ, s = s we get ⊢_Id Δ by Lemma 1.11.4.

3. F = (s = t ∧ t = r → s = r). Then ¬F = (s = t ∧ t = r) ∧ s ≠ r and we have the premises
⊢_T ¬Δ′, Δ, s = t ∧ t = r and ⊢_T ¬Δ′, Δ, s ≠ r,
which by ∧-inversion (cf. the exercises) entail that there are derivations, not longer than that of ⊢_T ¬Δ′, Δ, s = t ∧ t = r, of
⊢_T ¬Δ′, Δ, s = t and ⊢_T ¬Δ′, Δ, t = r.
By the induction hypothesis these yield
(1.26) ⊢_Id Δ, s = t,
(1.27) ⊢_Id Δ, t = r
and
(1.28) ⊢_Id Δ, s ≠ r.

From (1.26) and (1.27) we get ⊢_Id Δ, s = r by the Id-rule. This together with (1.28) entails ⊢_Id Δ by Lemma 1.11.4.

4. F = (s₁ = t₁ ∧ … ∧ sₙ = tₙ → fs₁…sₙ = ft₁…tₙ). Then we have the premises
(1.29) ⊢_T ¬Δ′, Δ, s₁ = t₁, …, ⊢_T ¬Δ′, Δ, sₙ = tₙ
and
(1.30) ⊢_T ¬Δ′, Δ, fs₁…sₙ ≠ ft₁…tₙ

(where we tacitly use ∧-inversion to get (1.29)). By the induction hypothesis these yield
(1.31) ⊢_Id Δ, s₁ = t₁, …, ⊢_Id Δ, sₙ = tₙ
and
(1.32) ⊢_Id Δ, fs₁…sₙ ≠ ft₁…tₙ.
From (1.31) and the Id-axiom
(1.33) ⊢_Id Δ, ft₁…tₙ = ft₁…tₙ
we get
(1.34) ⊢_Id Δ, fs₁…sₙ = ft₁…tₙ
by the Id-rule. From (1.32) and (1.34) we get
(1.35) ⊢_Id Δ
by Lemma 1.11.4.


5. F = (s₁ = t₁ ∧ … ∧ sₙ = tₙ → (Ps₁…sₙ → Pt₁…tₙ)), i.e. ¬F = (s₁ = t₁ ∧ … ∧ sₙ = tₙ ∧ Ps₁…sₙ ∧ ¬Pt₁…tₙ). Then we have the premises (again using ∧-inversion)
(1.36) ⊢_T ¬Δ′, Δ, s₁ = t₁, …, ⊢_T ¬Δ′, Δ, sₙ = tₙ,
(1.37) ⊢_T ¬Δ′, Δ, Ps₁…sₙ
and
(1.38) ⊢_T ¬Δ′, Δ, ¬Pt₁…tₙ.
We apply the induction hypothesis to (1.36), (1.37) and (1.38) and get, by an application of the Id-rule, ⊢_Id Δ, Ps₁…sₙ and ⊢_Id Δ, ¬Ps₁…sₙ, which again by Lemma 1.11.4 entails ⊢_Id Δ.

Theorem 1.11.6. We have ⊨_Id F iff ⊢_Id F.

Proof. The direction from right to left is Corollary 1.11.3. Thus assume ⊨_Id F. Then by 1.10.9 we have Id ⊨ F, which by the compactness and deduction theorem entails ⊨ F₁ ∧ … ∧ Fₙ → F for formulas F₁,…,Fₙ ∈ Id ∪ Idᵗ. By the completeness theorem for the Tait-calculus this yields ⊢_T ¬F₁,…,¬Fₙ, F and by 1.11.5 we finally get ⊢_Id F.

Lemma 1.11.7 (Herbrand's lemma for logic with identity). If F is an ∃-formula with ⊢_Id ∃xF, then there are finitely many L-terms t₁,…,tₙ such that
⊢_Id F_x(t₁) ∨ … ∨ F_x(tₙ).

Proof. Assume ⊢_Id ∃xF. Then
⊢_T ¬F₁,…,¬Fₙ, ∃xF for {F₁,…,Fₙ} ⊆ Id
by 1.10.9, the deduction theorem and the completeness theorem for the Tait-calculus. All formulas in Id are ∀-formulas. Hence ¬F₁,…,¬Fₙ are ∃-formulas and we may apply Lemma 1.9.2 to get finitely many terms t₁,…,tₙ such that
⊢_T ¬F₁,…,¬Fₙ, F_x(t₁),…,F_x(tₙ).
This together with Lemma 1.11.5 entails ⊢_Id F_x(t₁) ∨ … ∨ F_x(tₙ).

Lemma 1.11.7 entails Herbrand's theorem for predicate logic with identity in the same way as Lemma 1.9.2 did it in the case of pure logic. All the proofs, including the construction of the prenex form of a formula, can be literally transferred.
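A minimal illustration of Lemma 1.11.7 (ours, assuming the language contains a constant symbol c):

% One witnessing term suffices here
\vdash_{Id} \exists x\,(x = c)
% since ⊢_Id c = c is already an Id-axiom and the ∃-rule yields the
% derivation; thus F_x(t_1) with t_1 = c is the required disjunction.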

Theorem 1.11.8 (Herbrand's theorem for logic with identity). If F is an L-sentence and F_H = ∃x₁…∃x_k G, then we have ⊢_Id F iff there are finitely many k-tuples
(t^1_1,…,t^1_k), …, (t^n_1,…,t^n_k)
of L_H-terms such that
⊢_Id G_{x₁,…,x_k}(t^1_1,…,t^1_k) ∨ … ∨ G_{x₁,…,x_k}(t^n_1,…,t^n_k).

Another application of Lemma 1.11.5 is the interpolation theorem for predicate logic with identity. In a language with identity we can even avoid the addition of the empty formula, which was needed to state the interpolation theorem for pure logic in a general setting (cf. Theorem 1.9.13). We observe that the formula t = t for closed terms t behaves like ⊤ and dually t ≠ t like ⊥: we have
(1.39) ⊢_Id Δ, t = t
for any formula set Δ by an Id-axiom, and
⊢_Id Δ, t ≠ t ⇒ ⊢_Id Δ
by (1.39) and Lemma 1.11.4. Thus we get the general interpolation theorem for the calculus ⊢_Id. Of course, we can no longer count `=' among the non-logical symbols, because t ≠ t, in the role of the empty formula, must not contain a non-logical predicate symbol.

Theorem 1.11.9 (Interpolation theorem for logic with identity). If we have ⊨_Id F → G, then there is an interpolation formula E for F and G, i.e. we have ⊨_Id F → E and ⊨_Id E → G, and E contains at most those non-logical predicate constants positively (negatively) which occur simultaneously positively (negatively) in F and G.

Proof. By the above remark and Theorem 1.11.6 it is just the same as the proof of 1.9.14, with a slight modification of the proof of Theorem 1.9.9 for the calculus ⊢_Id in the axiom case and for the Id-rule.

We will now leave these more syntactical investigations and turn to the fundamentals of model theory.


Chapter 2

Fundamentals of Model Theory

The objects of interest in model theory are structures and classes of structures. Here connections to first order logic will be studied. In particular it is analysed whether and how structures and classes of structures can be described within first order logic. In this chapter we will only deal with predicate logic with identity, so there will be no need to emphasise that; we just write ⊨ instead of ⊨_Id etc.

2.1 Conservative Extensions and Extensions by Definitions

Definition 2.1.1. Let L be a first order language. An L-theory is a set of L-sentences. L-theories are denoted by T, T₀, … If F is an L-formula and T a theory such that T ⊨ F, then F is called provable in T. We call the sentences provable in T theorems of T. The language of a theory T is L(T) = L(C_T, F_T, P_T) where C_T, F_T and P_T are the sets of constant, function and predicate symbols which occur in sentences of T. A structure S which satisfies all sentences of T is called a T-model.

Here we give our standard example. Let L_GT be the language of group theory (cf. section 1.1). Then Ax_GT – the set of the following formulas

∀x∀y∀z(x ∘ (y ∘ z) = (x ∘ y) ∘ z)
∀x(x ∘ 1 = x)
∀x∃y(x ∘ y = 1)
– is an L_GT-theory. An Ax_GT-model is just a group. Because in every group G we have G ⊨ ∀x(1 ∘ x = x), the sentence ∀x(1 ∘ x = x) is a theorem of Ax_GT. As there are also non-commutative groups,

∀x∀y(x ∘ y = y ∘ x)
is not a theorem of Ax_GT.


Definition 2.1.2. Assume that L ⊆ L′ are first order languages.
a) An L′-theory T′ is an extension of an L-theory T if every T-theorem is also a T′-theorem, i.e. T ⊨ F ⇒ T′ ⊨ F.
b) An extension T′ of T is conservative if for all L-sentences F we also have T′ ⊨ F ⇒ T ⊨ F.

Now think of the language L_GT ⊆ L_GT1 of section 1.5. In L_GT1 we have a unary function symbol ⁻¹ for the inverse function. That is, we set Ax_GT1 = Ax_GT ∪ {∀x(x ∘ x⁻¹ = 1)}. Our intuitive understanding of groups may say that Ax_GT1 is a conservative extension of Ax_GT, i.e. Ax_GT1 doesn't prove more L_GT-sentences than Ax_GT does. That this impression is correct can be obtained by the following result.

Theorem 2.1.3. Assume that T and T′ are theories such that every T-model expands to a T′-model. Then T′ is conservative over T.

Proof. Let L = L(T) and L′ = L(T′). Let F be an L-sentence such that T′ ⊨ F. If S is a T-model, expand it to a T′-model S′. Then S′ ⊨ F. But S is the L-retract of S′ and F is in L. Thus S ⊨ F. Hence T ⊨ F.

We have already seen an example of a conservative extension, namely LI and L: since any L-structure expands to an LI-structure satisfying Id, we re-obtain the result that predicate logic with identity is a conservative extension of pure logic. The above example of the inverse function in group theory gives reason for the following question: in mathematics it is usual to define new functions and relations and prove theorems using those definitions. Usually we assume that there is no difference whether we use those new definitions or not. By the following definition we make precise what we usually do in mathematics, and the next theorem justifies mathematical practice.

Definition 2.1.4. Let T be a theory with language L = L(T). A language L ⊆ L′ = L(C′, F′, P′) is called an extension of T by definitions if the following conditions are satisfied:
1. For every c ∈ C′ \ C there is an L(T)-formula F^c such that FV(F^c) = {x} and T ⊨ ∃x(F^c ∧ ∀y(F^c_x(y) → y = x)), briefly denoted by T ⊨ ∃!xF^c(x).
2. For every f ∈ F′ \ F with #f = n there is an L(T)-formula F^f such that FV(F^f) = {x₁,…,xₙ, y} and T ⊨ ∀x₁…∀xₙ∃!yF^f.
3. For every P ∈ P′ \ P with #P = n there is an L(T)-formula F^P such that FV(F^P) = {x₁,…,xₙ}.
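For instance (our sketch, writing F^inv for the defining formula of the new function symbol ⁻¹ in the group-theoretic example above), condition 2 of Definition 2.1.4 is met because inverses in a group are unique:

% Defining formula for the inverse function
F^{\mathrm{inv}} \;:=\; (x_1 \circ y = 1),
\qquad
\mathrm{Ax}_{GT} \models \forall x_1 \exists!\, y\, F^{\mathrm{inv}}
% every Ax_GT-model is a group, and in a group each element has exactly
% one inverse, so existence and uniqueness hold.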


If L′ is an extension by definitions of a theory T, then T(L′) is the theory which contains the following sentences:
1. all sentences in T, i.e. T ⊆ T(L′),
2. all sentences F^c_x(c) for c ∈ C′ \ C,
3. all sentences ∀x₁…∀xₙ F^f_y(fx₁…xₙ) for f ∈ F′ \ F,
4. all sentences ∀x₁…∀xₙ(Px₁…xₙ ↔ F^P) for P ∈ P′ \ P.
We have seen L_GT1 to be an extension of Ax_GT by definitions.

Theorem 2.1.5. If L′ is an extension of T by definitions, then T(L′) is conservative over T.

Proof. Let S = (S, C, F, P) be a T-model. We have to expand S to a T(L′)-model S′. To obtain an L′-structure we first have to interpret the symbols in L′ which do not belong to L.
1. If c ∈ C′ \ C, then there is a formula F^c such that T ⊨ ∃!xF^c. Because of S ⊨ T we also have S ⊨ ∃!xF^c, which entails that there is a uniquely determined element s ∈ S such that S ⊨ F^c[s]. We put c^{S′} = s. Then we obviously get
(2.1) S′ ⊨ F^c_x(c).
2. Let f ∈ F′ \ F and #f = n. Then there is an L(T)-formula F^f such that
(2.2) S ⊨ ∀x₁…∀xₙ∃!yF^f.
We define f^{S′}(s₁,…,sₙ) = t if S ⊨ F^f[s₁,…,sₙ, t]. This defines a function because for arbitrary s₁,…,sₙ ∈ S there is exactly one t such that S ⊨ F^f[s₁,…,sₙ, t] by (2.2). Then we obviously have
(2.3) S′ ⊨ ∀x₁…∀xₙ F^f_y(fx₁…xₙ).
3. For P ∈ P′ \ P with #P = n we have an L(T)-formula F^P such that FV(F^P) = {x₁,…,xₙ}. Define
P^{S′} = {(s₁,…,sₙ) ∈ Sⁿ : S ⊨ F^P[s₁,…,sₙ]}.
Then we get
(2.4) S′ ⊨ ∀x₁…∀xₙ(Px₁…xₙ ↔ F^P).
From (2.1), (2.3) and (2.4) we get that S′ is a T(L′)-model. Thus by 2.1.3 T(L′) is a conservative extension of T.

Now we can strengthen Theorem 2.1.5 in such a way that we have, for every formula of the extension by definitions L′ of a theory T, an equivalent L(T)-formula, i.e. it is possible to replace the newly defined symbols by their definitions.
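Before turning to that strengthening, note how these definitions play out in the group example (our observation, with the notation introduced above): for the new symbol ⁻¹ clause 3 adds exactly one sentence,

% The sentence contributed by clause 3 for the inverse function
\forall x_1\,\bigl(x_1 \circ x_1^{-1} = 1\bigr)

so Ax_GT(L_GT1) coincides with the theory Ax_GT1 considered earlier, and Theorem 2.1.5 confirms that it is conservative over Ax_GT.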



Theorem 2.1.6. Let L′ be an extension of T by definitions