CS 416: Artificial Intelligence
Lecture 10
Logic
Chapters 7 and 8
Homework Assignment
First homework assignment is out
Due October 20th
Midterm
October 25th
Up through chapter 9 (excluding chapter 5)
Review
Store information in a knowledge base
• Backus-Naur Form (BNF)
• Use equivalences to isolate what you want
• Use inference to relate sentences
– Modus Ponens
– And-Elimination
Constructing a proof
Proving is like searching
• Find a sequence of logical inference rules that leads to the desired result
• Note the explosion of propositions
– Good proof methods ignore the countless irrelevant propositions
The two inference rules are sound but not complete!
Completeness
Recall that a complete inference algorithm is one that can derive any sentence that is entailed
Resolution is a single inference rule
• Guarantees the ability to derive any sentence that is entailed
– i.e., it is complete
• It must be partnered with a complete search algorithm
Resolution
Unit Resolution Inference Rule
• If m and l_i are complementary literals
Resolution Inference Rule
Also works with clauses
But make sure each literal appears only once
We really just want the resolution to return A
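As a concrete illustration (my sketch, not from the slides), clauses can be represented as Python sets of literal strings; resolving (A V B) with (A V ~B) then returns just A, because sets automatically collapse the duplicate literal, matching the "each literal appears only once" caveat above.

```python
def negate(lit):
    """Complement a literal: A <-> ~A."""
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolve(c1, c2):
    """Return all resolvents of two clauses (frozensets of literal strings).

    For each pair of complementary literals, drop the pair and union
    what remains of the two clauses.
    """
    resolvents = []
    for lit in c1:
        if negate(lit) in c2:
            resolvents.append(frozenset((c1 - {lit}) | (c2 - {negate(lit)})))
    return resolvents

# (A V B) resolved with (A V ~B): the B / ~B pair cancels, leaving A
print(resolve(frozenset({"A", "B"}), frozenset({"~B", "A"})))  # [frozenset({'A'})]
```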
Resolution and completeness
Any complete search algorithm, applying only the resolution rule, can derive any conclusion entailed by any knowledge base in propositional logic
• More specifically, refutation completeness
– Able to confirm or refute any sentence
– Unable to enumerate all true sentences
What about “and” clauses?
Resolution only applies to “or” clauses (disjunctions)
• Every sentence of propositional logic can be transformed to a logically equivalent conjunction of disjunctions of literals
Conjunctive Normal Form (CNF)
• A sentence expressed as a conjunction of disjunctions of literals
CNF
An algorithm for resolution
We wish to prove KB entails α
• That is, will some sequence of inferences create new sentences in the knowledge base equal to α?
• If some sequence of inferences created ~α, we would know the knowledge base could not also entail α
– α ^ ~α is impossible
An algorithm for resolution
To show KB entails α
• Must show (KB ^ ~α) is unsatisfiable
– No possible way for KB to also entail (not α)
– Proof by contradiction
An algorithm for resolution
Algorithm
• (KB ^ ~α) is put in CNF
• Each pair with complementary literals is resolved to produce a new clause, which is added to the KB (if novel)
– Cease if there are no new clauses to add (α is not entailed)
– Cease if the resolution rule derives the empty clause
If resolution generates α ^ ~α = empty clause (α is entailed)
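The refutation procedure above can be sketched in Python (an illustrative toy implementation, not lecture code; clauses are frozensets of literal strings and the tiny example KB is made up):

```python
def negate(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolve(c1, c2):
    """All resolvents of two clauses."""
    out = set()
    for lit in c1:
        if negate(lit) in c2:
            out.add(frozenset((c1 - {lit}) | (c2 - {negate(lit)})))
    return out

def pl_resolution(kb_clauses, negated_query_clauses):
    """Does KB entail alpha?  Put KB ^ ~alpha together and resolve to a
    fixed point; alpha is entailed iff the empty clause appears."""
    clauses = set(kb_clauses) | set(negated_query_clauses)
    while True:
        new = set()
        for a in clauses:
            for b in clauses:
                if a == b:
                    continue
                for r in resolve(a, b):
                    if not r:            # empty clause: contradiction found
                        return True
                    new.add(r)
        if new <= clauses:               # no novel clauses: alpha not entailed
            return False
        clauses |= new

# KB: A, (A => B) written as (~A V B).  Query alpha = B, so add ~B.
kb = [frozenset({"A"}), frozenset({"~A", "B"})]
print(pl_resolution(kb, [frozenset({"~B"})]))  # True
```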
Example of resolution
Proof that there is not a pit in P1,2: ~P1,2
• KB ^ P1,2 leads to empty clause
• Therefore ~P1,2 is true
Formal Algorithm
Horn Clauses
Horn Clause
• Disjunction of literals where at most one is positive
– (~a V ~b V ~c V d)
– (~a V b V c V ~d) is not a Horn Clause
Horn Clauses
Why?
• Every Horn clause can be written as an implication:
– (a conjunction of positive literals) => (single positive literal)
– Inference algorithms can use forward chaining and backward chaining
– Entailment computation is linear in size of KB
Horn Clauses
Can be written as a special implication
• (~a V ~b V c) becomes (a ^ b) => c
– (~a V ~b V c) == (~(a ^ b) V c) … de Morgan
– (~(a ^ b) V c) == ((a ^ b) => c) … implication elimination
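A small Python sketch (my illustration, not lecture material) of that rewrite, treating a Horn clause as a set of literal strings and splitting it into implication form:

```python
def horn_to_implication(clause):
    """Rewrite a Horn clause (at most one positive literal) as a pair
    (premises, conclusion): premises are the negative literals with the
    negation stripped, the conclusion is the single positive literal
    (None if the clause has no positive literal)."""
    positives = [l for l in clause if not l.startswith("~")]
    assert len(positives) <= 1, "not a Horn clause"
    premises = sorted(l[1:] for l in clause if l.startswith("~"))
    conclusion = positives[0] if positives else None
    return premises, conclusion

# (~a V ~b V c)  becomes  (a ^ b) => c
prem, conc = horn_to_implication({"~a", "~b", "c"})
print(prem, conc)  # ['a', 'b'] c
```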
Forward Chaining
Does KB (Horn clauses) entail q (single symbol)?
• Keep track of known facts (positive literals)
• If the premises of an implication are known
– Add the conclusion to the known set of facts
• Repeat process until no further inferences or q is added
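The loop above can be sketched in Python (an illustration under an assumed rule format of (premises, conclusion) pairs; counting unsatisfied premises per rule is what makes the procedure linear in the size of the KB):

```python
from collections import deque

def forward_chain(rules, facts, q):
    """Forward chaining over Horn clauses: does the KB entail symbol q?

    rules: list of (premises, conclusion) pairs; facts: known positive
    literals.  Each rule keeps a count of premises not yet proved; when
    the count hits zero, its conclusion becomes a new fact.
    """
    count = {i: len(p) for i, (p, _) in enumerate(rules)}
    agenda = deque(facts)
    inferred = set()
    while agenda:
        p = agenda.popleft()
        if p == q:
            return True
        if p in inferred:
            continue
        inferred.add(p)
        for i, (premises, conclusion) in enumerate(rules):
            if p in premises:
                count[i] -= 1
                if count[i] == 0:        # all premises known: fire the rule
                    agenda.append(conclusion)
    return False

# KB: G, L, and (G ^ L) => M.  Query: M
print(forward_chain([(["G", "L"], "M")], ["G", "L"], "M"))  # True
```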
Forward Chaining
Properties
• Sound
• Complete
– All entailed atomic sentences will be derived
Data Driven
• Start with what we know
• Derive new info until we discover what we want
Backward Chaining
Start with what you want to know, a query (q)
• Look for implications that conclude q
– Look at the premises of those implications
– Look for implications that conclude those premises…
Goal-Directed Reasoning
• Can be a less complex search than forward chaining
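The goal-directed recursion can be sketched in Python (my illustration, reusing the assumed (premises, conclusion) rule format; the `seen` set guards against circular rules):

```python
def backward_chain(rules, facts, q, seen=None):
    """Backward chaining: prove q by finding a rule that concludes q
    and recursively proving each of its premises."""
    seen = set() if seen is None else seen
    if q in facts:                 # q is already a known fact
        return True
    if q in seen:                  # already trying to prove q: avoid a loop
        return False
    seen = seen | {q}
    for premises, conclusion in rules:
        if conclusion == q and all(
                backward_chain(rules, facts, p, seen) for p in premises):
            return True
    return False

# KB: G, L, and (G ^ L) => M.  Query: M
print(backward_chain([(["G", "L"], "M")], {"G", "L"}, "M"))  # True
```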
Making it fast
Example
• Problem to solve is in CNF
– Is Marvin a Martian, i.e., M == 1 (true), given:
Marvin is green --- G = 1
Marvin is little --- L = 1
(little and green) implies Martian --- (L ^ G) => M == ~(L ^ G) V M == ~L V ~G V M
– Proof by contradiction… are there true/false values for G, L, and M that are consistent with the knowledge base and Marvin not being a Martian?
G ^ L ^ (~L V ~G V M) ^ ~M == empty?
Searching for variable values
Want to find values such that:
G ^ L ^ (~L V ~G V M) ^ ~M == 0
• Randomly consider all true/false assignments to variables until we exhaust them all or find a match (model checking)
– (G, L, M) = (1, 0, 0)… no
– (G, L, M) = (0, 1, 0)… no
– (G, L, M) = (0, 0, 0)… no
– (G, L, M) = (1, 1, 0)… no
• Alternatively…
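The brute-force enumeration described above is easy to write down; a Python sketch (my illustration) using the Marvin encoding from the previous slide:

```python
from itertools import product

def satisfiable(clauses, symbols):
    """Brute-force model checking: try every true/false assignment and
    return the first model satisfying all clauses, or None."""
    for values in product([True, False], repeat=len(symbols)):
        model = dict(zip(symbols, values))
        if all(any(model[l] if not l.startswith("~") else not model[l[1:]]
                   for l in clause) for clause in clauses):
            return model
    return None

# G ^ L ^ (~L V ~G V M) ^ ~M: no assignment works, so the KB entails M
clauses = [["G"], ["L"], ["~L", "~G", "M"], ["~M"]]
print(satisfiable(clauses, ["G", "L", "M"]))  # None
```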
Backtracking Algorithm
Davis-Putnam Algorithm (DPLL)
• Search through possible assignments to (G, L, M) via depth-first search: (0, 0, 0) to (0, 0, 1) to (0, 1, 0)…
– Each clause of CNF must be true
Terminate consideration when a clause evaluates to false
– Use heuristics to reduce repeated computation of propositions
Early termination
Pure symbol heuristic
Unit clause heuristic
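A simplified DPLL sketch (illustrative only: it implements early termination and the unit-clause heuristic, and omits the pure-symbol heuristic for brevity):

```python
def dpll(clauses, model):
    """Depth-first search over partial assignments.  Returns True iff
    some extension of `model` satisfies every clause."""
    def value(lit):
        sym = lit.lstrip("~")
        if sym not in model:
            return None                   # symbol not yet assigned
        v = model[sym]
        return (not v) if lit.startswith("~") else v

    simplified = []
    for clause in clauses:
        vals = [value(l) for l in clause]
        if any(v is True for v in vals):
            continue                      # clause already satisfied
        if all(v is False for v in vals):
            return False                  # early termination: clause is false
        simplified.append([l for l, v in zip(clause, vals) if v is None])

    if not simplified:
        return True                       # every clause satisfied
    # unit-clause heuristic: one unassigned literal left forces its value
    for clause in simplified:
        if len(clause) == 1:
            lit = clause[0]
            return dpll(clauses,
                        {**model, lit.lstrip("~"): not lit.startswith("~")})
    sym = simplified[0][0].lstrip("~")    # otherwise branch on a symbol
    return (dpll(clauses, {**model, sym: True})
            or dpll(clauses, {**model, sym: False}))

clauses = [["G"], ["L"], ["~L", "~G", "M"], ["~M"]]
print(dpll(clauses, {}))  # False: KB ^ ~M is unsatisfiable, so M is entailed
```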
Cull branches from tree
Searching for variable values
Other ways to find (G, L, M) assignments for:
G ^ L ^ (~L V ~G V M) ^ ~M == 0
• Simulated Annealing (WalkSAT)
– Start with initial guess (0, 1, 1)
– With each iteration, pick an unsatisfied clause and flip one symbol in the clause
– Evaluation metric is the number of clauses that evaluate to true
– Move “in direction” of guesses that cause more clauses to be true
– Many local mins, use lots of randomness
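A WalkSAT sketch (my illustration; the random-walk probability p and the flip limit are my choices, not from the slides). To keep the demo satisfiable, it is run on the KB without the ~M clause:

```python
import random

def walksat(clauses, p=0.5, max_flips=10_000):
    """WalkSAT: start from a random assignment; repeatedly pick an
    unsatisfied clause and flip one of its symbols, either at random
    (probability p) or greedily, choosing the flip that satisfies the
    most clauses."""
    symbols = {l.lstrip("~") for c in clauses for l in c}
    model = {s: random.choice([True, False]) for s in symbols}

    def sat(clause, m):
        return any((not m[l[1:]]) if l.startswith("~") else m[l] for l in clause)

    def num_sat(m):
        return sum(sat(c, m) for c in clauses)

    for _ in range(max_flips):
        unsatisfied = [c for c in clauses if not sat(c, model)]
        if not unsatisfied:
            return model                  # all clauses true: done
        clause = random.choice(unsatisfied)
        if random.random() < p:
            sym = random.choice(clause).lstrip("~")   # random walk step
        else:                             # greedy: best-scoring flip
            sym = max((l.lstrip("~") for l in clause),
                      key=lambda s: num_sat({**model, s: not model[s]}))
        model[sym] = not model[sym]
    return None   # gave up: answer unknown, NOT proven unsatisfiable

# G ^ L ^ (~L V ~G V M) is satisfiable; the unique model sets all three true
print(walksat([["G"], ["L"], ["~L", "~G", "M"]]))
```

Note the return value on failure: as the next slide says, hitting the flip limit never proves unsatisfiability.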
WalkSAT termination
How do you know when simulated annealing is done?
• No way to know with certainty that an answer is not possible
– Could have been bad luck
– Could be there really is no answer
– Establish a max number of iterations and go with the best answer to that point
So how well do these work?
Think about it this way
• 16 of 32 possible assignments are models (are satisfiable) for this sentence
• Therefore, 2 random guesses should find a solution
• WalkSAT and DPLL should work quickly
What about more clauses?
If # symbols (variables) stays the same, but the number of clauses increases
• More ways for an assignment to fail (on any one clause)
• More searching through possible assignments is needed
• Let’s create a ratio, m/n, to measure # clauses / # symbols
• We expect large m/n to cause slower solutions
What about more clauses?
Higher m/n means fewer assignments will work
If fewer assignments work, it is harder for DPLL and WalkSAT
Combining it all
4x4 Wumpus World
• The “physics” of the game
• At least one wumpus on board
• At most one wumpus on board (for any two squares, one is free)
– n(n-1)/2 rules like: ~W1,1 V ~W1,2
• Total of 155 sentences containing 64 distinct symbols
Wumpus World
Inefficiencies as world becomes large
• Knowledge base must expand if size of world expands
Preferred to have sentences that apply to all squares
• We brought this subject up last week
• Next chapter addresses this issue
Chapter 8: First-Order Logic
What do we like about propositional logic?
It is:
• Declarative
– Relationships between variables are described
– A method for propagating relationships
• Expressive
– Can represent partial information using disjunction
• Compositional
– If A means foo and B means bar, A ^ B means foo and bar
What don’t we like about propositional logic?
Lacks expressive power to describe the environment concisely
• Separate rules for every square/square relationship in Wumpus world
Natural Language
English appears to be expressive
• Squares adjacent to pits are breezy
But natural language is a medium of communication, not a knowledge representation
• Much of the information and logic conveyed by language is dependent on context
• Information exchange is not well defined
• Not compositional (combining sentences may mean something different)
• It is ambiguous
But we borrow representational ideas from natural language
Natural language syntax
• Nouns and noun phrases refer to objects
– People, houses, cars
• Properties and verbs refer to object relations
– Red, round, nearby, eaten
• Some relationships are clearly defined functions where there is only one output for a given input
– Best friend, first thing, plus
We build first-order logic around objects and relations
Ontology
• A “specification of a conceptualization”
• A description of the objects and relationships that can exist
– Propositional logic had only true/false relationships
– First-order logic has many more relationships
• The ontological commitment of languages is different
– How much can you infer from what you know?
Temporal logic defines additional ontological commitments because of timing constraints