Abstract
Reasoning in very complex contexts often requires purely deductive reasoning to be supported by a variety of techniques that can cope with incomplete data. Abductive inference allows one to guess information that has not been explicitly observed. Since there are many explanations for such guesses, each one needs to be assigned a probability. This work exploits logical abduction to produce multiple explanations consistent with a given background knowledge and defines a strategy to prioritize them by their chance of being true. Another novelty is the introduction of probabilistic integrity constraints rather than hard ones. We then propose a strategy that learns model and parameters from data and exploits our Probabilistic Abductive Proof Procedure to classify never-seen instances. This approach has been tested on some standard datasets, showing that it improves accuracy in the presence of corruption and missing data.
Introduction Probabilistic Abductive Logic Programming Experimental Evaluation Conclusions References
Probabilistic Abductive Logic Programming
using Possible Worlds
Fulvio Rotella1 and Stefano Ferilli1,2
{fulvio.rotella, stefano.ferilli}@uniba.it
1DIB – Dipartimento di Informatica – Università di Bari
2CILA – Centro Interdipartimentale per la Logica e sue Applicazioni – Università di Bari
XXVIII Convegno Italiano di Logica Computazionale - CILC 2013, 25 September 2013
Fulvio Rotella and Stefano Ferilli DIB, CILA
Probabilistic Abductive Logic Programming using Possible Worlds
Introduction Probabilistic Abductive Logic Programming Experimental Evaluation Conclusions References
Motivation
Artificial Intelligence: two approaches
Numerical/statistical
Relational
Strengths and weaknesses
Numerical/statistical
+ handle large amounts of data
+ handle incompleteness and uncertainty
- flat representations
- no relationships between objects/attributes
Relational
+ complex representations of data
+ comprehensibility
- cannot handle incompleteness
- cannot handle noise and uncertainty
Motivation
Problem: Real World data
multi-relational, heterogeneous and semi-structured
noisy and uncertain
Solution: Relational Representations + Probability
Logic Programming
representation language and reasoning strategies
Probabilistic Reasoning
robustness
Solutions
Statistical Relational Learning (SRL) [Getoor, 2002]
Probabilistic Inductive Logic Programming (PILP) [Raedt and Kersting, 2004]
Problems: high degree of complexity
lack and incompleteness of observations
deductive reasoning not enough
Solution: Exploit Abduction!
Abductive statement: given an observation that cannot be derived in the theory, make assumptions that explain it
All the beans from this bag are white. (BK)
These beans (oddly) are white. (observation)
These beans are from this bag. (diagnosis)
Logic-based approaches
multiple sets of assumptions; integrity constraints
Probabilistic-based approaches
multiple explanations with probability (uncertainty)
Problems
Logic-based
too many logical explanations
Probabilistic-based
independent variables and unstructured data
Some solutions
Probabilistic Horn Abduction and Bayesian Networks (PHA) [Poole, 1993]
Bayesian Abductive Logic Programs: A Probabilistic Logic for Abductive Reasoning (BALP)[Raghavan, 2011]
Probabilistic Abduction using Markov Logic Networks (MLN) [Kate and Mooney, 2009]
Abduction with stochastic logic programs based on a possible worlds semantics [Arvanitiset al., 2006]
Implementing Probabilistic Abductive Logic Programming with Constraint Handling Rules[Christiansen, 2008]
Preliminaries: Abductive Logic Programming (ALP)
Abductive Logic Program T = 〈P,A, I〉 [Kakas and Mancarella, 1990]
P is a standard logic program
A (Abducibles) is a set of predicate names
I (Integrity Constraints, i.e. domain-specific properties)
Problem formulation
Given an observation O and a theory T = 〈P,A, I〉
Find an abductive explanation ∆ s.t. P ∪ ∆ |= O (∆ explains O) and P ∪ ∆ |= I (∆ is consistent).
Then T abductively entails O (T |=A O).
Abductive Logic Programming [Kakas and Mancarella, 1990]
extends Logic Programming: some predicates (abducibles) incompletely defined
deriving hypotheses on these abducible predicates (abductive hypotheses)
Goal: observations to be explained
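The formulation above can be illustrated with a toy sketch. The encoding (ground atoms as strings, a fixed rule set) is an illustrative assumption, not the actual proof procedure of Kakas and Mancarella:

```python
# Toy sketch of abductive entailment over ground atoms (assumed encoding:
# atoms as strings, rules as head -> body list). Given program P, abducible
# predicate names A and an observation O, collect a set of assumptions
# Delta such that P ∪ Delta |= O.

RULES = {"printable(o1)": ["a4(o1)", "text(o1)"]}  # P (rules)
FACTS = {"a4(o1)"}                                 # P (facts)
ABDUCIBLES = {"text", "image", "table"}            # A

def predicate(atom):
    return atom.split("(", 1)[0]

def explain(goal, delta=frozenset()):
    """Return assumptions explaining `goal`, or None if unprovable."""
    if goal in FACTS or goal in delta:
        return delta
    if goal in RULES:                       # backward chaining on a rule
        for body_atom in RULES[goal]:
            delta = explain(body_atom, delta)
            if delta is None:
                return None
        return delta
    if predicate(goal) in ABDUCIBLES:       # assume the missing abducible
        return delta | {goal}
    return None

print(explain("printable(o1)"))  # frozenset({'text(o1)'})
```

Here `printable(o1)` is not deducible from P alone, but abducing `text(o1)` completes the proof; constraint checking is omitted in this sketch.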
Preliminaries: Abductive Logic Programming (ALP)
Abductive Logic proof procedure [Kakas and Riguzzi, 2000]
Two phases: abductive derivation (A) and consistency derivation (B)
(A) is the standard Logic derivation extended in order to consider abducibles
when an atom δ has to be proved, it is added to the current set of assumptions
the addition of δ must not violate any integrity constraint
(B) checks that all integrity constraints containing δ fail
(B) calls (A) to solve each goal
Considerations
are there constraints that prevent an abduction?
constraints verification involves:
facts deductively verified→ true
hypotheses→ evaluating all possible explanations
constraints: classical vs typed and crisp vs soft?
Probabilistic Abductive Logic Programming (PALP)
A new approach using Possible Worlds
each assumption hypothesizes that situation in a specific world
each abductive explanation can be seen as a possible world
likelihood assessed considering what we have seen and what we should expect to see
typed probabilistic constraints:
personal belief in the likelihood of the whole constraint
{nand, or, xor}-constraints
Classical vs Probabilistic ALP
ALP
looks for the minimal explanation
handles crisp nand-constraints
PALP
looks for the most probable explanation
handles probabilistic typed constraints 〈Prob, Literals, Type〉:
Prob ∈ [0, 1], Type ∈ {nand, or, xor}, Literals = l1, ..., ln
Probabilistic Abductive Logic Programming (PALP)
New probabilistic proof procedure
Two perspectives:
Logical
exploits ALP to generate many logical explanations
extends ALP to handle typed constraints
Probabilistic
ranks all explanations according to their chance of being true
Logical perspective
New Logical Proof Procedure
extends Abductive and Consistency Derivation:
Classical: when an atom δ has to be proved, it is added to the current set of assumptions.
New: when an atom δ has to be proved, two sets of assumptions are considered: one where it holds and another where it does not.
extends Consistency Derivation:
integrity checking on NAND, OR, XOR constraints:
NAND satisfied when at least one condition is false
OR satisfied when at least one condition is true
XOR satisfied when exactly one condition is true
each conclusion is a possible consistent world
New Approach ∼ Classical + (new rules and backtracking on each choice point)
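The three constraint semantics above can be sketched directly; the boolean encoding of the conditions and the function name below are illustrative assumptions:

```python
# Sketch of the typed-constraint semantics from the slide, under an assumed
# representation: each constraint carries a type in {"nand", "or", "xor"}
# and its conditions are evaluated to a list of booleans.

def satisfied(ctype, truth_values):
    """NAND: at least one condition is false; OR: at least one is true;
    XOR: exactly one is true."""
    if ctype == "nand":
        return not all(truth_values)
    if ctype == "or":
        return any(truth_values)
    if ctype == "xor":
        return sum(truth_values) == 1
    raise ValueError("unknown constraint type: " + ctype)

# e.g. a nand-constraint over [text(X), color(X)] is violated
# only when both literals hold:
print(satisfied("nand", [True, True]))   # False
print(satisfied("or", [False, True]))    # True
print(satisfied("xor", [True, True]))    # False
```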
Logical perspective
Example (Observation o1, Query and Possible Explanations)
P = {printable(X) ← a4(X), text(X)} ∪ {a4(o1)}
A = {image, text, black_white, printable, table, a4, a5, a3}
I = {ic2, ic3, ic4}
ic2 = 〈0.9, [table(X), text(X), image(X)], or〉
ic3 = 〈0.3, [text(X), color(X)], nand〉
ic4 = 〈0.3, [table(X), color(X)], nand〉
?- printable(o1)
printable(o1)← a4(o1), text(o1)
∆1 = {text(o1), table(o1)}
∆2 = {text(o1), table(o1), image(o1)}
[Derivation tree omitted: one branch abduces text(o1) and table(o1) (∆1), the other additionally abduces image(o1) (∆2)]
Probabilistic perspective
The chance of being true of a ground literal δj (1) and the unnormalized probability of an abductive explanation (2):

(1) P(δj) = n(δj) / ( n(cons)! / (n(cons) − a(δj))! )

(2) P′(∆i, Ici) = ∏_{j=1}^{J} P(δj) ∗ ∏_{k=1}^{K} P(ick)

The probability of the negated literal ¬δj is equal to 1 − P(δj).

∆ = {P1 : (∆1, Ic1), ..., PT : (∆T, IcT)}, the T consistent possible worlds for goal G
∆i = {δ1, ..., δJ}, the ground literals δj abduced in an abductive proof
Ici = {ic1, ..., icK}, the set of the constraints involved in ∆i
n(δj) is the number of true groundings of the predicate used in literal δj
n(cons) is the total number of constants encountered in the world
a(δj) is the arity of literal δj
P(ick) is the probability of the k-th constraint.
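A minimal sketch of equations (1) and (2), assuming the denominator of (1) counts the ordered groundings of a predicate of arity a(δ) over n(cons) constants:

```python
# Sketch of Eq. (1) and Eq. (2). Function names are illustrative.
from math import factorial

def lit_prob(n_true, n_cons, arity):
    """Eq. (1): true groundings over possible groundings,
    where the denominator is n(cons)! / (n(cons) - a(δ))!."""
    return n_true / (factorial(n_cons) // factorial(n_cons - arity))

def explanation_prob(lit_probs, ic_probs):
    """Eq. (2): unnormalized P'(∆i, Ici) as the product of the abduced
    literals' probabilities and the involved constraints' probabilities."""
    p = 1.0
    for x in lit_probs + ic_probs:
        p *= x
    return p

# reproduces the deck's worked computation 0.6 * 0.1 * 0.9 * 0.3 * 0.3:
print(round(explanation_prob([0.6, 0.1], [0.9, 0.3, 0.3]), 5))  # 0.00486
```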
Probabilistic perspective
Example (Computing explanation probabilities)
P′(∆1, Ic1) = P(text(o1)) ∗ P(table(o1)) ∗ P(ic2) ∗ P(ic3) ∗ P(ic4) = 0.6 ∗ 0.1 ∗ 0.9 ∗ 0.3 ∗ 0.3 = 0.00486
Example (Probability assessment of the Abductive Explanations)
A = {0.2:image, 0.4:text, 0.1:black_white, 0.6:printable, 0.1:table, 0.9:a4, 0.1:a5, 0.1:a3}
P′(∆1, Ic1) = 0.00486
P′(∆2, Ic2) = 0.00875
P′(printable(o1)) = max_{1≤i≤T} P′(∆i, Ici) = P′(∆2, Ic2) = 0.00875
Improving Classification Exploiting Probabilistic Abductive Reasoning
Exploiting our probabilistic abductive logic proof procedure
learns the model (i.e. the Abductive Logic Program 〈P, A, I〉) and the parameters (i.e. literal probabilities)
classifies never-seen instances
Solution: A new system for classification tasks
given a Training set and an abducibles set A (possibly empty), it learns:
the corresponding theory T by INTHELEX [Esposito et al., 2000]
the integrity constraints (nand, xor) by [Ferilli et al., 2005]
given a Test set, it tries to cover each example both as positive and as negative for the class c:
〈P_max(c, e), ∆p〉 ← probabilistic_abductive_proof(ProbLit_i, c, e)
〈P_max(¬c, e), ∆n〉 ← probabilistic_abductive_proof(ProbLit_i, ¬c, e)
computes the higher of the two probabilities
selects the best classification among all concepts
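The covering step can be sketched as follows; `prove` is a hypothetical stand-in for probabilistic_abductive_proof, and the toy scorer is an assumption, not the system's actual code:

```python
# Hypothetical sketch of the classification step: score the example both
# as positive and as negative for each class, then keep the most probable
# outcome across all classes.

def classify(prove, classes, example):
    """`prove(c, e)` returns (probability, explanation) for covering e as c."""
    best_label, best_p = None, -1.0
    for c in classes:
        p_pos, _delta_p = prove(c, example)           # cover e as c
        p_neg, _delta_n = prove("not " + c, example)  # cover e as ¬c
        label, p = (c, p_pos) if p_pos >= p_neg else ("not " + c, p_neg)
        if p > best_p:
            best_label, best_p = label, p
    return best_label, best_p

# toy stand-in scorer for a single class
scores = {("malignant", "e1"): 0.7, ("not malignant", "e1"): 0.2}
print(classify(lambda c, e: (scores.get((c, e), 0.0), []), ["malignant"], "e1"))
# ('malignant', 0.7)
```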
Experimental Settings
Goal:
assessing the quality of the results in presence of incomplete and noisy data
comparing with deductive reasoning under increasing levels of data corruption
Methodology:
10-fold split to obtain < Train, Test >
replace each test set by corrupted versions:
K% of each example removed at random (K varying from 10% to 70% with step 10)
5 runs to randomize (35 test sets for each fold)
assume learned constraints true with probability 1.0 (no prev. knowledge)
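The corruption step of the methodology can be sketched as follows (the list-of-literals encoding of an example is an assumption):

```python
# Sketch of the test-set corruption scheme: remove K% of each example's
# literals at random, as in the experimental methodology above.
import random

def corrupt(literals, k, seed=0):
    """Return a copy of `literals` with round(len * k) of them removed."""
    rng = random.Random(seed)
    n_remove = round(len(literals) * k)
    return rng.sample(literals, len(literals) - n_remove)

example = ["a4(o1)", "text(o1)", "image(o1)", "table(o1)", "color(o1)"]
print(len(corrupt(example, 0.4)))  # 3 literals survive at K = 40%
```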
Dataset:
Breast-Cancer
Congressional Voting Records
Tic-Tac-Toe
Results and Discussion
Breast-Cancer (#Pos = 201; #Neg = 85)
Each instance: 9 literals
Theory: 30 clauses; 6 lits/clause
Learned IC: 1784 nand-constraints (55% -> 4, 35% -> 3 and 10% -> 2); 9 type-domain
Congressional Voting Records (#Republicans = 267; #Democrats = 168)
Each instance: 16 literals
Theory: 35 clauses; 4.5 lits/clause
Learned IC: 4173 nand-constraints (16% -> 4, 37% -> 3 and 47% -> 2); 16 type-domain
Tic-Tac-Toe (#Pos = 626; #Neg = 332)
Each instance: 8 literals
Theory: 18 clauses; 4 lits/clause
Learned IC: 1863 nand-constraints (99% -> 4, 1% -> 3); 16 type-domain
[Plot: accuracy vs. corruption level (0.0 to 0.7) for Breast Cancer, Congress and Tic-Tac-Toe]
Results and Discussion
Dataset      Corr.  Abductive Reas.         Deductive Reas.
                    Prec.  Rec.   F1        Prec.  Rec.   F1
Breast       0%     0.891  0.870  0.881     0.891  0.870  0.881
             10%    0.865  0.835  0.850     0.634  0.454  0.227
             20%    0.853  0.411  0.556     0.571  0.118  0.195
             30%    0.800  0.188  0.584     0.500  0.029  0.056
             40%    1.000  0.059  0.111     —      —      —
             50%    1.000  0.035  0.068     —      —      —
             60%    1.000  0.023  0.046     —      —      —
             70%    1.000  0.012  0.023     —      —      —
Congress     0%     1.000  0.961  0.980     1.000  0.961  0.980
             10%    1.000  0.961  0.981     0.971  0.793  0.873
             20%    1.000  0.769  0.869     0.971  0.761  0.853
             30%    1.000  0.680  0.809     0.982  0.714  0.827
             40%    1.000  0.538  0.700     0.979  0.623  0.761
             50%    1.000  0.500  0.667     1.000  0.425  0.596
             60%    1.000  0.346  0.514     1.000  0.333  0.500
             70%    1.000  0.269  0.424     1.000  0.264  0.418
Tic-Tac-Toe  0%     1.000  0.983  0.992     1.000  0.983  0.992
             10%    1.000  0.833  0.909     0.842  0.743  0.789
             20%    1.000  0.730  0.844     0.808  0.531  0.641
             30%    1.000  0.508  0.673     0.796  0.387  0.521
             40%    1.000  0.302  0.463     0.829  0.261  0.397
             50%    1.000  0.127  0.225     0.697  0.103  0.180
             60%    1.000  0.048  0.090     0.777  0.031  0.060
             70%    1.000  0.016  0.031     1.000  0.004  0.009
Probabilistic Abductive Logic Approach
Reasoning in complex contexts→ deduction is not enough.
Abduction might help→ it should be logical + probabilistic.
Our approach:
Abductive Logic Programming → generates multiple explanations;
Probabilistic assessment of each explanation.
Our classification strategy works correctly in the presence of noise and corruption.
Current and Future works
Learning the probabilistic constraints.
Enriching the probabilistic model of literal distribution.
Testing our procedure on other tasks, such as NLU and plan recognition.
Thanks for your attention
References I
A. Arvanitis, S. H. Muggleton, J. Chen, and H. Watanabe. Abduction with stochastic logic programs based on a possible worlds semantics. In Short Paper Proc. of 16th ILP, 2006.
H. Christiansen. Implementing probabilistic abductive logic programming with constraint handling rules. In T. Schrijvers and T. Frühwirth, editors, Constraint Handling Rules, volume 5388 of Lecture Notes in Computer Science, pages 85–118. Springer Berlin Heidelberg, 2008. ISBN 978-3-540-92242-1. doi: 10.1007/978-3-540-92243-8_5. URL http://dx.doi.org/10.1007/978-3-540-92243-8_5.
F. Esposito, G. Semeraro, N. Fanizzi, and S. Ferilli. Multistrategy theory revision: Induction and abduction in INTHELEX. Machine Learning, 38:133–156, 2000. ISSN 0885-6125. doi: 10.1023/A:1007638124237. URL http://dx.doi.org/10.1023/A%3A1007638124237.
S. Ferilli, T. M. A. Basile, N. Di Mauro, and F. Esposito. Automatic induction of abduction and abstraction theories from observations. In Proc. of the 15th ILP, ILP'05, pages 103–120, Berlin, Heidelberg, 2005. Springer-Verlag. ISBN 3-540-28177-0, 978-3-540-28177-1. doi: 10.1007/11536314_7. URL http://dx.doi.org/10.1007/11536314_7.
References II
L. C. Getoor. Learning statistical models from relational data. PhD thesis, Stanford,CA, USA, 2002. AAI3038093.
A. C. Kakas and P. Mancarella. Generalized stable models: A semantics for abduction.In ECAI, pages 385–391, 1990.
A. C. Kakas and F. Riguzzi. Abductive concept learning. New Generation Comput., 18(3):243–294, 2000.
R. J. Kate and R. J. Mooney. Probabilistic abduction using Markov logic networks. In Proceedings of the IJCAI-09 Workshop on Plan, Activity, and Intent Recognition (PAIR-09), Pasadena, CA, July 2009. URL http://www.cs.utexas.edu/users/ai-lab/?kate:pair09.
D. Poole. Probabilistic Horn abduction and Bayesian networks. Artif. Intell., 64(1):81–129, 1993.
L. D. Raedt and K. Kersting. Probabilistic inductive logic programming. In ALT, pages19–36, 2004.
S. V. Raghavan. Bayesian abductive logic programs: A probabilistic logic for abductive reasoning. In T. Walsh, editor, IJCAI, pages 2840–2841. IJCAI/AAAI, 2011. ISBN 978-1-57735-516-8. URL http://dblp.uni-trier.de/db/conf/ijcai/ijcai2011.html#Raghavan11.