
Expert Systems

Expert System Functionality

• replaces human expert decision making when the expert is not available
• assists the human expert when integrating various decisions
• provides an ES user with
  – an appropriate hypothesis
  – a methodology for knowledge storage and reuse
• a field bordering on Knowledge Based Systems and Knowledge Management
• knowledge intensive × connectionist approaches
• expert system – a software system simulating expert-like decision making while keeping knowledge separate from the reasoning mechanism

Expert Systems Classification

• Unlike classical problem solvers (GPS, Theorist), Expert Systems are weak methods: less general and very case specific

• Expert systems classification:
  – Interpretation
  – Prediction
  – Diagnostic
  – Design & Configuration
  – Planning
  – Monitoring
  – Repair & Debugging
  – Instruction
  – Control

Underlying Philosophy

• knowledge representation
  – production rules
  – logic
  – semantic networks
  – frames, scripts, objects
• reasoning mechanism
  – knowledge-oriented reasoning
  – model-based reasoning
  – case-based reasoning

Expert System Architecture

[architecture diagram: the user communicates through perceptors with the inference engine, which reasons over the knowledge base and the world model; a knowledge base editor and an explanation subsystem complete the system]

Rule-Based System

• knowledge in the form of if condition then effect (production) rules

• reasoning algorithm (sketched below):
  (i) FR ← detect(WM)
  (ii) R ← select(FR)
  (iii) WM ← apply(R)
  (iv) goto (i)
  (WM = working memory, FR = fireable rules)

• conflicts in FR resolved by: first match, recency of use, minimal WM change, priorities

• incomplete WM – the ES queries the user for missing facts (the art of logical and sensible querying)

• examples – CLIPS (OPS/5), Prolog
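A minimal data-driven sketch of the detect/select/apply loop above; the rule encoding, the trivial first-match conflict resolution and the sample facts are illustrative, not the CLIPS or Prolog machinery:

# Forward-chaining loop: (i) detect fireable rules, (ii) select one,
# (iii) apply it to working memory (WM), (iv) repeat until nothing fires.
def forward_chain(wm, rules):
    """wm: set of facts; rules: list of (conditions, effect) pairs."""
    while True:
        fireable = [(cond, eff) for cond, eff in rules
                    if cond <= wm and eff not in wm]       # (i) detect
        if not fireable:
            return wm
        cond, eff = fireable[0]                            # (ii) select (first match)
        wm = wm | {eff}                                    # (iii) apply, (iv) goto (i)

rules = [({"here"}, "fine"),
         ({"absent", "seen"}, "in the building"),
         ({"in the building"}, "fine")]
print(forward_chain({"absent", "seen"}, rules))   # derives "in the building" and "fine"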

Rule-Based System Example

• rules:
  – here → fine
  – not here → absent
  – absent and not seen → at home
  – absent and seen → in the building
  – in the building → fine
  – at home and not holiday → sick
  – here and holiday → sick
• data-driven runs:
  – not here, in the building ⇒ fine
  – not here, not holiday ⇒ sick
• goal-driven query traces:
  – ? here – no, ? seen – no, ? holiday – no ⇒ sick
  – ? here – yes ⇒ fine
  – ? here – yes, ? holiday – yes ⇒ sick

Data-driven × Goal-driven

[inference network: observed facts here, seen, holiday; intermediate conclusions absent, in the building, at home; final conclusions fine, sick – data-driven reasoning runs from the facts to the conclusions, goal-driven reasoning from a goal back to the facts]


• goal driven (backward chaining) ~ blood diagnostics, theorem proving (sketched below)
  – limited number of goal hypotheses
  – data have to be acquired, complicated data about the object
  – fewer operators to start with at the goal rather than at the data
• data driven (forward chaining) ~ configuration, interpretation
  – reasonable set of input data
  – data are given in the initial state
  – huge set of possible hypotheses
• taxonomy of rules, meta-rules, priorities, …
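A matching goal-driven sketch, assuming the same (conditions, effect) rule encoding as the forward-chaining sketch above; facts no rule can conclude are asked for, mirroring the "? here", "? seen", "? holiday" query traces (the sample answers are illustrative):

# Backward chaining: to prove a goal, find a rule concluding it and prove its
# conditions recursively; base facts (concluded by no rule) are queried.
def backward_chain(goal, rules, known, ask):
    if goal in known:
        return known[goal]
    for conditions, effect in rules:
        if effect == goal and all(backward_chain(c, rules, known, ask)
                                  for c in conditions):
            return True
    if not any(effect == goal for _, effect in rules):     # base fact: query it
        known[goal] = ask(goal)
        return known[goal]
    return False

rules = [(["here"], "fine"),
         (["absent", "seen"], "in the building"),
         (["in the building"], "fine")]
answers = {"here": False, "absent": True, "seen": True}    # one sample dialogue
print(backward_chain("fine", rules, {}, lambda fact: answers.get(fact, False)))  # True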

Knowledge Representation in ES

• Shallow Knowledge Models
  – rules, frames, logic, networks
  – first generation expert systems
• Deep Knowledge Models
  – describe the complete causality of the system
  – second generation expert systems
• Case Knowledge Models
  – specify precedents from past decision making

Model Based Reasoning

• Sometimes it is impossible or too imprecise to describe the domain in terms of rules …
• Instead we use a predictive computational model of the domain object, representing a more theoretical, deep knowledge model
• The model is based either on
  – quantitative reasoning (differential equations, …)
  – qualitative reasoning (emphasizes some properties while ignoring others)
• Widely used for model-based diagnosis and intelligent tutoring

Qualitative Reasoning

• Qualitative Reasoning is based on symbolic computation aimed at modeling the behavior of physical systems
  – commonsense inference mechanisms
  – partial, incomplete or uncertain information
  – simple, tractable computation
  – declarative knowledge
• QR Techniques:
  – Constraint based – Qualitative Simulation (QSIM)
  – Component based – Envision
  – Process based – QPT (Qualitative Process Theory)

Case Based Reasoning

• part of the machine learning lecture
• Algorithm (retrieval step sketched below):
  – problem attributes description
  – retrieval of a previous case
  – solution modification
  – testing the new solution
  – repairing failure or inclusion into the plan library
• Utilized widely in the law domain (JUDGE)
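A minimal sketch of the retrieval step only, using a weighted attribute match over a toy, JUDGE-flavoured case base; the attributes, weights and sentences are made up for illustration:

# Case retrieval: pick the stored case whose problem description best matches
# the new problem under a weighted attribute similarity.
def retrieve(problem, case_base, weights):
    def similarity(case):
        return sum(w for attr, w in weights.items()
                   if case["problem"].get(attr) == problem.get(attr))
    return max(case_base, key=similarity)

case_base = [
    {"problem": {"offence": "theft", "weapon": False, "prior": True},  "solution": "2 years"},
    {"problem": {"offence": "theft", "weapon": True,  "prior": False}, "solution": "4 years"},
]
weights = {"offence": 0.5, "weapon": 0.3, "prior": 0.2}
new_problem = {"offence": "theft", "weapon": True, "prior": True}
print(retrieve(new_problem, case_base, weights)["solution"])   # closest precedent's sentence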

Uncertainty in Expert Systems

• from correct premises and correct, sound rules we derive correct conclusions
• but sometimes we have to manage uncertain information, encode uncertain pieces of knowledge, model parallel firing of inference rules, and tackle ambiguity
• there is a number of models of uncertain reasoning:
  – Bayesian Reasoning – classical statistical approach
  – Dempster-Shafer Theory of Evidence
  – Stanford Certainty Algebra – MYCIN

Bayesian Reasoning

• P(a ∧ b) = P(a) × P(b) … given that a and b are independent
• P(a ∧ b) = P(a|b) × P(b) … given that a depends on b
• prior probability (unconditional) … P(hypothesis)
• posterior probability (conditional) … P(hypothesis|evidence)
• Bayes rule (numeric check below):
  P(h|e) = P(h ∧ e) / P(e) = P(e|h) × P(h) / P(e)
  where P(e) = Σj P(e|hj) × P(hj)
• Prospector, Dice examples
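A quick numeric check of the Bayes rule above, with two competing hypotheses and illustrative numbers (not the Prospector or Dice figures):

# P(h|e) = P(e|h) * P(h) / P(e),  with  P(e) = sum_j P(e|h_j) * P(h_j)
priors = {"h1": 0.3, "h2": 0.7}           # P(h_j)
likelihoods = {"h1": 0.8, "h2": 0.1}      # P(e|h_j)

p_e = sum(priors[h] * likelihoods[h] for h in priors)
posteriors = {h: priors[h] * likelihoods[h] / p_e for h in priors}
print(posteriors)   # P(h1|e) = 0.24/0.31 ≈ 0.77, P(h2|e) ≈ 0.23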

Bayesian Reasoning – cont’

• we introduce Odds O(h):
  O(h) = P(h) / (1 − P(h)),  O(h|e) = P(h|e) / (1 − P(h|e))
• we introduce the sufficiency measure λ:
  λ = O(h|e) / O(h) = [P(h|e) × (1 − P(h))] / [(1 − P(h|e)) × P(h)]
• we introduce joint Odds (sketched below):
  O(h|e1,e2,…,en) = λ1 × λ2 × … × λn × O(h)

[diagram: evidences e1, e2, e3 support hypothesis h through sufficiency measures λ1, λ2, λ3]
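The odds form of the update in a few lines, Prospector-style; the prior and the λ values below are illustrative:

# O(h|e1,...,en) = λ1 * ... * λn * O(h), converted back to a probability.
def odds(p):  return p / (1 - p)
def prob(o):  return o / (1 + o)

prior = 0.1                     # P(h)
lambdas = [4.0, 0.5, 2.5]       # sufficiency measures of the observed e1, e2, e3

o = odds(prior)
for lam in lambdas:             # each observed evidence multiplies the odds
    o *= lam
print(prob(o))                  # P(h|e1,e2,e3) ≈ 0.36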

Stanford Certainty Algebra

• heuristic (expert given) approach for reasoning with uncertainty
• let us introduce
  – measure of belief MB(h|e)
  – measure of disbelief MD(h|e)
  – certainty factor CF(h|e)
• 1 > MB(h|e) > 0 if MD(h|e) = 0, or 1 > MD(h|e) > 0 if MB(h|e) = 0
• CF(h|e) = MB(h|e) − MD(h|e)

MB(h|e) = 1 if P(h|e) = 1
MB(h|e) = [max[P(h|e), P(h)] − P(h)] / [max[1, 0] − P(h)]  otherwise

MD(h|e) = 1 if P(h|e) = 0
MD(h|e) = [min[P(h|e), P(h)] − P(h)] / [min[1, 0] − P(h)]  otherwise

• SCA characteristics:
  – certainly true – P(h|e) = 1 ⇒ MB = 1, MD = 0, CF = 1
  – certainly false – P(¬h|e) = 1 ⇒ MB = 0, MD = 1, CF = −1
  – lack of evidence – P(h|e) = P(h) ⇒ MB = 0, MD = 0, CF = 0
• Combination of evidence:
  – CF(e1 and e2) = min(CF(e1), CF(e2))
  – CF(e1 or e2) = max(CF(e1), CF(e2))
• Implication: if e then h
  – CF(h,e) = CF(e) × CF(h,E), where CF(h,E) is the CF for CF(e) = 1

Stanford Certainty Algebra – cont’

if the stain of the organism is gram positive and the morphology of the organism is coccus and the growth conformation of the organism is chains

then there is suggestive evidence (CF(h,E)=0.7) that the identity of the organism is streptococcus

CF(e) = CF(E1 ∧ E2 ∧ E3) = min[CF(e1), CF(e2), CF(e3)]
CF(e) = min[0.5, 0.6, 0.3] = 0.3
CF(h,e) = CF(e) × CF(h,E) = 0.3 × 0.7 = 0.21
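The same calculation as a tiny sketch; the premise CFs 0.5, 0.6, 0.3 and the rule strength 0.7 are taken from the example above:

# Conjunction of premises takes the minimum CF; the rule then scales it
# by its own certainty CF(h,E).
def cf_and(*cfs): return min(cfs)
def cf_or(*cfs):  return max(cfs)

cf_e = cf_and(0.5, 0.6, 0.3)          # CF(e) = 0.3
cf_h = cf_e * 0.7                     # CF(h,e) = CF(e) * CF(h,E)
print(round(cf_h, 2))                 # 0.21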

Dempster-Shafer Theory of Evidence

• frame of discernment Θ – a space of possible events/answers/options
  Θ = {airliner, bomber, fighter}
  Θ = {red, green, blue, orange, yellow}
  Θ = {barn, grass, person, cow, car}
• Θ is exclusive, the probability that the right answer is in Θ is 1
• basic probability assignment m(E) – degree of belief given by the evidence
  m: P(Θ) → [0, 1],  Σ m(X) = 1 over all X ∈ P(Θ)
• example – what was detected is 70% hostile (the only information):
  m({b,f}) = 0.7, m(Θ) = 0.3
  m({b}) = m({f}) = m({b,a}) = m({f,a}) = 0

DST – Combining of Evidence

• Dempster Rule of Combination – Orthogonal Sum

m1({b,f}) = 0.7, m1(Θ) = 0.3
m2({b}) = 0.9, m2(Θ) = 0.1

m1⊕m2({b,f}) = m1({b,f}) × m2(Θ) = 0.7 × 0.1 = 0.07
m1⊕m2({b}) = m1({b,f}) × m2({b}) + m1(Θ) × m2({b}) = 0.7 × 0.9 + 0.3 × 0.9 = 0.63 + 0.27 = 0.90

m1⊕m2(Z) = Σ m1(X) × m2(Y) over all X, Y with X ∩ Y = Z
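A short sketch of the orthogonal sum on the aircraft example above; focal sets are frozensets over Θ = {a, b, f}, and the normalisation by 1 − conflict is included even though this example has no conflict:

# Dempster's rule: m1⊕m2(Z) = sum of m1(X)*m2(Y) over X ∩ Y = Z, normalised.
from itertools import product

theta = frozenset({"a", "b", "f"})
m1 = {frozenset({"b", "f"}): 0.7, theta: 0.3}
m2 = {frozenset({"b"}): 0.9, theta: 0.1}

def combine(m1, m2):
    combined, conflict = {}, 0.0
    for (x, mx), (y, my) in product(m1.items(), m2.items()):
        z = x & y
        if z:
            combined[z] = combined.get(z, 0.0) + mx * my
        else:
            conflict += mx * my               # mass falling on the empty set
    return {z: v / (1 - conflict) for z, v in combined.items()}

m12 = combine(m1, m2)
print(round(m12[frozenset({"b"})], 2),        # 0.9
      round(m12[frozenset({"b", "f"})], 2))   # 0.07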

DST – Total Belief/Plausibility

• in contrast to the local belief in a set, m(E), let us introduce the total belief Bel(E) – the minimum belief based on the given evidence
• in contrast there is plausibility Pls(E) – the maximum plausible belief assigned to the set E

Bel(E) = Σ m(T) over all T ⊆ E
Pls(E) = 1 − Bel(¬E) = 1 − Σ m(T) over all T with T ∩ E = ∅

Bel1({b,f}) = m1({b,f}) + m1({b}) + m1({f}) = 0.7 + 0 + 0 = 0.7
Bel1⊕Bel2({b,f}) = m1⊕m2({b,f}) + m1⊕m2({b}) + m1⊕m2({f}) = 0.07 + 0.90 + 0 = 0.97
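Bel and Pls computed over the same frozenset representation as in the combination sketch above:

# Bel(E) = sum of m(T) over T ⊆ E;  Pls(E) = sum of m(T) over T ∩ E ≠ ∅.
def bel(m, e): return sum(v for t, v in m.items() if t <= e)
def pls(m, e): return sum(v for t, v in m.items() if t & e)

m1 = {frozenset({"b", "f"}): 0.7, frozenset({"a", "b", "f"}): 0.3}
print(bel(m1, frozenset({"b", "f"})))   # Bel1({b,f}) = 0.7, as above
print(pls(m1, frozenset({"b", "f"})))   # Pls1({b,f}) = 1.0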

Fuzzy Logic

• Another way of handling incomplete knowledge
• Precision/vagueness is expressed by a membership function to a set
  mF(20, adult) = 0.6, mF(20, young) = 0.4, mF(20, old) = 0

[plot: membership functions for young, adult, pensioner over age]

Fuzzy Logic – cont’

• Fuzzy Logic is not concerned with how these distributions are created but with how they are manipulated. There are many interpretations, similar to the Stanford Certainty Algebra (sketched after the comparison below):
  mF(20, adult and young) = 0.4, mF(20, adult or young) = 0.6

• comparison to previous approaches – fuzzy sets × classical approaches:
  – vagueness × randomness
  – possibility × probability
  – inexact reasoning × uncertain reasoning
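A small sketch of the age example; the exact shapes of the membership functions are made up so that mF(20, adult) = 0.6 and mF(20, young) = 0.4 come out as on the slide, and "and"/"or" are the usual min/max:

# Illustrative piecewise-linear membership functions for "young" and "adult".
def mf_young(age): return max(0.0, min(1.0, (30 - age) / 25))
def mf_adult(age): return max(0.0, min(1.0, (age - 14) / 10))

age = 20
young, adult = mf_young(age), mf_adult(age)
print(young, adult)          # 0.4 0.6
print(min(adult, young))     # mF(20, adult and young) = 0.4
print(max(adult, young))     # mF(20, adult or young)  = 0.6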

Expert Systems in Practice

• MYCIN – example of a medical expert system
  – old, well known reference
  – great use of the Stanford Certainty Algebra
  – problems with legal liability and knowledge acquisition
• Prospector
  – geological system
  – knowledge encoded in semantic networks
  – Bayesian model of uncertainty handling
  – saved much money

Expert Systems in Practice – cont’

• XCON/R1
  – classical rule-based system
  – configuration of DEC computer systems
  – commercial application, well used, followed by XSEL, XSITE
  – failed to operate after 1700 rules in the knowledge base
• FelExpert
  – rule-based, Bayesian model
  – taxonomised, used in a number of applications
• ICON
  – configuration expert system
  – uses a proof planning structure of methods