Department of Electrical and Computer Engineering, University of Virginia
Software Quality & Safety Assessment Using Bayesian
Belief Networks
Joanne Bechta Dugan, Susan Donohue, Ganesh Pai
University of Virginia
Problems Under Consideration
• GETR: How does one decide that a software system is “good enough to release”?
• SWQ-BBN: Can process assessment and product assessment metrics be combined to predict the quality/reliability of a software system?
Approach: Bayesian Belief Networks (BBN)
• We use BBN models as the basis of both projects
• BBN models effectively allow the combination of quantitative and qualitative assessment (that is, measures and expert judgment) in the same model
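As a toy illustration of how a BBN combines a measured quantity with expert judgment, consider a single binary quality node with two parents: a test result (quantitative) and a process assessment (expert opinion). All probabilities below are invented for illustration:

```python
# Minimal sketch of Bayesian evidence combination in a BBN.
# Hypothetical two-parent node; all numbers are illustrative.

# Prior beliefs over each parent's states.
p_test_pass = 0.8      # from measurement (quantitative)
p_process_good = 0.7   # from expert judgment (qualitative)

# Hypothetical conditional probability table:
# P(quality = acceptable | test passed, process good)
cpt = {
    (True, True): 0.95,
    (True, False): 0.70,
    (False, True): 0.40,
    (False, False): 0.05,
}

# Marginalize over the parents to get P(quality = acceptable).
p_acceptable = sum(
    cpt[(t, p)]
    * (p_test_pass if t else 1 - p_test_pass)
    * (p_process_good if p else 1 - p_process_good)
    for t in (True, False)
    for p in (True, False)
)
print(round(p_acceptable, 3))
```

The point is that both kinds of evidence enter the same calculation: the measurement sets one parent's distribution, the expert sets the other's, and the CPT encodes how they jointly bear on quality.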
GETR Approach (with S. Donohue)
• For the GETR (Good Enough to Release) project, we are developing a BBN model of the decision process
– What evidence is used, and how it is weighed
– Determining conditional probabilities from expert opinion (to get probability parameters for the model)
• GETR is building a mathematical framework based on BBN to understand and facilitate the decision-making process
GETR Decision
How can we investigate and document the decision process that is used to go from "Is the system good enough to release?" for a computer-based system to "I have an acceptable level of belief that the system will operate as specified"?

[Figure: sources of evidence feeding Engineering Judgment, including Quality Assurance, Test Results, Personal and Team CMM, Prototype Performance, Requirements Review, Code Inspection, Risk Assessment, and Formal Methods]
GETR
[Figure: GETR BBN. Node groups include Process Evidence (Developers' Abilities, Process Quality, QA Report, Project Documentation), Validation Activities (Prototype, Requirements Review, Scenarios, Formal Methods), Verification Activities (Testing, Test Results, Testability, Code Inspection, Robustness, Domain Fit, Error Detectability, Problem Report Status), and Product Evidence (Product Quality, Design Purpose Met, Usability, Functionality, User Acceptance), all contributing to Artifact Quality.]
Quantifying Judgment for BBN
[Figure: a conditional node probability table (NPT) fragment, with entries P(child | Parent 1, Parent 2, Parent 3) over the states Acceptable, Unacceptable, and Unknown. Annotated questions: How much information is provided by testing? How much information is provided by both testing and inspections together? The NPT entries are generated from a Supporting Evidence Function, a Parent Knowledge Contribution Function, and an NPT Conversion Function, parameterized by knowledge weights and interaction weights.]
Quantifying Judgment for BBN (QJ BBN)
Conditional probabilities (NPT entries) are generated as a function of the contribution of evidence to support a premise. For example:
• Acceptable results from testing support the conclusion that verification is acceptable.
• Unacceptable documentation supports the premise that artifact quality is unacceptable.

• Evidence can overlap, be disjoint, or be synergistic.
• Proofs of coherence of the functions used in the QJ methodology help assure rational decisions.
• Importance and sensitivity analysis can help guide decision makers in seeking new evidence.
• The BN model provides a record of the evidence analysis.
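The idea of generating NPT entries from evidence contributions, rather than eliciting every cell individually, can be sketched with a simple normalized linear weighting rule. This rule is a stand-in of my own, not the actual QJ functions, and the parents and weights are hypothetical:

```python
# Illustrative sketch: NPT entries as a function of weighted parent
# contributions. A normalized linear rule, assumed for illustration;
# the QJ methodology's actual functions differ.

def npt_entry(parent_states, knowledge_weights):
    """Return P(child = acceptable | parent states).

    parent_states: 1.0 = acceptable, 0.0 = unacceptable, 0.5 = unknown.
    knowledge_weights: how much information each parent contributes.
    """
    total = sum(knowledge_weights)
    return sum(w * s for w, s in zip(knowledge_weights, parent_states)) / total

# Hypothetical parents: testing, inspection, documentation.
weights = [0.5, 0.3, 0.2]
print(npt_entry([1.0, 1.0, 1.0], weights))  # -> 1.0 (all parents acceptable)
print(npt_entry([1.0, 0.0, 0.5], weights))  # mixed evidence
```

With three parents of three states each, the NPT has 27 columns; a functional rule like this fills them all from three weights, which is the practical payoff of the approach.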
Application to NASA: Seal of Approval Process (SOAP) for PRA tools
• What evidence is available for review?
• Is the tool "fit for use"? Is it "good enough" to share?
• Is the tool appropriate for use in a given domain?
• How much influence does certain evidence have on the approval process?
[Flowchart: The PRATMAD Committee receives a tool for review.
Step 1: Determine applicable domain(s) for the tool (decision support: Domain Selection). Fit between at least one domain and the tool? If NO, the tool is rejected.
Step 2: Determine the tool's "fitness for use" (decision support: Fitness for Use Worksheet). Tool fit to use? If YES, the PRATMAD Committee approves the tool for use in the specified domain. If NO, more evidence is elicited (return to Step 2) or the tool is rejected.]
Fitness for Use Worksheet (excerpt):

Rating scale and acceptability weights:
  Not Available / Applicable: x
  Not Acceptable: 0
  Marginally Acceptable: 10 - 25
  Moderately Acceptable: 40 - 60
  Acceptable: 75 - 90
  Highly Acceptable: 95 - 100

Evidence Category / Specific Evidence            Importance Weight   Evidence
Degree of Fit With Proposed Project or System    3.26                TBD
Applicable Domain(s) Clearly Specified           2.03                TBD
Robustness                                       1                   TBD

Fitness for Use Score: --
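One plausible reading of the worksheet is that the fitness-for-use score combines each category's acceptability weight with its importance weight. The aggregation rule below (a weighted average on a 0-100 scale) is an assumption on my part, since the slide leaves the formula unspecified, and the ratings are invented:

```python
# Sketch of a possible fitness-for-use score: each category's
# acceptability weight (0-100) scaled by its importance weight.
# The weighted-average normalization is assumed, not from the slide.

importance = {
    "degree_of_fit": 3.26,        # importance weights from the worksheet
    "domains_specified": 2.03,
    "robustness": 1.0,
}
# Hypothetical ratings picked from the worksheet's acceptability ranges.
rating = {
    "degree_of_fit": 80,          # Acceptable (75-90)
    "domains_specified": 95,      # Highly Acceptable (95-100)
    "robustness": 50,             # Moderately Acceptable (40-60)
}

raw = sum(importance[k] * rating[k] for k in importance)
score = raw / sum(importance.values())  # weighted average, 0-100 scale
print(round(score, 1))
```

A committee could then compare the score against a release threshold, with "x" (not available/applicable) categories simply dropped from both sums.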
SWQ BBN Approach (with G. Pai)
• For the SWQ BBN project, we are developing techniques to build a BN to model the software development process and the products (artifacts)
– The BBN model represents causally related phases and activities within the phases.
– Measurements or expert opinion can be used to determine probability parameters for the model.
• The model can be used to assess the process/product with respect to reliability (defect density) or another quality attribute
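A toy sketch of why chaining causally related phase models matters: defects escaping one phase become input to the next, so residual defect content compounds across the development process. The introduction counts and detection probabilities below are invented for illustration:

```python
# Toy defect-flow model across causally linked phases. Defects
# introduced in each phase, minus those detected, flow into the
# next phase. All rates are hypothetical.

phases = [
    # (name, defects introduced, detection probability)
    ("specification", 40, 0.50),
    ("design", 30, 0.60),
    ("coding", 50, 0.70),
    ("testing", 5, 0.90),
]

residual = 0.0
for name, introduced, p_detect in phases:
    # Escapes from earlier phases plus new defects, filtered by detection.
    residual = (residual + introduced) * (1 - p_detect)
print(round(residual, 2))
```

A BN generalizes this point estimate to full distributions over each phase's nodes, which is what makes in-process feedback and sensitivity analysis possible.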
[Figure: unified analysis framework. An existing software development process is modeled as a BN of the SDP: Specification, Design, Coding, and Testing phase BNs linked by causal relations. Software metrics and process measurements are evaluated as evidence; the framework yields defect content and reliability estimates, in-process feedback (observe and fix), and failure estimates that feed a system-level fault tree (SW-1, SW-2) for system reliability analysis.]
Candidate BBN for design phase
[Figure: candidate design-phase BN. Nodes include Spec Quality, Designer Expertise, Process (D), Product (D), Design Quality, Consistence (D), Detection (D), Defects, Delta Design, Inspection Quality, Verification, and Complexity.]

Complexity may be measured with traditional metrics (LOC, V(G), Fan-In/Fan-Out, SigFF) or object-oriented measures (the CK metrics suite: CBO, DIT, WMC, DFC; cohesion; encapsulation).
Hypothetical illustrative example
• Hypothetical priors
• Model result: medium defect content
– Actual values depend on the mapping between node states and range values
– E.g., with states Vlow, Low, Medium, High, Vhigh mapped to 0–20, 20–40, 40–60, 60–80, 80–100, the model result is that defect content would lie in the 40–60 range

[Figure: evidence and posterior distributions]
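The mapping from node states to numeric ranges can be made concrete: given a posterior distribution over the five states, a point estimate of defect content follows from the probability-weighted range midpoints. The posterior below is made up for illustration; the state-to-range mapping is the one from the slide:

```python
# Sketch of mapping a posterior over qualitative states to a numeric
# defect-content estimate, using the slide's example mapping.
# The posterior values are hypothetical.

states = ["Vlow", "Low", "Medium", "High", "Vhigh"]
ranges = {"Vlow": (0, 20), "Low": (20, 40), "Medium": (40, 60),
          "High": (60, 80), "Vhigh": (80, 100)}
posterior = {"Vlow": 0.05, "Low": 0.15, "Medium": 0.55,
             "High": 0.20, "Vhigh": 0.05}

# Most likely state gives the reported range; the expected value is
# the probability-weighted average of the range midpoints.
most_likely = max(states, key=lambda s: posterior[s])
expected = sum(posterior[s] * (ranges[s][0] + ranges[s][1]) / 2 for s in states)
print(most_likely, ranges[most_likely], round(expected, 1))
```

This makes the slide's caveat explicit: the same posterior yields different numeric answers under a different state-to-range mapping.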
• Feedback to the designer is of greater value
– The network itself can provide feedback
– Propagation of evidence; in this case: knowledge of high specification quality, observation of high defect content
– A change in distribution indicates a potential problem area
Application to IV&V (joint work with Titan (Khalid Lateef))
Use the IV&V process for use case analysis; construct a BBN from the process model
• Relevant process parameters and inputs represent parent nodes
• Child nodes of BBN represent features desired from the requirements specification
[Figure: BBN fragment for desired requirements-specification features. COMPLEXITY (UCP metric, #interactions, understandability), CLARITY (scope, boundary, domain/context), TRACEABLE (glossary of terms, data dictionary), CONSTRAINTS (SW footprint, CPU cycles, RAM).]
[Figure, continued: CONSISTENT, CORRECT, and COMPLETE features, with supporting nodes including Formalism_Consistent, Scenario Walkthrough, Scenario Completeness (Exception Scenario, Regular Scenario), Edge Guard Condition, Actor Identification, Pre-/Post-Conditions, Use Case Names, Use Case Modeling Rules, Design Notation, User Agrees, Node Coverage, Edge Coverage, Expected Result/OP Matches, and P&PC.]
Example analysis
• Probabilities reflect either measurement or analysts' beliefs
• The probability of the state 'true' is less than 95%: not mature enough
Technology Readiness Level
Data / Case Study Availability
GETR case study domain: “lightweight” V&V for in-house developed analytical tools being considered for release to other centers or research groups.
Identified case studies – RAP (JPL), SIAT (IV&V), and MATT (IV&V)
SWQ BBN case study domain: a case study of system development, including artifacts and defect data. OO (object-oriented) or ODC (orthogonal defect classification) data would be ideal.
Working with Khalid Lateef to develop case study for OO requirements analysis
Barriers to Research or Application
Case studies