8/8/2019 Day 05 Session
1/26
Project Metrics
- Sanket Shah
What is it?
Quantitative measures that help
software people gauge the efficacy of
a software process and of other
projects using the same process.
Can also be used to pinpoint
problem areas.
Reasons for Metrics
To Characterize
To Evaluate
To Predict
To Improve
Definitions
Measure:
A single collected data point
(e.g., errors uncovered per module).
Measurement:
A collection of one or more data
points (e.g., errors uncovered for each
module across a large set of modules).
Definitions
Metric:
Relating individual measures in
some way.
Indicator:
A metric or combination of metrics
providing insight into the software
process, the software project, or the product itself.
Definitions
Process Indicators:
Enable a software engineering
organization to gain insight into the
efficacy of an existing process. They
help to assess what works and what
doesn't.
Definitions
Project Indicators:
Assess the status of an ongoing project.
Track potential risks.
Uncover problem areas before they go critical.
Adjust workflow or tasks.
Evaluate the project team's ability to
control the quality of software work
products.
Definitions
Personal Software Process:
A structured set of process
descriptions, measurements, and
methods that helps individual engineers
estimate and plan their work.
Uses forms, scripts, and standards.
Metrics Guidelines
Use common sense and organizational
sensitivity when interpreting metrics data.
Provide regular feedback to the individuals
and teams who have worked to collect
measures and metrics.
Don't use metrics to appraise individuals.
Work with practitioners and teams to set clear
goals and metrics that will be used to achieve
them.
Metrics Guidelines
Never use metrics to threaten individuals or
teams.
Metrics data that indicate a problem area
should not be considered negative. These
data are merely an indicator for process
improvement.
Don't obsess on a single metric to the
exclusion of other important metrics.
Failure Analysis Steps
Categorize errors and defects by origin.
Record the cost to correct each
error and defect.
Count the errors and defects in each
category, in descending order.
Compute the overall cost of errors and defects in each category.
Analyze the resultant data to
uncover the categories that result
in the highest cost to the organization.
Failure Analysis Steps
Analyze the resultant data to uncover the
categories resulting in the highest cost.
Develop plans to modify the process
with the intent of eliminating (or
reducing) the class of errors and
defects that is most costly.
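The failure-analysis steps above can be sketched in code; the origin categories and correction costs below are hypothetical, chosen only to illustrate the counting and ranking:

```python
from collections import defaultdict

# Hypothetical error records: (origin category, cost to correct in person-hours)
errors = [
    ("logic", 4.0), ("specification", 9.0), ("logic", 2.5),
    ("data handling", 3.0), ("specification", 12.0), ("interface", 1.5),
]

# Count errors and sum the overall correction cost per category
counts = defaultdict(int)
costs = defaultdict(float)
for origin, cost in errors:
    counts[origin] += 1
    costs[origin] += cost

# Rank categories by total cost (descending) to expose the most costly class,
# which becomes the target for process modification
for origin, total in sorted(costs.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{origin}: {counts[origin]} errors, total cost {total} person-hours")
```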
Project Metrics Measurements
Inputs: measures of resources
Outputs: measures of deliverables
Results: effectiveness of deliverables
Software Measure Types
Direct (Quantity)
Cost and effort applied
LOC, execution speed
Indirect (Quality)
Functionality, complexity, efficiency,
reliability, etc.
Product Metrics
Quality of deliverables
Measures of Analysis Models
Complexity of the design
Internal algorithmic complexity, architectural
complexity, data flow complexity
Code Measures
Process Effectiveness
Defect Removal Efficiency
Process Metrics - Strategic
Majority focus on quality achieved
as a consequence of a repeatable
or managed process
Statistical SQA data
error categorization & analysis
Defect removal efficiency
propagation from phase to phase
Reuse data
Project Metrics - Tactical
Effort/time per SE task
Errors uncovered per review hour
Scheduled vs. actual milestone dates
Changes (number) and their characteristics
Distribution of effort on SE tasks
Size Oriented Metrics
Errors per KLOC
Defects per KLOC
Amount spent per LOC
Documentation size per KLOC
Errors per person-month
LOC per person-month
Cost (currency) per page of documentation
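These size-oriented metrics all normalize against lines of code or effort; a minimal sketch, with entirely hypothetical project numbers:

```python
# Hypothetical data for one project release
loc = 12_100       # lines of code delivered
errors = 134       # errors found before delivery
defects = 29       # defects reported after delivery
effort_pm = 24     # effort in person-months
cost = 168_000     # total cost (currency units)
doc_pages = 365    # pages of documentation

kloc = loc / 1000
print(f"Errors per KLOC:      {errors / kloc:.2f}")
print(f"Defects per KLOC:     {defects / kloc:.2f}")
print(f"Cost per LOC:         {cost / loc:.2f}")
print(f"Doc pages per KLOC:   {doc_pages / kloc:.2f}")
print(f"LOC per person-month: {loc / effort_pm:.1f}")
print(f"Cost per doc page:    {cost / doc_pages:.2f}")
```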
Function Oriented Metrics
Errors per FP
Defects per FP
Cost (Currency) per FP
Pages of documentation per FP
FP per person-month
Computing Function Points
Analyze the information domain of the application and develop counts
(establish a count for each information domain value).
Weight each count by assessing complexity
(assign a level of complexity, or weight, to each count).
Assess the influence of global factors that affect the application
(grade the significance of external factors Fi, such as reuse, concurrency, OS, ...).
Compute function points:
degree of influence: N = ΣFi
complexity multiplier: C = 0.65 + 0.01 × N
function points = count-total × C
where count-total = Σ(count × weight) over all information domain values
Information Domain Analysis
measurement parameter        count        weighting factor        count-total
                                       simple   avg.   complex
number of user inputs         ___   ×     3       4       6      = ___
number of user outputs        ___   ×     4       5       7      = ___
number of user inquiries      ___   ×     3       4       6      = ___
number of files               ___   ×     7      10      15      = ___
number of ext. interfaces     ___   ×     5       7      10      = ___
count-total                                                        ___
function points = count-total × complexity multiplier
Accounting Complexity
Factors are rated on a scale of 0 (not important) to 5 (very important):
data communications, distributed functions,
heavily used configuration, transaction rate, on-line data entry, end-user efficiency,
on-line update, complex processing,
installation ease, operational ease, multiple sites, facilitate change
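Putting the FP computation together as a sketch: the information-domain counts and the factor ratings below are made up for illustration, and the weights assume the "average" column of the standard table:

```python
# Hypothetical counts, each weighted at "average" complexity
# (inputs 4, outputs 5, inquiries 4, files 10, external interfaces 7)
counts_and_weights = {
    "user inputs":         (20, 4),
    "user outputs":        (12, 5),
    "user inquiries":      (16, 4),
    "files":               (4, 10),
    "external interfaces": (2, 7),
}

# count-total = Σ(count × weight)
count_total = sum(count * weight for count, weight in counts_and_weights.values())

# Fourteen adjustment factors Fi, each rated 0 (not important) to 5
# (very important); these ratings are hypothetical.
f = [3, 2, 0, 4, 3, 4, 5, 3, 4, 2, 3, 4, 1, 2]
n = sum(f)              # degree of influence, N = ΣFi
c = 0.65 + 0.01 * n     # complexity multiplier

fp = count_total * c    # function points = count-total × C
print(f"count-total = {count_total}, N = {n}, C = {c:.2f}, FP = {fp:.1f}")
```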
Reasons for selecting FP
Independent of programming language.
Uses readily countable characteristics of the
information domain of the problem.
Does not penalize inventive
implementations that require fewer LOC than
others.
Makes it easier to accommodate reuse and the
trend toward object-oriented approaches.
Measuring Quality
Correctness: the degree to
which a program operates
according to specification.
Maintainability: the degree to
which a program is amenable to
change.
Measuring Quality
Integrity: the degree to which a
program is impervious to outside
attack.
Usability: the degree to which a
program is easy to use.
Defect Removal Efficiency
DRE = E / (E + D)
where E = errors found before delivery and D = defects found after delivery.
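The DRE formula is a one-liner in code; the counts in the usage line are hypothetical:

```python
def defect_removal_efficiency(errors_before: int, defects_after: int) -> float:
    """DRE = E / (E + D): the fraction of all problems that were
    removed before delivery. Approaches 1.0 as filtering improves."""
    return errors_before / (errors_before + defects_after)

# Hypothetical: 134 errors found before delivery, 29 defects reported after
dre = defect_removal_efficiency(134, 29)
print(f"DRE = {dre:.2f}")
```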