
Page 1:

Prof. Mohamed Batouche

batouche@ksu.edu.sa

Page 2:

“If you can’t measure it, you can’t manage it”

Tom DeMarco, 1982


Page 3:

• Process: measure the efficacy of processes; what works, what doesn't.

• Project: assess the status of projects, track risk, identify problem areas, adjust the workflow.

• Product: measure predefined product attributes (generally related to the ISO 9126 software quality characteristics).


Page 4:

Other classifications


Page 5:

Classification by phases of the software system:
• Process metrics: metrics related to the software development process.
• Product metrics: metrics related to software maintenance.

Classification by subjects of measurement:
• Quality
• Timetable
• Effectiveness (of error removal and maintenance services)
• Productivity


Page 6:

1. Facilitate management control, planning and managerial intervention. Based on:
   · Deviations of actual performance from planned performance.
   · Deviations of actual timetable and budget performance from planned.

2. Identify situations that call for development or maintenance process improvement (preventive or corrective actions). Based on:
   · Accumulation of metrics information regarding the performance of teams, units, etc.


Page 7:

• KLOC: a classic metric that measures the size of software in thousands of lines of code.

• Number of function points (NFP): a measure of the development resources (human resources) required to develop a program, based on the functionality specified for the software system.


Page 8:


Calculation of NCE and WCE:

Error severity class   Number of errors (b)   Relative weight (c)   Weighted errors (d = b x c)
Low severity           42                     1                     42
Medium severity        17                     3                     51
High severity          11                     9                     99
Total                  70                     ---                   192

NCE (number of code errors)   = 70
WCE (weighted code errors)    = 192
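The table's arithmetic can be reproduced with a short script. This is a minimal sketch in Python, using the severity weights (1, 3, 9) and error counts from the example above:

```python
# Minimal sketch: reproducing NCE and WCE from the table above.
# Counts and severity weights (1, 3, 9) are the ones in the example.
errors = {"low": 42, "medium": 17, "high": 11}
weights = {"low": 1, "medium": 3, "high": 9}

NCE = sum(errors.values())                                # unweighted count
WCE = sum(n * weights[sev] for sev, n in errors.items())  # weighted count

print(NCE, WCE)  # -> 70 192
```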

Page 9:

Process Metrics


Page 10:

Software process quality metrics:
◦ Error density metrics
◦ Error severity metrics

Software process timetable metrics

Software process error removal effectiveness metrics

Software process productivity metrics


Page 11:


Code    Name                                             Calculation formula
CED     Code Error Density                               CED = NCE / KLOC
DED     Development Error Density                        DED = NDE / KLOC
WCED    Weighted Code Error Density                      WCED = WCE / KLOC
WDED    Weighted Development Error Density               WDED = WDE / KLOC
WCEF    Weighted Code Errors per Function Point          WCEF = WCE / NFP
WDEF    Weighted Development Errors per Function Point   WDEF = WDE / NFP

NCE = number of code errors detected by code inspections and testing.
NDE = total number of development (design and code) errors detected in the development process.
WCE = weighted total of code errors detected by code inspections and testing.
WDE = weighted total of development (design and code) errors detected in the development process.

Page 12:

A software development department may apply two alternative metrics for calculating code error density: CED and WCED.

The unit has to determine indicators of unacceptable software quality:

CED > 2 and WCED > 4
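A minimal sketch of how the two indicators could be evaluated; the 50 KLOC code size is an assumed, illustrative figure, while NCE and WCE come from the earlier example:

```python
# Minimal sketch: evaluating the two alternative density indicators.
# KLOC = 50 is an assumed illustrative size; NCE and WCE are from the
# worked example above.
NCE, WCE, KLOC = 70, 192, 50

CED = NCE / KLOC    # code error density
WCED = WCE / KLOC   # weighted code error density

print(f"CED = {CED:.2f}  -> unacceptable: {CED > 2}")    # 1.40 -> False
print(f"WCED = {WCED:.2f} -> unacceptable: {WCED > 4}")  # 3.84 -> False
```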


Page 13:


Code    Name                                      Calculation formula
ASCE    Average Severity of Code Errors           ASCE = WCE / NCE
ASDE    Average Severity of Development Errors    ASDE = WDE / NDE

NCE = number of code errors detected by code inspections and testing.
NDE = total number of development (design and code) errors detected in the development process.
WCE = weighted total of code errors detected by code inspections and testing.
WDE = weighted total of development (design and code) errors detected in the development process.
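Continuing the running example (NCE = 70, WCE = 192), average severity is simply the weighted count divided by the raw count:

```python
# Minimal sketch: average severity of code errors from the running example.
NCE, WCE = 70, 192
ASCE = WCE / NCE
print(round(ASCE, 2))  # -> 2.74 (average severity on the 1/3/9 weight scale)
```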

Page 14:


Code    Name                                     Calculation formula
TTO     Timetable Observance                     TTO = MSOT / MS
ADMC    Average Delay of Milestone Completion    ADMC = TCDAM / MS

MSOT = number of milestones completed on time.
MS = total number of milestones.
TCDAM = total completion delay (days, weeks, etc.) for all milestones.
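A minimal sketch of the timetable metrics, using an assumed list of milestone delays (one entry per milestone, 0 meaning completed on time):

```python
# Minimal sketch: timetable metrics over assumed milestone data.
# Each entry is a milestone's completion delay in days; 0 = on time.
delays = [0, 0, 5, 0, 12, 0, 3, 0, 0, 0]

MS = len(delays)                         # total number of milestones
MSOT = sum(1 for d in delays if d == 0)  # milestones completed on time
TCDAM = sum(delays)                      # total completion delay (days)

TTO = MSOT / MS    # timetable observance
ADMC = TCDAM / MS  # average delay of milestone completion
print(TTO, ADMC)   # -> 0.7 2.0
```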

Page 15:


Code    Name                                                 Calculation formula
DERE    Development Errors Removal Effectiveness             DERE = NDE / (NDE + NYF)
DWERE   Development Weighted Errors Removal Effectiveness    DWERE = WDE / (WDE + WYF)

NDE = total number of development (design and code) errors detected in the development process.
WDE = weighted total of development (design and code) errors detected in the development process.
NYF = number of software failures detected during a year of maintenance service.
WYF = weighted number of software failures detected during a year of maintenance service.
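A minimal sketch with assumed counts, contrasting errors removed during development with failures that escaped into the first year of maintenance:

```python
# Minimal sketch: error removal effectiveness with assumed counts.
NDE, NYF = 160, 40   # assumed: development errors detected, yearly failures
WDE, WYF = 390, 150  # assumed weighted counterparts

DERE = NDE / (NDE + NYF)   # share of errors caught before release
DWERE = WDE / (WDE + WYF)  # severity-weighted share
print(round(DERE, 2), round(DWERE, 2))  # -> 0.8 0.72
```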

Page 16:


Code    Name                                       Calculation formula
DevP    Development Productivity                   DevP = DevH / KLOC
FDevP   Function Point Development Productivity    FDevP = DevH / NFP
CRe     Code Reuse                                 CRe = ReKLOC / KLOC
DocRe   Documentation Reuse                        DocRe = ReDoc / NDoc

DevH = total working hours invested in the development of the software system.
ReKLOC = number of thousands of reused lines of code.
ReDoc = number of reused pages of documentation.
NDoc = total number of pages of documentation.
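A minimal sketch with assumed project figures. Note that DevP and FDevP measure invested hours per unit of output, so lower values mean higher productivity.

```python
# Minimal sketch: productivity and reuse metrics with assumed figures.
DevH = 4000             # assumed total development hours
KLOC, NFP = 50, 300     # assumed size in KLOC and function points
ReKLOC = 10             # assumed reused KLOC
ReDoc, NDoc = 120, 400  # assumed reused / total documentation pages

DevP = DevH / KLOC    # hours per KLOC (lower = more productive)
FDevP = DevH / NFP    # hours per function point
CRe = ReKLOC / KLOC   # fraction of code reused
DocRe = ReDoc / NDoc  # fraction of documentation reused
print(DevP, round(FDevP, 1), CRe, DocRe)  # -> 80.0 13.3 0.2 0.3
```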

Page 17:

Product Metrics


Page 18:


* HD (Help Desk) quality metrics:
  * HD calls density metrics: measured by the number of calls.
  * HD calls severity metrics: the severity of the HD issues raised.
  * HD success metrics: the level of success in responding to HD calls.

* HD productivity metrics.

* HD effectiveness metrics.

* Corrective maintenance quality metrics:
  * Software system failures density metrics
  * Software system failures severity metrics
  * Failures of maintenance services metrics
  * Software system availability metrics

* Corrective maintenance productivity and effectiveness metrics.

Page 19:

Code    Name                                    Calculation formula
HDD     HD calls density                        HDD = NHYC / KLMC
WHDD    Weighted HD calls density               WHDD = WHYC / KLMC
WHDF    Weighted HD calls per function point    WHDF = WHYC / NMFP

NHYC = number of HD calls during a year of service.
KLMC = thousands of lines of maintained software code.
WHYC = weighted number of HD calls received during one year of service.
NMFP = number of function points to be maintained.

Page 20:

Code    Name                            Calculation formula
ASHC    Average severity of HD calls    ASHC = WHYC / NHYC

NHYC = number of HD calls during a year of service.
WHYC = weighted number of HD calls received during one year of service.

Page 21:

Code    Name                  Calculation formula
HDS     HD service success    HDS = NHYOT / NHYC

NHYOT = number of HD calls completed on time during one year of service.
NHYC = number of HD calls during a year of service.

Page 22:

Code    Name                             Calculation formula
HDP     HD Productivity                  HDP = HDYH / KLMC
FHDP    Function Point HD Productivity   FHDP = HDYH / NMFP
HDE     HD Effectiveness                 HDE = HDYH / NHYC

HDYH = total yearly working hours invested in HD servicing of the software system.
KLMC = thousands of lines of maintained software code.
NMFP = number of function points to be maintained.
NHYC = number of HD calls during a year of service.
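The HD metrics of the last few tables can be computed together. This minimal sketch uses assumed yearly figures:

```python
# Minimal sketch: yearly help-desk (HD) metrics with assumed figures.
NHYC = 300    # assumed HD calls during the year
WHYC = 540    # assumed weighted HD calls
NHYOT = 240   # assumed HD calls completed on time
HDYH = 900    # assumed yearly HD working hours
KLMC = 60     # assumed maintained code size (KLOC)
NMFP = 350    # assumed maintained function points

HDD = NHYC / KLMC    # HD calls density
WHDF = WHYC / NMFP   # weighted HD calls per function point
ASHC = WHYC / NHYC   # average severity of HD calls
HDS = NHYOT / NHYC   # HD service success
HDP = HDYH / KLMC    # HD productivity (hours per KLMC)
HDE = HDYH / NHYC    # HD effectiveness (hours per call)
print(HDD, round(WHDF, 2), ASHC, HDS, HDP, HDE)
# -> 5.0 1.54 1.8 0.8 15.0 3.0
```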

Page 23:

Code    Name                                                    Calculation formula
SSFD    Software System Failure Density                         SSFD = NYF / KLMC
WSSFD   Weighted Software System Failure Density                WSSFD = WYF / KLMC
WSSFF   Weighted Software System Failures per Function Point    WSSFF = WYF / NMFP

NYF = number of software failures detected during a year of maintenance service.
WYF = weighted number of software failures detected during a year of maintenance service.
NMFP = number of function points designated for the maintained software.
KLMC = thousands of lines of maintained software code.

Page 24:

Code    Name                                            Calculation formula
ASSSF   Average Severity of Software System Failures    ASSSF = WYF / NYF

NYF = number of software failures detected during a year of maintenance service.
WYF = weighted number of software failures detected during a year of maintenance service.

Page 25:

Code    Name                                         Calculation formula
MRepF   Maintenance Repeated repair Failure metric   MRepF = RepYF / NYF

NYF = number of software failures detected during a year of maintenance service.
RepYF = number of repeated software failure calls (service failures).
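The failure-oriented maintenance metrics of the last three tables can likewise be computed in one place; the figures below are assumed:

```python
# Minimal sketch: failure density, severity and repeated-repair metrics
# with assumed yearly maintenance figures.
NYF = 90     # assumed failures during the year
WYF = 240    # assumed weighted failures
RepYF = 9    # assumed repeated (repair-failure) calls
KLMC = 60    # assumed maintained code size (KLOC)
NMFP = 350   # assumed maintained function points

SSFD = NYF / KLMC    # failure density
WSSFD = WYF / KLMC   # weighted failure density
WSSFF = WYF / NMFP   # weighted failures per function point
ASSSF = WYF / NYF    # average failure severity
MRepF = RepYF / NYF  # repeated-repair failure rate
print(SSFD, WSSFD, round(WSSFF, 2), round(ASSSF, 2), MRepF)
# -> 1.5 4.0 0.69 2.67 0.1
```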

Page 26:

Code    Name                    Calculation formula
FA      Full Availability       FA = (NYSerH - NYFH) / NYSerH
VitA    Vital Availability      VitA = (NYSerH - NYVitFH) / NYSerH
TUA     Total Unavailability    TUA = NYTFH / NYSerH

NYSerH = number of hours the software system is in service during one year.
NYFH = number of hours during which at least one function is unavailable (failed) during one year, including total failure of the software system.
NYVitFH = number of hours during which at least one vital function is unavailable (failed) during one year, including total failure of the software system.
NYTFH = number of hours of total failure (all system functions failed) during one year.

Note that NYFH ≥ NYVitFH ≥ NYTFH, and therefore 1 - TUA ≥ VitA ≥ FA.
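A minimal sketch with assumed yearly hours; the assert statements encode the two consistency relations stated above.

```python
# Minimal sketch: availability metrics with assumed yearly hours.
NYSerH = 5000  # assumed hours in service during the year
NYFH = 100     # assumed hours with at least one function down
NYVitFH = 40   # assumed hours with at least one vital function down
NYTFH = 10     # assumed hours of total failure

assert NYFH >= NYVitFH >= NYTFH  # failure hours are nested by scope

FA = (NYSerH - NYFH) / NYSerH       # full availability
VitA = (NYSerH - NYVitFH) / NYSerH  # vital availability
TUA = NYTFH / NYSerH                # total unavailability

assert 1 - TUA >= VitA >= FA        # follows from the nesting above
print(FA, VitA, TUA)  # -> 0.98 0.992 0.002
```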

Page 27:

Code    Name                                                  Calculation formula
CMaiP   Corrective Maintenance Productivity                   CMaiP = CMaiYH / KLMC
FCMP    Function Point Corrective Maintenance Productivity    FCMP = CMaiYH / NMFP
CMaiE   Corrective Maintenance Effectiveness                  CMaiE = CMaiYH / NYF

CMaiYH = total yearly working hours invested in the corrective maintenance of the software system.
NYF = number of software failures detected during a year of maintenance service.
NMFP = number of function points designated for the maintained software.
KLMC = thousands of lines of maintained software code.
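A minimal sketch with assumed yearly corrective-maintenance figures:

```python
# Minimal sketch: corrective maintenance metrics with assumed figures.
CMaiYH = 1200  # assumed yearly corrective-maintenance hours
KLMC = 60      # assumed maintained code size (KLOC)
NMFP = 350     # assumed maintained function points
NYF = 90       # assumed yearly failures

CMaiP = CMaiYH / KLMC  # hours per KLMC maintained
FCMP = CMaiYH / NMFP   # hours per function point maintained
CMaiE = CMaiYH / NYF   # hours per failure handled
print(CMaiP, round(FCMP, 2), round(CMaiE, 2))  # -> 20.0 3.43 13.33
```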

Page 28:

Defining new Software Quality Metrics


Page 29:

Page 30:

Limitations of Software Quality Metrics


Page 31:

* Budget constraints in allocating the necessary resources.

* Human factors, especially the opposition of employees to the evaluation of their activities.

* Validity: uncertainty regarding the data, owing to partial and biased reporting.

Page 32:

* Parameters used in development process metrics: KLOC, NDE, NCE.

* Parameters used in product (maintenance) metrics: KLMC, NHYC, NYF.

Page 33:

Factors affecting the parameters of development process metrics:

a. Programming style (KLOC).
b. Volume of documentation comments (KLOC).
c. Software complexity (KLOC, NCE).
d. Percentage of reused code (NDE, NCE).
e. Professionalism and thoroughness of design review and software testing teams, which affect the number of defects detected (NCE).
f. Reporting style of the review and testing results: concise reports vs. comprehensive reports (NDE, NCE).

Page 34:

Factors affecting the parameters of product (maintenance) metrics:

a. Quality of the installed software and its documentation (NYF, NHYC).
b. Programming style and volume of documentation comments included in the code to be maintained (KLMC).
c. Software complexity (NYF).
d. Percentage of reused code (NYF).
e. Number of installations, size of the user population, and level of applications in use (NHYC, NYF).

Page 35:

The Goal-Question-Metric (GQM) Approach


Page 36:

The GQM approach is based on the idea that in order to measure in a meaningful way we must measure that which will help us to assess and meet our organizational goals.

– A Goal is defined for an object, for a variety of reasons, with respect to various models of quality, from various points of view, relative to a particular environment. The objects of measurement are products, processes and resources.

– Questions are used to characterize the way the assessment/achievement of a specific goal is going to be performed based on some characterizing model. Questions try to characterize the object of measurement with respect to a selected quality issue and to determine its quality from the selected viewpoint. Questions must be answerable in a quantitative manner.

– Metrics are associated with every question in order to answer it in a quantitative way.


Page 37:

– Goal: Customer Satisfaction

– Question: Are customers dissatisfied when problems occur?

– Metric: Number of customer defect reports


Page 38:

– Goal: To improve the outcome of design inspections from the quality manager's point of view.

– Question: What is the current yield of design inspections?
– Metric: inspection yield = number of defects found / estimated total defects

– Question: What is the current inspection rate?
– Metric: individual inspection rate, inspection period, overall inspection rate

– Question: Is design inspection yield improving?
– Metric: (current design inspection yield / baseline design inspection yield) x 100
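A minimal sketch of the yield metrics, with assumed inspection data (the estimated total defect count and baseline yield are illustrative):

```python
# Minimal sketch: design inspection yield metrics with assumed data.
defects_found = 45      # assumed defects found by the inspection
est_total_defects = 60  # assumed estimate of total defects present
baseline_yield = 0.60   # assumed historical baseline yield

inspection_yield = defects_found / est_total_defects       # 0.75
pct_of_baseline = inspection_yield / baseline_yield * 100  # 125.0 (% of baseline)
print(inspection_yield, pct_of_baseline)
```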

Page 39:

Suppose the organization's goal is to evaluate the effectiveness of the coding standard.

Some of the associated questions might be:

◦ Who is using the standard?

◦ What is coder productivity?

◦ What is the code quality?

For these questions, the corresponding metrics might be:

◦ What proportion of coders are using the standard?

◦ How has the number of LOC or FPs generated per day per coder changed?

◦ How have appropriate measures of quality for the code changed? (Appropriate measures might be errors found per LOC or cyclomatic complexity.)


Page 40:

William E. Lewis, “Software Testing and Continuous Quality Improvement”, 3rd Edition, CRC Press, 2009.

K. Naik and P. Tripathy, “Software Testing and Quality Assurance”, Wiley, 2008.

Ian Sommerville, “Software Engineering”, 8th Edition, 2006.

Aditya P. Mathur, “Foundations of Software Testing”, Pearson Education, 2009.

D. Galin, “Software Quality Assurance: From Theory to Implementation”, Pearson Education, 2004.

David Gustafson, “Theory and Problems of Software Engineering”, Schaum’s Outline Series, McGraw-Hill, 2002.


Page 41: